Prefix-Tuning: Optimizing Continuous Prompts for Generation. Xiang Lisa Li, Percy Liang. 2021. Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), Association for Computational Linguistics. DOI: 10.18653/v1/2021.acl-long.353 - The original paper introducing prefix-tuning, a parameter-efficient method that conditions a frozen language model by prepending task-specific continuous vectors to the inputs of its attention layers.
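To make the idea concrete, here is a minimal, simplified sketch in PyTorch. Note the hedge: the paper itself prepends trainable prefix activations to the keys and values of *every* attention layer (parameterized via an MLP for training stability), whereas this toy version prepends a learnable prefix only at the embedding layer, which is closer in spirit to prompt-tuning. The class name `PrefixTuning` and all parameter names are illustrative, not from the paper's code.

```python
import torch
import torch.nn as nn

class PrefixTuning(nn.Module):
    """Simplified sketch: learnable continuous prefix prepended to the
    token embeddings of a frozen base model. (The actual paper injects
    prefixes at every attention layer, not just the embeddings.)"""

    def __init__(self, base_embed: nn.Embedding, prefix_len: int):
        super().__init__()
        self.base_embed = base_embed
        # Freeze the language model's parameters; only the prefix trains.
        for p in self.base_embed.parameters():
            p.requires_grad = False
        dim = base_embed.embedding_dim
        # Task-specific continuous prompt: the only trainable parameters.
        self.prefix = nn.Parameter(torch.randn(prefix_len, dim) * 0.02)

    def forward(self, input_ids: torch.Tensor) -> torch.Tensor:
        batch = input_ids.size(0)
        tok = self.base_embed(input_ids)                      # (B, T, D)
        pre = self.prefix.unsqueeze(0).expand(batch, -1, -1)  # (B, P, D)
        return torch.cat([pre, tok], dim=1)                   # (B, P+T, D)
```

In practice one would feed the concatenated sequence into the frozen transformer and backpropagate only into `self.prefix`, so each downstream task costs just `prefix_len * dim` extra parameters per layer touched.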