Parameter-Efficient Transfer Learning for NLP, Neil Houlsby, Andrei Giurgiu, Stanislaw Jastrzebski, Bruna Morrone, Quentin de Laroussilhe, Andrea Gesmundo, Mona Attariyan, Sylvain Gelly, 2019. Proceedings of the 36th International Conference on Machine Learning (ICML), Vol. 97 (PMLR) - Proposes adapter tuning, a foundational additive method for parameter-efficient transfer learning.
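The adapter idea above can be sketched in a few lines: a small bottleneck (down-projection, nonlinearity, up-projection) added to a frozen layer with a residual connection, so only the bottleneck weights are trained. This is an illustrative NumPy sketch, not the paper's implementation; the dimensions and ReLU choice are assumptions.

```python
import numpy as np

def adapter(x, W_down, W_up):
    """Bottleneck adapter in the style of Houlsby et al. (2019), illustrative:
    down-project to a small dimension, apply a nonlinearity, up-project back,
    and add a residual connection. Only W_down and W_up would be trained;
    the surrounding transformer weights stay frozen."""
    h = np.maximum(0.0, x @ W_down)  # ReLU bottleneck (nonlinearity is an assumption)
    return x + h @ W_up              # residual: near-identity when W_up ~ 0

# Toy usage: hidden size 8, bottleneck size 4.
rng = np.random.default_rng(0)
x = rng.standard_normal((2, 8))
W_down = rng.standard_normal((8, 4)) * 0.1
W_up = np.zeros((4, 8))  # zero init keeps the adapter an identity map at start
y = adapter(x, W_down, W_up)
```

Initializing the up-projection near zero makes the adapted layer start as the identity, which is why adapters can be inserted into a pretrained network without disrupting it.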
Prefix-Tuning: Optimizing Continuous Prompts for Generation, Xiang Lisa Li, Percy Liang, 2021. Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, Vol. 1: Long Papers (Association for Computational Linguistics). DOI: 10.18653/v1/2021.acl-long.353 - Introduces prefix-tuning, an additive method that prepends trainable continuous vectors to every attention layer.
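The mechanism described in this entry can be sketched as single-head attention where trainable prefix key/value vectors are concatenated in front of the frozen model's keys and values, so the query attends over the prefix as well. A minimal NumPy sketch, assuming one head and omitting the paper's reparameterization (the MLP over the prefix) and multi-layer details:

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def attention_with_prefix(q, k, v, prefix_k, prefix_v):
    """Single-head scaled dot-product attention with prefix-tuning-style
    prefixes: prefix_k/prefix_v are the only trainable parameters, prepended
    to the frozen keys/values so every query can attend to them."""
    K = np.concatenate([prefix_k, k], axis=0)   # (P + T, d)
    V = np.concatenate([prefix_v, v], axis=0)   # (P + T, d)
    scores = q @ K.T / np.sqrt(q.shape[-1])     # (T_q, P + T)
    return softmax(scores) @ V                  # (T_q, d)

# Toy usage: 3 queries, 5 frozen positions, prefix length 4, dim 8.
rng = np.random.default_rng(1)
q = rng.standard_normal((3, 8))
k = rng.standard_normal((5, 8))
v = rng.standard_normal((5, 8))
prefix_k = rng.standard_normal((4, 8)) * 0.1
prefix_v = rng.standard_normal((4, 8)) * 0.1
out = attention_with_prefix(q, k, v, prefix_k, prefix_v)
```

With a zero-length prefix the function reduces to ordinary attention, which is the sense in which the method is purely additive.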