The Power of Scale for Parameter-Efficient Prompt Tuning, Brian Lester, Rami Al-Rfou, Noah Constant, 2021. Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP), Association for Computational Linguistics. DOI: 10.18653/v1/2021.emnlp-main.243 - Introduces Prompt Tuning, which adapts a frozen large language model by training only a small set of soft prompt embeddings, and shows this approach becomes competitive with full fine-tuning as model scale grows.
Prefix-Tuning: Optimizing Continuous Prompts for Generation, Xiang Lisa Li, Percy Liang, 2021. Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), Association for Computational Linguistics. DOI: 10.18653/v1/2021.acl-long.353 - Presents Prefix Tuning, which conditions a frozen language model by prepending trainable key/value prefixes to every attention layer, and is particularly effective for generation tasks; a minimal sketch contrasting the two methods follows this list.
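To make the contrast between these two methods concrete, here is a minimal PyTorch sketch. It is not code from either paper: the class names, dimensions, and initialization are illustrative assumptions, and details such as the reparameterization MLP used during training in the original Prefix-Tuning work are omitted. Prompt tuning learns a single block of soft embeddings prepended to the input embedding sequence; prefix tuning learns per-layer key/value prefixes injected inside each attention layer. In both cases only these new parameters are trained while the backbone model stays frozen.

```python
import torch
import torch.nn as nn

# Illustrative (hypothetical) dimensions; neither paper prescribes these values.
d_model, n_layers, n_heads, prompt_len = 512, 6, 8, 20
d_head = d_model // n_heads

class SoftPrompt(nn.Module):
    """Prompt tuning: learn prompt_len soft embeddings prepended to the input."""
    def __init__(self, prompt_len, d_model):
        super().__init__()
        self.prompt = nn.Parameter(torch.randn(prompt_len, d_model) * 0.02)

    def forward(self, input_embeds):  # input_embeds: (batch, seq, d_model)
        batch = input_embeds.size(0)
        prompt = self.prompt.unsqueeze(0).expand(batch, -1, -1)
        # The frozen LM then runs on the lengthened embedding sequence.
        return torch.cat([prompt, input_embeds], dim=1)

class AttentionPrefixes(nn.Module):
    """Prefix tuning: learn key/value prefixes injected into every attention layer."""
    def __init__(self, n_layers, n_heads, prompt_len, d_head):
        super().__init__()
        self.keys = nn.Parameter(torch.randn(n_layers, n_heads, prompt_len, d_head) * 0.02)
        self.values = nn.Parameter(torch.randn(n_layers, n_heads, prompt_len, d_head) * 0.02)

    def forward(self, layer_idx, k, v):  # k, v: (batch, n_heads, seq, d_head)
        batch = k.size(0)
        pk = self.keys[layer_idx].unsqueeze(0).expand(batch, -1, -1, -1)
        pv = self.values[layer_idx].unsqueeze(0).expand(batch, -1, -1, -1)
        # The frozen layer's queries now also attend to the learned prefix positions.
        return torch.cat([pk, k], dim=2), torch.cat([pv, v], dim=2)

# Only these modules are optimized; the backbone LM's weights stay frozen.
soft_prompt = SoftPrompt(prompt_len, d_model)
prefixes = AttentionPrefixes(n_layers, n_heads, prompt_len, d_head)

dummy_embeds = torch.randn(2, 10, d_model)                 # toy batch of input embeddings
print(soft_prompt(dummy_embeds).shape)                     # torch.Size([2, 30, 512])
print(sum(p.numel() for p in soft_prompt.parameters()))    # 20 * 512 = 10,240
print(sum(p.numel() for p in prefixes.parameters()))       # 2 * 6 * 8 * 20 * 64 = 122,880
```

Either way, the trainable parameter count is tiny relative to the backbone: roughly prompt_len × d_model for prompt tuning, and 2 × n_layers × n_heads × prompt_len × d_head for per-layer prefixes.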