Prefix Tuning: Conditioning via Continuous Prefixes
Prefix-Tuning: Optimizing Continuous Prompts for Generation, Xiang Lisa Li, Percy Liang, 2021. Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), Association for Computational Linguistics. DOI: 10.18653/v1/2021.acl-long.353 - The original paper introducing Prefix Tuning, a parameter-efficient method that conditions a frozen language model by prepending task-specific continuous vectors to the input of its attention layers.
The Power of Scale for Parameter-Efficient Prompt Tuning, Brian Lester, Rami Al-Rfou, Noah Constant, 2021. Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP), Association for Computational Linguistics. DOI: 10.18653/v1/2021.emnlp-main.243 - Presents Prompt Tuning, an even more parameter-efficient method that conditions large language models by prepending learnable embeddings only to the input sequence, offering a contrast to Prefix Tuning. Both conditioning strategies are sketched in the example after these references.
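The sketch below contrasts the two strategies in minimal PyTorch, assuming a single-head attention layer inside a generic frozen decoder. The names used here (`PrefixTunedAttention`, `n_prefix`, `prompt_tuned_inputs`, `soft_prompt`) are illustrative placeholders, not APIs from either paper; this is a simplified illustration under those assumptions, not a reference implementation.

```python
# Illustrative sketch only: single-head attention, no masking, no reparameterization
# (the Prefix Tuning paper additionally trains prefixes via an MLP for stability).
import torch
import torch.nn as nn
import torch.nn.functional as F


class PrefixTunedAttention(nn.Module):
    """Prefix Tuning: trainable prefix vectors are prepended to the keys and
    values of a frozen attention layer; only the prefixes receive gradients."""

    def __init__(self, d_model: int, n_prefix: int = 10):
        super().__init__()
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        # Freeze the pretrained projection weights.
        for proj in (self.q_proj, self.k_proj, self.v_proj):
            proj.weight.requires_grad_(False)
            proj.bias.requires_grad_(False)
        # Task-specific continuous prefixes for this layer's keys and values.
        self.prefix_k = nn.Parameter(torch.randn(n_prefix, d_model) * 0.02)
        self.prefix_v = nn.Parameter(torch.randn(n_prefix, d_model) * 0.02)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        batch = x.size(0)
        q = self.q_proj(x)
        # Prepend the learned prefixes so every query also attends to them.
        k = torch.cat([self.prefix_k.expand(batch, -1, -1), self.k_proj(x)], dim=1)
        v = torch.cat([self.prefix_v.expand(batch, -1, -1), self.v_proj(x)], dim=1)
        scores = q @ k.transpose(-2, -1) / (q.size(-1) ** 0.5)
        return F.softmax(scores, dim=-1) @ v


def prompt_tuned_inputs(embed: nn.Embedding, input_ids: torch.Tensor,
                        soft_prompt: nn.Parameter) -> torch.Tensor:
    """Prompt Tuning: learnable embeddings are prepended only at the input
    layer; every transformer block of the frozen model is left untouched."""
    tokens = embed(input_ids)                             # (batch, seq, d_model)
    prompt = soft_prompt.expand(tokens.size(0), -1, -1)   # (batch, n_prompt, d_model)
    return torch.cat([prompt, tokens], dim=1)
```

The contrast is in where the trainable vectors enter the model: Prefix Tuning injects them into the keys and values of each attention layer, while Prompt Tuning adds them only to the embedded input sequence, which is why it trains even fewer parameters.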