UniPELT: A Unified Framework for Parameter-Efficient Language Model Tuning. Yuning Mao, Lambert Mathias, Rui Hou, Amjad Almahairi, Hao Ma, Jiawei Han, Wen-tau Yih, Madian Khabsa. Annual Meeting of the Association for Computational Linguistics (ACL), 2022. DOI: 10.48550/arXiv.2110.07577 - Proposes a unified framework for combining various PEFT methods, including LoRA, Adapters, and Prefix-Tuning, supporting multi-PEFT strategies.
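The core idea of combining several PEFT branches under learned gates can be sketched as below. This is a minimal illustrative sketch, not the paper's implementation: the shapes, parameter names (`gate_lora`, `gate_adapter`), and the choice of a per-token sigmoid gate are assumptions made here for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

d, r = 8, 2  # hidden size and bottleneck rank (illustrative values)

# Frozen base weight and hypothetical trainable PEFT parameters.
W = rng.normal(size=(d, d))
lora_A = rng.normal(size=(d, r)) * 0.01   # LoRA down-projection
lora_B = np.zeros((r, d))                 # LoRA up-projection (zero-init)
adapter_down = rng.normal(size=(d, r)) * 0.01
adapter_up = np.zeros((r, d))             # adapter up-projection (zero-init)

# One gate per submodule: a linear projection of the input plus a sigmoid,
# so the model can learn how much each PEFT branch contributes.
gate_lora = rng.normal(size=(d,)) * 0.01
gate_adapter = rng.normal(size=(d,)) * 0.01

def unipelt_layer(x):
    """Frozen output plus gated contributions from each PEFT branch."""
    base = x @ W
    g_l = sigmoid(x @ gate_lora)      # scalar gate per token
    g_a = sigmoid(x @ gate_adapter)
    lora_out = (x @ lora_A) @ lora_B                            # low-rank update
    adapter_out = np.maximum(x @ adapter_down, 0) @ adapter_up  # bottleneck + ReLU
    return base + g_l[..., None] * lora_out + g_a[..., None] * adapter_out

x = rng.normal(size=(3, d))
y = unipelt_layer(x)
print(y.shape)  # (3, 8)
```

With the up-projections zero-initialized, the layer starts out identical to the frozen model; training then moves the gates and branch weights so that each PEFT method contributes where it helps.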