Parameter-Efficient Transfer Learning for NLP, Neil Houlsby, Andrei Giurgiu, Stanislaw Jastrzebski, Bruna Morrone, Quentin De Laroussilhe, Andrea Gesmundo, Mona Attariyan, Sylvain Gelly, 2019, Proceedings of the 36th International Conference on Machine Learning, Vol. 97 (PMLR), DOI: 10.5555/3305381.3305417 - Introduces adapter modules for parameter-efficient fine-tuning of pre-trained language models, detailing their architecture and early experiments.
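The adapter module described in this paper is a small bottleneck inserted into each transformer layer: a down-projection, a nonlinearity, an up-projection, and a residual connection, initialized so the module starts near the identity. A minimal NumPy sketch of that structure (dimensions, ReLU choice, and variable names here are illustrative, not taken from the paper's code):

```python
import numpy as np

def adapter(h, W_down, W_up):
    """Bottleneck adapter: project d -> m, nonlinearity, project m -> d, add residual."""
    z = np.maximum(W_down @ h, 0.0)  # down-projection + ReLU bottleneck
    return h + W_up @ z              # up-projection with skip connection

d, m = 8, 2                          # hidden size d, bottleneck size m << d
rng = np.random.default_rng(0)
W_down = rng.normal(size=(m, d)) * 0.01
W_up = np.zeros((d, m))              # near-zero up-projection: adapter starts as identity
h = rng.normal(size=d)
out = adapter(h, W_down, W_up)       # equals h at initialization
```

Because only the two small projection matrices are trained, the number of new parameters per layer is roughly 2·d·m, a small fraction of the frozen model's weights.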
Adapters - Adapter-Transformers Documentation, Hugging Face, 2023 - Official documentation for the adapter-transformers library, providing practical guides and API references for implementing and training adapter modules.