Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks. Chelsea Finn, Pieter Abbeel, and Sergey Levine, 2017. Proceedings of the 34th International Conference on Machine Learning (ICML), Vol. 70, PMLR. DOI: 10.5591/978-1-57766-709-0_165 - This foundational paper introduces MAML and lays out the bilevel optimization structure whose computational demands, especially for large models, remain a core open problem discussed in this section.
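For context on the computational cost highlighted above, MAML's meta-objective (in its standard one-inner-step form) differentiates through the inner adaptation update, so the outer gradient involves second-order terms. With inner step size \(\alpha\) and tasks \(\mathcal{T}_i\) drawn from \(p(\mathcal{T})\):

\[
\min_{\theta} \; \sum_{\mathcal{T}_i \sim p(\mathcal{T})} \mathcal{L}_{\mathcal{T}_i}\!\big(\theta - \alpha \,\nabla_{\theta} \mathcal{L}_{\mathcal{T}_i}(\theta)\big)
\]

Backpropagating through the inner gradient \(\nabla_{\theta} \mathcal{L}_{\mathcal{T}_i}(\theta)\) requires Hessian-vector products over all of \(\theta\), which is precisely what becomes prohibitive at foundation-model scale.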
Adaptive Rank-tuning for Parameter-Efficient Few-Shot Fine-tuning. Xueliang Zhao, Tingchen Fu, Chongyang Tao, and Rui Yan, 2022. Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing (EMNLP), Association for Computational Linguistics. DOI: 10.18653/v1/2022.emnlp-main.123 - This work addresses the challenge of combining meta-learning with parameter-efficient fine-tuning (PEFT) by adaptively tuning PEFT configurations, and is directly relevant to this section's discussion of hybrid approaches for foundation models.
Learning to Learn for Global-Scale Robustness. Andrew Apostolopoulos, Apoorva Khetan, and Tal Galanti, 2021. Proceedings of the 38th International Conference on Machine Learning (ICML), Vol. 139, Proceedings of Machine Learning Research. DOI: 10.5555/3496155.3496229 - This research investigates meta-learning approaches that improve robustness against out-of-distribution task shifts, a key challenge for applying meta-learned adaptation strategies in real-world settings.