Hugging Face Transformers Documentation, Hugging Face, 2024 - Provides comprehensive guides and API references for using the transformers library, including Trainer, TrainingArguments, and AutoModelForCausalLM, which are central to the fine-tuning workflow presented.
Qwen Technical Report, Jinze Bai, Shuai Bai, Yunfei Chu, Zeyu Cui, Kai Dang, Xiaodong Deng, Yang Fan, Wenbin Ge, Yu Han, Fei Huang, Binyuan Hui, Luo Ji, Mei Li, Junyang Lin, Runji Lin, Dayiheng Liu, Gao Liu, Chengqiang Lu, Keming Lu, Jianxin Ma, Rui Men, Xingzhang Ren, Xuancheng Ren, Chuanqi Tan, Sinan Tan, Jianhong Tu, Peng Wang, Shijie Wang, Wei Wang, Shengguang Wu, Benfeng Xu, Jin Xu, An Yang, Hao Yang, Jian Yang, Shusheng Yang, Yang Yao, Bowen Yu, Hongyi Yuan, Zheng Yuan, Jianwei Zhang, Xingxuan Zhang, Yichang Zhang, Zhenru Zhang, Chang Zhou, Jingren Zhou, Xiaohuan Zhou, Tianhang Zhu, 2023, arXiv preprint arXiv:2309.16609, DOI: 10.48550/arXiv.2309.16609 - Describes the Qwen model family, including its architecture and training methodology; a Qwen model serves as the base model for the fine-tuning in this section.