XGBoost: A Scalable Tree Boosting System. Tianqi Chen and Carlos Guestrin, 2016. Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (ACM). DOI: 10.1145/2939672.2939785 - Provides the theoretical and practical foundations of XGBoost, including its architecture, regularization techniques, and the key hyperparameters discussed in this section.
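The paper's regularized objective maps directly onto XGBoost's parameters; the sketch below shows that mapping in the scikit-learn wrapper, with illustrative placeholder values rather than recommendations from the paper:

```python
# Minimal sketch: how the paper's regularization terms surface as
# XGBoost hyperparameters. All values here are placeholders.
from xgboost import XGBClassifier

model = XGBClassifier(
    gamma=1.0,          # gamma in the paper's Omega(f): cost per extra leaf (min split loss)
    reg_lambda=1.0,     # lambda: L2 penalty on leaf weights
    reg_alpha=0.0,      # alpha: L1 penalty on leaf weights
    max_depth=6,
    learning_rate=0.1,  # eta: shrinkage applied to each new tree
)
```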
Random Search for Hyper-Parameter Optimization. James Bergstra and Yoshua Bengio, 2012. Journal of Machine Learning Research, Vol. 13, pp. 281-305. - Introduces and empirically demonstrates the effectiveness of random search over grid search for hyperparameter optimization, making it directly relevant to the coarse tuning phase.
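A coarse random-search pass over XGBoost in the spirit of this paper might look like the sketch below; it assumes the scikit-learn wrapper, and the dataset, distributions, and budget are illustrative assumptions, not values from the paper:

```python
# Minimal sketch of the coarse random-search phase: broad distributions,
# small budget, no attempt to pinpoint an optimum.
from scipy.stats import randint, uniform
from sklearn.datasets import make_classification
from sklearn.model_selection import RandomizedSearchCV
from xgboost import XGBClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

param_distributions = {
    "max_depth": randint(3, 10),          # integers 3..9
    "learning_rate": uniform(0.01, 0.3),  # samples from [0.01, 0.31)
    "subsample": uniform(0.5, 0.5),       # samples from [0.5, 1.0)
    "colsample_bytree": uniform(0.5, 0.5),
    "min_child_weight": randint(1, 10),
}

search = RandomizedSearchCV(
    XGBClassifier(n_estimators=300, eval_metric="logloss"),
    param_distributions,
    n_iter=30,      # budget: 30 random configurations
    cv=3,
    scoring="roc_auc",
    random_state=0,
)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```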
Optuna: A Next-generation Hyperparameter Optimization Framework. Takuya Akiba, Shotaro Sano, Toshihiko Yanase, Takeru Ohta, and Masanori Koyama, 2019. Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining (KDD '19) (ACM). DOI: 10.1145/3292500.3330701 - Presents Optuna, an optimization framework that supports multi-stage tuning and Bayesian optimization, aligning with the fine-tuning phase described.
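A fine-tuning phase with Optuna could be sketched as follows; the narrowed search ranges are hypothetical stand-ins for whatever a coarse phase returned:

```python
# Minimal sketch of the fine-tuning phase: Optuna's default TPE sampler
# searches narrow intervals assumed to be centred on the coarse winner.
import optuna
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

def objective(trial):
    params = {
        "max_depth": trial.suggest_int("max_depth", 4, 7),
        "learning_rate": trial.suggest_float("learning_rate", 0.03, 0.12, log=True),
        "subsample": trial.suggest_float("subsample", 0.7, 1.0),
        "n_estimators": 300,
        "eval_metric": "logloss",
    }
    model = XGBClassifier(**params)
    return cross_val_score(model, X, y, cv=3, scoring="roc_auc").mean()

study = optuna.create_study(direction="maximize")  # TPE sampler by default
study.optimize(objective, n_trials=50)
print(study.best_params)
```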
Parameter Tuning. XGBoost Developers, 2024. - Official documentation providing practical guidance on tuning XGBoost hyperparameters, including tuning strategies, parameter interactions, and the use of early stopping.
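Early stopping, as recommended in the documentation, can be sketched with the native training API; the round counts and validation split below are illustrative assumptions:

```python
# Minimal sketch of early stopping: train with a generous round ceiling
# and halt when the validation metric stops improving.
import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

dtrain = xgb.DMatrix(X_tr, label=y_tr)
dval = xgb.DMatrix(X_val, label=y_val)

booster = xgb.train(
    {"objective": "binary:logistic", "eta": 0.05, "max_depth": 5,
     "eval_metric": "auc"},
    dtrain,
    num_boost_round=1000,         # generous ceiling on boosting rounds
    evals=[(dval, "validation")],
    early_stopping_rounds=50,     # stop if AUC fails to improve for 50 rounds
)
print(booster.best_iteration)
```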