XGBoost: A Scalable Tree Boosting System, Tianqi Chen and Carlos Guestrin, 2016. Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (ACM). DOI: 10.1145/2939672.2939785 - This paper introduces XGBoost and explains its underlying optimization principles, including the use of a second-order Taylor expansion to optimize generic differentiable loss functions, which is fundamental to custom objective implementation.
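To make the annotation concrete: the paper's regularized objective at boosting iteration t is approximated by a second-order Taylor expansion around the previous prediction, which is why a custom objective only needs to supply first and second derivatives of the loss. A sketch of that expansion, in the paper's notation:

```latex
\mathcal{L}^{(t)} \simeq \sum_{i=1}^{n} \Big[\, g_i f_t(\mathbf{x}_i)
  + \tfrac{1}{2}\, h_i f_t^2(\mathbf{x}_i) \,\Big] + \Omega(f_t),
\quad\text{where}\quad
g_i = \partial_{\hat{y}^{(t-1)}}\, l\big(y_i, \hat{y}^{(t-1)}\big),
\qquad
h_i = \partial^2_{\hat{y}^{(t-1)}}\, l\big(y_i, \hat{y}^{(t-1)}\big).
```

Here $g_i$ and $h_i$ are exactly the per-example gradient and Hessian values a custom objective returns.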
LightGBM: A Highly Efficient Gradient Boosting Decision Tree, Guolin Ke, Qi Meng, Thomas Finley, Taifeng Wang, Wei Chen, Weidong Ma, Qiwei Ye, and Tie-Yan Liu, 2017. Advances in Neural Information Processing Systems 30 (NIPS 2017), Curran Associates, Inc. - Describes the LightGBM algorithm and its design choices, which allow for efficient training with custom objective functions based on gradient and Hessian information.
Custom Objective and Evaluation Metric, XGBoost Contributors, 2024 - The official XGBoost documentation providing practical guidance and examples for defining and integrating custom objective functions and evaluation metrics within the framework.
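As a concrete illustration of the gradient/Hessian interface these sources describe, here is a minimal sketch of a custom pseudo-Huber objective. The function name and the `delta` parameter are illustrative choices, not from the sources; the `(grad, hess)` return pair is the form XGBoost's custom-objective documentation expects.

```python
import numpy as np

def pseudo_huber_objective(preds, labels, delta=1.0):
    """Gradient and Hessian of the pseudo-Huber loss w.r.t. preds.

    Returns the (grad, hess) pair that XGBoost and LightGBM expect
    from a custom objective. ``delta`` controls the transition between
    the quadratic (small-residual) and linear (large-residual) regimes.
    Sketch only; not the libraries' built-in implementation.
    """
    residual = preds - labels
    scale = 1.0 + (residual / delta) ** 2
    grad = residual / np.sqrt(scale)   # first derivative of the loss
    hess = scale ** (-1.5)             # second derivative, always positive
    return grad, hess

# With XGBoost's native API this would be wrapped roughly as:
#   def objective(preds, dtrain):
#       return pseudo_huber_objective(preds, dtrain.get_label())
#   booster = xgb.train(params, dtrain, obj=objective)
```

A strictly positive Hessian, as here, keeps the second-order leaf-weight computation well defined, which is one reason smooth surrogates such as pseudo-Huber are preferred over the non-differentiable Huber loss in this setting.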