You have seen how gradient boosting models excel at standard regression and classification problems. However, their flexibility extends to more specialized machine learning tasks. This chapter demonstrates how to apply and adapt gradient boosting frameworks to solve problems beyond the typical supervised learning setup.
You will learn about learning to rank (including pairwise objectives such as rank:pairwise), survival analysis, quantile regression, and multi-output prediction. We will examine the modifications to objective functions and configurations required to use XGBoost, LightGBM, or CatBoost effectively for these specialized applications. The chapter concludes with a practical implementation of a learning-to-rank task.
9.1 Learning to Rank with Gradient Boosting
9.2 Ranking Objective Functions (Pairwise, Listwise)
9.3 Survival Analysis with Gradient Boosting
9.4 Survival Objective Functions (Cox PH)
9.5 Quantile Regression with Gradient Boosting
9.6 Quantile Loss Function Implementation
9.7 Multi-Output Gradient Boosting
9.8 Practice: Implementing Ranking with XGBoost
© 2025 ApX Machine Learning