This chapter focuses on XGBoost (Extreme Gradient Boosting), building directly on the foundational gradient boosting concepts and regularization techniques discussed earlier. XGBoost is a widely used gradient boosting implementation, distinguished by algorithmic refinements and system-level optimizations that improve both training speed and predictive accuracy.
We will examine the core components that make XGBoost effective. By the end of this chapter, you will understand the technical details behind these components and be prepared to implement XGBoost with its Python library, configuring its main parameters for practical tasks; a brief usage sketch follows the chapter outline below:
4.1 Motivation and Enhancements over GBM
4.2 The Regularized Learning Objective
4.3 Split Finding Algorithm: Exact Greedy
4.4 Split Finding Algorithm: Approximate Greedy
4.5 Sparsity-Aware Split Finding
4.6 System Optimizations: Cache Awareness and Parallelism
4.7 XGBoost API: Parameters and Configuration
4.8 Hands-on Practical: Implementing XGBoost
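As a preview of the hands-on material in section 4.8, the following sketch trains a classifier through XGBoost's scikit-learn interface on a synthetic dataset. The parameter values shown are illustrative placeholders, not tuned recommendations; section 4.7 discusses how to choose them.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Synthetic binary classification data, purely for illustration.
X, y = make_classification(n_samples=1_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Illustrative configuration touching the main parameter groups
# covered later in this chapter.
model = XGBClassifier(
    n_estimators=200,    # number of boosting rounds
    max_depth=4,         # depth limit for each tree
    learning_rate=0.1,   # shrinkage applied to each tree's contribution
    reg_lambda=1.0,      # L2 regularization term on leaf weights
    subsample=0.8,       # fraction of rows sampled per boosting round
)
model.fit(X_train, y_train)
print(f"Test accuracy: {model.score(X_test, y_test):.3f}")
```

The scikit-learn wrapper shown here is one of two interfaces the library offers; the native `xgboost.train` API, which works on `DMatrix` objects, is covered alongside the parameter reference in section 4.7.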