While algorithms like XGBoost provide significant performance gains over standard Gradient Boosting Machines, they can still face computational bottlenecks when dealing with very large datasets and high-dimensional feature spaces. This chapter introduces LightGBM, a framework engineered specifically to address these challenges, prioritizing training speed and memory efficiency without substantial compromises in accuracy.
You will study the core techniques behind LightGBM's efficiency: Gradient-based One-Side Sampling (GOSS), Exclusive Feature Bundling (EFB), histogram-based split finding, leaf-wise tree growth, and optimized handling of categorical features.
The chapter also walks through the essential parameters of the LightGBM Python API and culminates in a practical exercise in which you implement and train a LightGBM model.
5.1 Motivation: Addressing XGBoost's Limitations
5.2 Gradient-based One-Side Sampling (GOSS)
5.3 Exclusive Feature Bundling (EFB)
5.4 Histogram-Based Split Finding
5.5 Leaf-Wise Tree Growth
5.6 Optimized Categorical Feature Handling
5.7 LightGBM API: Parameters and Configuration
5.8 Hands-on Practical: Implementing LightGBM
© 2025 ApX Machine Learning