Overview
LightGBM is a gradient boosting framework that is often faster and more memory-efficient than XGBoost, particularly on very large datasets, thanks largely to its histogram-based split finding and the sampling and bundling techniques described below.
Key Innovations
- Leaf-wise Tree Growth: Unlike algorithms that grow trees level-by-level, LightGBM grows trees best-first: at each step it splits the leaf whose split reduces the loss the most. This can yield deeper, more accurate trees, though it can overfit on small datasets unless constrained (e.g. via `num_leaves` or `max_depth`).
- GOSS (Gradient-based One-Side Sampling): Keeps the instances with large gradients (which contribute most to the information gain) and randomly samples those with small gradients, re-weighting the sampled instances so gain estimates stay roughly unbiased. This speeds up training without losing much accuracy.
- EFB (Exclusive Feature Bundling): Bundles mutually exclusive features (features that rarely take nonzero values on the same row, as is common with sparse or one-hot data) into single features, reducing the effective feature count with little information loss.
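The leaf-wise strategy can be sketched as a best-first search over candidate splits. A minimal illustration, assuming each splittable leaf carries a precomputed loss-reduction "gain"; the leaf ids and gain values here are made up for demonstration and are not LightGBM's internal structures:

```python
import heapq

def grow_leaf_wise(splits, num_leaves):
    # splits: leaf id -> (gain, left child id, right child id).
    # Hypothetical stand-ins for a tree learner's candidate splits.
    frontier = [(-splits[0][0], 0)]   # max-heap via negated gain; 0 is the root
    leaves = {0}
    split_order = []                  # leaf ids in the order they get split
    while frontier and len(leaves) < num_leaves:
        _, leaf = heapq.heappop(frontier)
        _, left, right = splits[leaf]
        leaves.discard(leaf)          # the split leaf becomes an internal node
        leaves.update((left, right))
        split_order.append(leaf)
        for child in (left, right):
            if child in splits:       # only splittable leaves enter the heap
                heapq.heappush(frontier, (-splits[child][0], child))
    return split_order
```

Unlike level-wise growth, this may keep splitting one deep branch while leaving shallow leaves untouched; in LightGBM itself, `num_leaves` is the real parameter that caps this process.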
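The GOSS idea can be sketched in a few lines. This is a simplified stand-alone version, assuming the paper's two fractions (keep the top `a` by absolute gradient, sample `b` of the rest, up-weight the sampled rows by `(1 - a) / b`); the fraction values are illustrative defaults, not tuned:

```python
import random

def goss_sample(gradients, a=0.2, b=0.1, seed=0):
    # a, b correspond to the paper's top_rate / other_rate (illustrative values).
    n = len(gradients)
    order = sorted(range(n), key=lambda i: abs(gradients[i]), reverse=True)
    top_k = int(a * n)
    keep = order[:top_k]                        # large-gradient rows, weight 1
    rest = order[top_k:]
    rng = random.Random(seed)
    sampled = rng.sample(rest, int(b * n))      # random small-gradient rows
    weights = {i: 1.0 for i in keep}
    # Up-weight sampled rows so the total small-gradient contribution to
    # split-gain estimates stays roughly unbiased.
    weights.update({i: (1 - a) / b for i in sampled})
    return weights  # row index -> weight used when computing split gains
```

Only the rows returned here participate in split finding, which is where the speedup comes from: the bulk of the small-gradient rows are skipped entirely.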
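EFB can be illustrated with a greedy first-fit sketch that merges columns which are (nearly) never nonzero on the same row. This is a simplified approximation, not LightGBM's exact graph-coloring-based heuristic; `max_conflicts` plays the role of the paper's small conflict tolerance:

```python
def bundle_exclusive_features(columns, max_conflicts=0):
    # columns: list of feature columns (lists of values, mostly zeros).
    bundles = []  # each bundle: (set of column indices, set of used row indices)
    for j, col in enumerate(columns):
        nonzero_rows = {i for i, v in enumerate(col) if v != 0}
        placed = False
        for members, used_rows in bundles:
            # Count rows where this column collides with the bundle.
            conflicts = len(nonzero_rows & used_rows)
            if conflicts <= max_conflicts:
                members.add(j)
                used_rows.update(nonzero_rows)
                placed = True
                break
        if not placed:
            bundles.append(({j}, set(nonzero_rows)))
    return [sorted(members) for members, _ in bundles]
```

Each resulting bundle can then be treated as a single feature (with offset value ranges so the original features remain distinguishable), shrinking the number of columns the histogram builder has to scan.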