
100 Facts About Gradient Boosting

btd · 6 min read · Nov 28, 2023


Here’s a list of 100 facts about Gradient Boosting:

  1. Gradient Boosting is an ensemble learning technique that builds a series of weak learners to create a strong learner.
  2. It belongs to the boosting family of algorithms.
  3. The algorithm was introduced by Jerome Friedman in 1999 and formalized in his 2001 paper, “Greedy Function Approximation: A Gradient Boosting Machine.”
  4. Gradient Boosting can be used for both regression and classification tasks.
  5. Decision trees are commonly used as weak learners in Gradient Boosting.
  6. It builds trees sequentially, with each tree correcting errors made by the previous ones.
  7. The “gradient” in Gradient Boosting refers to gradient descent on a loss function: each new weak learner is fit to the negative gradient of the loss evaluated at the current predictions.
  8. The loss function guides the algorithm to minimize errors during training (see the sketch after this list).
  9. Gradient Boosting is particularly effective when dealing with complex, non-linear relationships in data.
  10. It is a form of functional gradient descent.
  11. AdaBoost and Gradient Boosting are distinct algorithms; AdaBoost can be viewed as boosting with an exponential loss, while Gradient Boosting generalizes the idea to any differentiable loss function.
  12. Popular implementations of Gradient Boosting include XGBoost, LightGBM, and CatBoost.
  13. The algorithm is less prone to overfitting compared to other…
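
Facts 6–10 describe the core loop: start from a constant model, then repeatedly fit a small tree to the negative gradient of the loss and add a shrunken version of it to the ensemble. Here is a minimal Python sketch of that loop for squared-error regression; the dataset, tree depth, learning rate, and number of rounds are arbitrary illustrative choices, not defaults of any particular library.

```python
# Minimal sketch of gradient boosting for regression with squared-error loss.
# Each round fits a small tree to the negative gradient of the loss, which
# for squared error is simply the residual y - current_prediction.

import numpy as np
from sklearn.datasets import make_regression
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=500, n_features=5, noise=10.0, random_state=0)

n_rounds = 100          # number of boosting rounds (trees)
learning_rate = 0.1     # shrinkage applied to each tree's contribution

# Start from a constant prediction: the mean of the targets.
prediction = np.full(len(y), y.mean())
trees = []

for _ in range(n_rounds):
    # Negative gradient of 0.5 * (y - F(x))^2 with respect to F(x) is the residual.
    residuals = y - prediction

    # Fit a weak learner (a shallow tree) to the residuals.
    tree = DecisionTreeRegressor(max_depth=3)
    tree.fit(X, residuals)
    trees.append(tree)

    # Take a small step in the direction of the fitted negative gradient.
    prediction += learning_rate * tree.predict(X)

print("training MSE:", np.mean((y - prediction) ** 2))
```

In practice you would reach for one of the implementations from fact 12 (XGBoost, LightGBM, CatBoost) or scikit-learn’s GradientBoostingRegressor, which add regularization, subsampling, and optimized tree construction on top of this basic loop.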
