XGBoost is a popular and powerful gradient boosting library, and it comes with a variety of hyperparameters that can be tuned to optimize model performance. Here’s a list of some important XGBoost hyperparameters, with brief guidance on how to tune each one:
I. 10 Most Common XGBoost Hyperparameters:
1. Learning Rate (eta or learning_rate):
- Description: Controls the contribution of each tree to the final prediction. Lower values make the algorithm more robust but require more trees.
- Tuning: Typically set between 0.01 and 0.3. Use a lower value for more conservative boosting.
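To make this trade-off concrete, here is a minimal sketch (not from the article) that pits an aggressive learning rate against a conservative one; the synthetic dataset and the exact parameter values are purely illustrative assumptions:

```python
# Illustrative sketch: an aggressive vs. a conservative learning rate.
# Dataset and parameter values are assumptions, not from the article.
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Aggressive: large steps per tree, so fewer trees are needed.
fast = XGBClassifier(learning_rate=0.3, n_estimators=100, random_state=42)
# Conservative: small steps per tree, compensated with many more trees.
slow = XGBClassifier(learning_rate=0.03, n_estimators=1000, random_state=42)

for name, model in [("eta=0.3", fast), ("eta=0.03", slow)]:
    model.fit(X_train, y_train)
    print(name, accuracy_score(y_test, model.predict(X_test)))
```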
2. Number of Trees (n_estimators):
- Description: The number of boosting rounds or trees to build.
- Tuning: Generally, a higher number of trees improves performance, but it comes with increased computation time. Use cross-validation to find an optimal value.
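One common way to let cross-validation pick the number of trees is the native xgb.cv helper with early stopping; a minimal sketch, again with assumed data and settings:

```python
# Illustrative sketch: xgb.cv with early stopping picks the number of
# boosting rounds automatically. Data and settings are assumptions.
import xgboost as xgb
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
dtrain = xgb.DMatrix(X, label=y)

params = {"objective": "binary:logistic", "eta": 0.1, "max_depth": 6}
cv_results = xgb.cv(
    params,
    dtrain,
    num_boost_round=1000,      # upper bound on the number of trees
    nfold=5,                   # 5-fold cross-validation
    early_stopping_rounds=20,  # stop once the CV metric stalls
    metrics="logloss",
    seed=42,
)
# With early stopping, the returned frame is cut at the best iteration.
print("optimal n_estimators:", len(cv_results))
```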
3. Maximum Depth of a Tree (max_depth):
- Description: Maximum depth of a tree, which controls the complexity of the individual trees.
- Tuning: Tune along with min_child_weight. Start with a small…
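One way to tune these two parameters together, as the item above suggests, is a small grid search; a minimal sketch assuming scikit-learn’s GridSearchCV, with illustrative data and grid values:

```python
# Illustrative sketch: tuning max_depth jointly with min_child_weight
# via grid search. The grid values and data are assumptions.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from xgboost import XGBClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=42)

param_grid = {
    "max_depth": [3, 5, 7],          # start shallow, deepen if underfitting
    "min_child_weight": [1, 5, 10],  # larger values make splits more conservative
}
search = GridSearchCV(
    XGBClassifier(learning_rate=0.1, n_estimators=200, random_state=42),
    param_grid,
    scoring="neg_log_loss",
    cv=3,
)
search.fit(X, y)
print(search.best_params_)
```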