Ensemble Learning Techniques: Bagging vs. Boosting — A Comparison

btd
3 min read · Nov 16, 2023


I. Bagging (Bootstrap Aggregating):

1. Definition:

Bagging is an ensemble learning technique in which multiple instances of a model are trained on different subsets of the training data. These subsets are created through bootstrapping, i.e., sampling with replacement. The final prediction is obtained by averaging the individual models' predictions (for regression) or by majority vote (for classification).
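
As a rough sketch of how this looks in practice, the example below uses scikit-learn's BaggingClassifier; the synthetic dataset and the number of estimators are illustrative assumptions, not prescriptions. By default it bags decision trees, each fit on a bootstrap sample, and combines their class predictions by voting.

```python
# Minimal bagging sketch with scikit-learn; dataset and hyperparameters are illustrative.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split

# Synthetic data standing in for a real training set.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# 50 base learners (decision trees by default), each trained on a bootstrap
# sample drawn with replacement; class predictions are combined by voting.
bagging = BaggingClassifier(n_estimators=50, bootstrap=True, random_state=42)
bagging.fit(X_train, y_train)
print("Test accuracy:", bagging.score(X_test, y_test))
```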

2. Key Concepts:

  1. Bootstrapping: Randomly sampling data points with replacement to create multiple subsets for training (see the sketch after this list).
  2. Parallel Training: Models are trained independently in parallel, making bagging suitable for parallel computing.
  3. Diversity: The goal is to introduce diversity among the models, reducing overfitting and improving generalization.
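
The sketch below illustrates the bootstrapping step with plain NumPy; the sample count and number of subsets are arbitrary choices for demonstration. Because sampling is with replacement, some points appear more than once in a subset while others are left out ("out-of-bag").

```python
# Bootstrapping illustration: sample indices with replacement to build subsets.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_subsets = 10, 3

for i in range(n_subsets):
    # Draw n_samples indices with replacement from the original data.
    idx = rng.choice(n_samples, size=n_samples, replace=True)
    # Points never drawn for this subset ("out-of-bag" samples).
    oob = sorted(set(range(n_samples)) - set(idx.tolist()))
    print(f"subset {i}: indices={sorted(idx.tolist())} out-of-bag={oob}")
```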

3. Random Forest:

A popular example of bagging is the Random Forest algorithm, which builds a collection of decision trees. Each tree is trained on a different bootstrap sample and, at each split, considers only a random subset of features, which further decorrelates the trees. The final prediction is obtained by aggregating the predictions of the individual trees.
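
A short example with scikit-learn's RandomForestClassifier follows; the synthetic data and hyperparameters are placeholder assumptions for illustration.

```python
# Minimal Random Forest sketch: each tree sees a bootstrap sample of rows and a
# random subset of features at each split, and the forest votes on the class.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

forest = RandomForestClassifier(n_estimators=100, max_features="sqrt", random_state=0)
forest.fit(X_train, y_train)
print("Test accuracy:", forest.score(X_test, y_test))
```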

II. Boosting:
