
100 Facts About Decision Trees

btd
6 min readNov 28, 2023


Here’s a list of 100 technical facts about Decision Trees:

  1. Decision Trees are tree-structured models, built by recursively partitioning the data, used for both classification and regression.
  2. Each internal node in a Decision Tree represents a test on a specific feature.
  3. The root node is the topmost decision node, and leaf nodes contain the final decisions.
  4. Decision Trees use criteria like Gini impurity or entropy to determine the best split at each node.
  5. Gini impurity measures the probability of misclassifying a randomly chosen element if it were labeled according to the class distribution at that node.
  6. Entropy measures the level of disorder or impurity in a set of data.
  7. Decision Trees are prone to overfitting, capturing noise in the data.
  8. Pruning is a technique to prevent overfitting by removing branches that add little predictive power.
  9. Information Gain measures how much a split on a feature reduces impurity (typically entropy).
  10. Decision Trees can handle both categorical and numerical features.
  11. The CART algorithm (Classification and Regression Trees) builds binary trees and underlies many popular implementations.
  12. Regression Trees predict continuous values, while Classification Trees predict discrete class labels.
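The impurity measures from facts 4-6 and 9 can be sketched in a few lines of plain Python. This is an illustrative implementation, not taken from any particular library:

```python
from collections import Counter
import math

def gini(labels):
    """Gini impurity: probability that a randomly chosen element would be
    misclassified if labeled according to the class distribution."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def entropy(labels):
    """Shannon entropy in bits: measures disorder in the label set."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(parent, left, right, impurity=entropy):
    """Reduction in impurity achieved by splitting parent into left and right."""
    n = len(parent)
    weighted = (len(left) / n) * impurity(left) + (len(right) / n) * impurity(right)
    return impurity(parent) - weighted

labels = ["a", "a", "b", "b"]
print(gini(labels))                                        # 0.5
print(entropy(labels))                                     # 1.0
print(information_gain(labels, ["a", "a"], ["b", "b"]))    # 1.0 (a pure split)
```

A perfectly balanced two-class set gives the maximum Gini (0.5) and entropy (1.0 bit), and a split that separates the classes completely recovers all of that impurity as information gain.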
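For Regression Trees (fact 12), the same machinery applies, but the impurity measure and leaf prediction change: a common choice is variance as the impurity and the mean target value as the leaf output. A minimal sketch of those two pieces:

```python
def variance(values):
    """Impurity for regression trees: variance of the target values.
    Splits are chosen to minimize the weighted variance of the children."""
    m = sum(values) / len(values)
    return sum((v - m) ** 2 for v in values) / len(values)

def leaf_prediction(values):
    """A regression leaf predicts the mean of its training targets."""
    return sum(values) / len(values)

print(variance([1.0, 1.0, 5.0, 5.0]))   # 4.0
print(leaf_prediction([1.0, 1.0, 5.0, 5.0]))  # 3.0
```

Splitting `[1, 1, 5, 5]` into `[1, 1]` and `[5, 5]` drives the children's variance to zero, which is exactly the regression analogue of a pure classification split.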
