Here’s a list of 100 technical facts about Decision Trees:
- A Decision Tree is a recursive binary tree structure used for both classification and regression.
- Each internal node in a Decision Tree represents a decision, i.e. a test on a specific feature.
- The root node is the topmost decision node, and leaf nodes hold the final predictions.
- Decision Trees use criteria such as Gini impurity or entropy to choose the best split at each node (both are computed in the first sketch after this list).
- Gini impurity measures the probability of misclassifying a randomly chosen element if it were labeled at random according to the node's class distribution.
- Entropy measures the disorder (impurity) of the class distribution: it is zero for a pure node and maximal when the classes are evenly mixed.
- Decision Trees are prone to overfitting, capturing noise in the data.
- Pruning prevents overfitting by removing branches that add little predictive power (see the pruning sketch after this list).
- Information Gain measures how effectively a split on a feature reduces impurity: the parent's entropy minus the size-weighted entropy of its children (see the information-gain sketch after this list).
- Decision Trees can handle both categorical and numerical features.
- The CART algorithm (Classification and Regression Trees) is commonly used for Decision Trees.
- Regression Trees predict continuous values, while Classification Trees predict discrete class labels; both are contrasted in the last sketch after this list.
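
To make the split criteria concrete, here is a minimal NumPy sketch of both measures. The array `y` and the helper names `gini` and `entropy` are illustrative choices, not part of any particular library.

```python
import numpy as np

def gini(labels):
    # Probability of mislabeling a random sample if it were labeled at
    # random according to the node's class distribution: 1 - sum(p_k^2).
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def entropy(labels):
    # Shannon entropy of the class distribution, in bits: -sum(p_k * log2(p_k)).
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

y = np.array([0, 0, 1, 1, 1, 1])   # a 2:4 class mix
print(gini(y))                     # 0.444...
print(entropy(y))                  # 0.918... bits
```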
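
Information Gain follows directly from the entropy measure above. This sketch repeats the `entropy` helper so it runs on its own; the hand-picked split is purely illustrative.

```python
import numpy as np

def entropy(labels):
    # Same entropy helper as in the previous sketch.
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(parent, left, right):
    # Parent entropy minus the size-weighted entropy of the two children.
    n = len(parent)
    weighted = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(parent) - weighted

parent = np.array([0, 0, 1, 1, 1, 1])
left, right = parent[:2], parent[2:]          # a perfect split: each side is pure
print(information_gain(parent, left, right))  # 0.918...: all entropy removed
```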
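
One concrete way to prune is scikit-learn's minimal cost-complexity pruning, enabled by the `ccp_alpha` parameter of `DecisionTreeClassifier`. The alpha value below is an arbitrary illustration; in practice it would be tuned, for example by cross-validating over the alphas returned by `cost_complexity_pruning_path`.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Grown without limits, the tree fits the training set (noise included).
full = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# ccp_alpha > 0 collapses branches whose impurity reduction does not
# justify their added complexity (post-pruning).
pruned = DecisionTreeClassifier(ccp_alpha=0.02, random_state=0).fit(X_train, y_train)

print("unpruned:", full.get_n_leaves(), "leaves, test acc", full.score(X_test, y_test))
print("pruned:  ", pruned.get_n_leaves(), "leaves, test acc", pruned.score(X_test, y_test))
```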
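
Finally, a small sketch contrasting the two tree types using scikit-learn's CART implementation. The toy data and the `max_depth=2` cap are made up for the example.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, DecisionTreeRegressor

X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0], [6.0]])

# Classification tree: each leaf predicts its majority class label.
clf = DecisionTreeClassifier(max_depth=2).fit(X, [0, 0, 0, 1, 1, 1])
print(clf.predict([[2.5], [4.5]]))   # discrete labels, e.g. [0 1]

# Regression tree: each leaf predicts the mean target of its samples.
reg = DecisionTreeRegressor(max_depth=2).fit(X, [1.1, 1.9, 3.2, 3.9, 5.1, 6.0])
print(reg.predict([[2.5], [4.5]]))   # continuous values (leaf means)
```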