Decision Tree Ppt
Decision trees mimic human decision-making processes, making them intuitive and easy to interpret.
Example:
Imagine you're deciding whether to play tennis based on weather conditions. A decision tree can help by splitting decisions based on
factors like Outlook, Temperature, Humidity, and Wind.
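As a minimal sketch of this example with scikit-learn (the weather rows below are made-up toy values for illustration, not a real dataset):

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

# Made-up play-tennis rows for illustration
data = pd.DataFrame({
    "Outlook":     ["Sunny", "Sunny", "Overcast", "Rain", "Rain", "Overcast", "Sunny", "Rain"],
    "Temperature": ["Hot", "Hot", "Hot", "Mild", "Cool", "Cool", "Mild", "Mild"],
    "Humidity":    ["High", "High", "High", "High", "Normal", "Normal", "Normal", "Normal"],
    "Wind":        ["Weak", "Strong", "Weak", "Weak", "Weak", "Strong", "Weak", "Strong"],
    "Play":        ["No", "No", "Yes", "Yes", "Yes", "Yes", "Yes", "No"],
})

# One-hot encode the categorical features so the tree can split on them
X = pd.get_dummies(data[["Outlook", "Temperature", "Humidity", "Wind"]])
y = data["Play"]

tree = DecisionTreeClassifier(criterion="entropy", random_state=0).fit(X, y)
print(export_text(tree, feature_names=list(X.columns)))  # readable if/else rules
```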
How Do Decision Trees Work?
Building a decision tree involves recursively splitting the dataset into subsets based on feature values. The goal is to create
homogeneous subsets where the target variable is consistent.
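To make the recursion concrete, here is a from-scratch sketch of the core loop, assuming numeric features and integer-coded class labels. It is illustrative only; production libraries implement this far more efficiently:

```python
import numpy as np

def gini(y):
    """Gini impurity: 1 - sum of squared class proportions."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split(X, y):
    """Return the (feature, threshold) pair minimizing weighted child impurity."""
    best_j, best_t, best_score = None, None, gini(y)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left = X[:, j] <= t
            if left.all() or (~left).all():
                continue  # split puts everything on one side
            score = left.mean() * gini(y[left]) + (~left).mean() * gini(y[~left])
            if score < best_score:
                best_j, best_t, best_score = j, t, score
    return best_j, best_t

def build_tree(X, y, depth=0, max_depth=3):
    """Recursively split until a node is pure or max_depth is reached."""
    if depth == max_depth or gini(y) == 0.0:
        return {"leaf": int(np.bincount(y).argmax())}  # majority class
    j, t = best_split(X, y)
    if j is None:  # no split reduces impurity further
        return {"leaf": int(np.bincount(y).argmax())}
    mask = X[:, j] <= t
    return {"feature": j, "threshold": float(t),
            "left": build_tree(X[mask], y[mask], depth + 1, max_depth),
            "right": build_tree(X[~mask], y[~mask], depth + 1, max_depth)}

# Tiny demo: two numeric features, integer-coded labels (an assumption of this sketch)
X_demo = np.array([[2.0, 1.0], [1.0, 3.0], [3.0, 2.0], [2.5, 3.5]])
y_demo = np.array([0, 1, 0, 1])
print(build_tree(X_demo, y_demo))
```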
Classification Trees: predict a discrete class label; node splits are scored with impurity measures such as Gini impurity (1 - Σ p_i^2) or entropy (-Σ p_i log2 p_i).
Regression Trees: predict a continuous value; node splits are scored by variance reduction, e.g. minimizing mean squared error (MSE).
Other Variants: classic algorithms include ID3, C4.5 (which introduced Gain Ratio), and CART, which underlies most modern implementations.
Splitting Criterion: at each node, the tree chooses the feature (and threshold) whose split yields the purest child nodes under the chosen measure.
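In scikit-learn, for instance, the splitting criterion is simply a constructor argument, which makes the classification/regression distinction concrete (the parameter names below are sklearn's own):

```python
from sklearn.tree import DecisionTreeClassifier, DecisionTreeRegressor

# Classification: impurity-based criteria
clf_gini = DecisionTreeClassifier(criterion="gini")
clf_entropy = DecisionTreeClassifier(criterion="entropy")  # information gain

# Regression: variance-reduction criterion
reg = DecisionTreeRegressor(criterion="squared_error")  # minimizes MSE per split
```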
Characteristics:
1. Interpretability:
○ Easy to visualize and understand.
○ Decisions can be traced back through the tree.
2. No Need for Feature Scaling:
○ Works with both numerical and categorical data.
3. Handles Non-linear Relationships:
○ Can capture complex interactions between features.
4. Feature Selection:
○ Automatically selects the most significant features for splits.
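For instance, scikit-learn exposes impurity-based importances through the feature_importances_ attribute; a minimal sketch on the built-in iris dataset:

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

iris = load_iris()
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(iris.data, iris.target)

# Impurity-based importances: how much each feature reduced impurity across splits
for name, score in zip(iris.feature_names, tree.feature_importances_):
    print(f"{name}: {score:.3f}")
```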
Advantages and Disadvantages
The characteristics listed above (interpretability, no need for feature scaling, handling of non-linear relationships, and automatic feature selection) double as decision trees' main advantages. The key disadvantages follow.
Disadvantages
1. Overfitting:
○ Trees can become overly complex, capturing noise in the data.
○ Pruning and setting depth limits can mitigate this (see the sketch after this list).
2. Instability:
○ Small changes in data can lead to different trees.
3. Bias Towards Features with More Levels:
○ Features with many unique values may dominate splits (mitigated by metrics like Gain Ratio).
4. Performance:
○ A single tree is often less accurate than ensemble methods like Random Forests or Gradient Boosted Trees (also compared in the sketch after this list).
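A minimal sketch of these mitigations with scikit-learn: a depth limit plus cost-complexity pruning (ccp_alpha, sklearn's actual parameter) against an unconstrained tree, with a Random Forest as the ensemble baseline. The dataset is just a convenient built-in:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

models = {
    # Unconstrained: grows until leaves are pure, prone to memorizing noise
    "full tree": DecisionTreeClassifier(random_state=0),
    # Depth limit plus cost-complexity pruning to cap complexity
    "pruned tree": DecisionTreeClassifier(max_depth=4, ccp_alpha=0.01, random_state=0),
    # Averaging many randomized trees reduces variance/instability
    "random forest": RandomForestClassifier(n_estimators=100, random_state=0),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```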