Decision Tree — Information Gain & Splitting
Recursive binary splitting using entropy and information gain
Controls: points per class (40), max depth (3), min leaf size (4)
Buttons: Regenerate Data · Step Split · Full Tree · Reset
Entropy: H = −Σᵢ pᵢ log₂(pᵢ), where pᵢ is the proportion of class i at the node.
Information gain: IG = H(parent) − Σ (|child|/|parent|)·H(child), summed over the two children of a binary split.
At each node, the best split is the candidate that maximizes information gain.
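The greedy procedure behind the demo can be sketched as follows: compute entropy from class proportions, score each candidate axis-aligned threshold by information gain, and recurse on the winning split until a stopping rule fires. This is a minimal sketch, not the demo's actual source; the function names (`entropy`, `info_gain`, `best_split`, `build_tree`) and the dict-based tree representation are assumptions, while the stopping controls mirror the demo's max-depth and min-leaf settings.

```python
import math
from collections import Counter

def entropy(labels):
    """H = -Σ pᵢ · log₂(pᵢ) over the class proportions in `labels`."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(parent, left, right):
    """IG = H(parent) - Σ (|child|/|parent|) · H(child) for a binary split."""
    n = len(parent)
    return (entropy(parent)
            - (len(left) / n) * entropy(left)
            - (len(right) / n) * entropy(right))

def best_split(points, labels):
    """Scan axis-aligned thresholds on every feature; return the
    (feature, threshold, gain) triple that maximizes information gain."""
    best = (None, None, 0.0)
    for f in range(len(points[0])):
        for t in sorted({p[f] for p in points}):
            left = [y for p, y in zip(points, labels) if p[f] <= t]
            right = [y for p, y in zip(points, labels) if p[f] > t]
            if not left or not right:
                continue  # degenerate split: all points on one side
            gain = info_gain(labels, left, right)
            if gain > best[2]:
                best = (f, t, gain)
    return best

def build_tree(points, labels, depth=0, max_depth=3, min_leaf=4):
    """Recursive binary splitting with the demo's stopping controls:
    stop on purity, at max depth, on zero gain, or when a child
    would fall below the minimum leaf size."""
    majority = Counter(labels).most_common(1)[0][0]
    if depth >= max_depth or len(set(labels)) == 1:
        return {"label": majority}
    f, t, gain = best_split(points, labels)
    if f is None or gain <= 0:
        return {"label": majority}
    li = [i for i, p in enumerate(points) if p[f] <= t]
    ri = [i for i, p in enumerate(points) if p[f] > t]
    if len(li) < min_leaf or len(ri) < min_leaf:
        return {"label": majority}
    return {"feature": f, "threshold": t,
            "left": build_tree([points[i] for i in li], [labels[i] for i in li],
                               depth + 1, max_depth, min_leaf),
            "right": build_tree([points[i] for i in ri], [labels[i] for i in ri],
                                depth + 1, max_depth, min_leaf)}
```

On a perfectly separable one-feature dataset such as x = 1, 2, 3, 4 with labels 0, 0, 1, 1, the threshold x ≤ 2 yields IG = 1.0 (parent entropy 1, both children pure), so the tree resolves in a single split.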
Depth: 0 | Nodes: 1