The Vapnik–Chervonenkis (VC) dimension measures the capacity of a hypothesis class: the size of the largest set of points it can shatter, i.e. realize every one of the 2ⁿ possible ±1 labelings.
What is shattering? A hypothesis class H shatters a set S if for every possible ±1 labeling of S, there exists some h∈H that correctly classifies all points. The VC dimension is the size of the largest set that can be shattered.
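The shattering check can be tested by brute force: enumerate the labelings a hypothesis set realizes on a point set and compare against 2ⁿ. The sketch below (names `shatters` and `thresholds_for` are illustrative, not from the original) uses 1-D threshold classifiers h_t(x) = sign(x − t), a class whose VC dimension is 1:

```python
def shatters(points, hypotheses):
    """True iff the hypothesis set realizes every ±1 labeling of points."""
    realized = {tuple(h(x) for x in points) for h in hypotheses}
    return len(realized) == 2 ** len(points)

def thresholds_for(points):
    """Threshold classifiers h_t(x) = +1 if x >= t else -1.
    One threshold per 'gap' between sorted points suffices:
    any other threshold induces the same labeling on these points."""
    cuts = sorted(points)
    ts = [cuts[0] - 1] + [(a + b) / 2 for a, b in zip(cuts, cuts[1:])] + [cuts[-1] + 1]
    return [lambda x, t=t: 1 if x >= t else -1 for t in ts]

one_pt = [0.0]
two_pts = [0.0, 1.0]
print(shatters(one_pt, thresholds_for(one_pt)))    # True: one point is shattered
print(shatters(two_pts, thresholds_for(two_pts)))  # False: (+1, -1) is unrealizable
```

Thresholds can label any single point either way, but never assign +1 to a left point and −1 to a right one, so no 2-point set is shattered and the VC dimension is exactly 1.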
Fundamental theorem: For a class with VC dimension d, the number of training examples needed for (realizable) PAC learning with accuracy ε and confidence 1−δ is Θ((d + log(1/δ))/ε). More capacity → more data needed.
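To get a feel for the bound, one can plug in numbers. The theorem fixes only the growth rate, not the leading constant, so the `C=1.0` below is purely illustrative:

```python
import math

def sample_bound(d, epsilon, delta, C=1.0):
    """Illustrative sample-size estimate from the realizable PAC bound
    m = Theta((d + log(1/delta)) / epsilon). The constant C is NOT
    determined by the theorem; C = 1 here is an arbitrary choice."""
    return math.ceil(C * (d + math.log(1 / delta)) / epsilon)

# d = 3, epsilon = 0.1, delta = 0.05 → roughly 60 examples (up to constants)
print(sample_bound(d=3, epsilon=0.1, delta=0.05))  # → 60
```

Note how the dependence on the confidence δ is only logarithmic, while accuracy ε and capacity d enter linearly: doubling the VC dimension roughly doubles the data requirement.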
Interactive: Drag the slider to cycle through all 2ⁿ labelings and see which ones the current hypothesis class can realize.