Mean-Field Theory of Neural Networks

Poole et al. 2016 / Yang & Schoenholz 2017: signal propagation in infinitely wide random networks passes through ordered, chaotic, and critical phases, determined by the weight and bias variances (σ_w², σ_b²).

Length recursion: q^l = σ_w² ∫ Dz φ(√(q^{l−1}) z)² + σ_b², where q^l is the mean squared pre-activation at layer l, φ is the activation (ReLU here), and Dz is the standard Gaussian measure.
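The recursion can be checked numerically. A minimal sketch (function names are illustrative, assuming numpy): a Monte Carlo estimate of the Gaussian integral, plus the closed form it reduces to for ReLU, since ∫ Dz ReLU(√q z)² = q/2.

```python
import numpy as np

def relu_variance_map(q_prev, sigma_w2, sigma_b2, n_samples=1_000_000, seed=0):
    """One step of q^l = sigma_w^2 * E[phi(sqrt(q^{l-1}) z)^2] + sigma_b^2,
    estimated by Monte Carlo over z ~ N(0, 1), with phi = ReLU."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_samples)
    phi = np.maximum(np.sqrt(q_prev) * z, 0.0)  # ReLU of a Gaussian pre-activation
    return sigma_w2 * np.mean(phi**2) + sigma_b2

def relu_variance_map_exact(q_prev, sigma_w2, sigma_b2):
    """For ReLU the integral is exact: E[phi(sqrt(q) z)^2] = q / 2,
    so the length map reduces to q^l = (sigma_w^2 / 2) * q^{l-1} + sigma_b^2."""
    return 0.5 * sigma_w2 * q_prev + sigma_b2
```

Iterating the map shows the three regimes directly: with σ_w² = 2, σ_b² = 0 the variance is preserved (q^l = q^{l−1}); with σ_w² < 2 it decays geometrically; with σ_w² > 2 it blows up.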
Correlation: c^l(x, x′) = q^l_{12} / √(q^l_{11} q^l_{22}), where q^l_{ab} is the covariance of the pre-activations for inputs x_a, x_b; it tracks how two inputs' representations correlate through depth.
Phase diagram: let χ = σ_w² ∫ Dz φ′(√(q*) z)² denote the slope of the correlation map at its fixed point c* = 1. Ordered phase (c → 1, χ < 1): nearby inputs collapse to the same representation and gradients vanish. Chaotic phase (c → c* < 1, χ > 1): inputs decorrelate and gradients explode. Edge of chaos (χ = 1): the critical line in the (σ_w², σ_b²) plane along which signals propagate furthest, setting the maximum depth of trainable networks.
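The stability parameter χ is easy to estimate. For ReLU, φ′(x)² = 1{x > 0}, so the Gaussian integral equals 1/2 and χ = σ_w²/2 independent of q*. A minimal Monte Carlo sketch, assuming numpy (function name illustrative):

```python
import numpy as np

def chi(sigma_w2, qstar=1.0, n_samples=1_000_000, seed=0):
    """Monte Carlo estimate of chi = sigma_w^2 * E[phi'(sqrt(q*) z)^2]
    for phi = ReLU, where phi'(x)^2 = 1{x > 0}.
    Analytically this is sigma_w^2 / 2, so chi = 1 exactly at sigma_w^2 = 2."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_samples)
    return sigma_w2 * np.mean((np.sqrt(qstar) * z > 0).astype(float))
```

Reading off the phases: chi(1.0) ≈ 0.5 (ordered), chi(2.0) ≈ 1.0 (critical), chi(3.0) ≈ 1.5 (chaotic).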
For ReLU the critical line degenerates to a single point: σ_w² = 2, σ_b² = 0 (He initialization). This follows from the length map q^l = (σ_w²/2) q^{l−1} + σ_b²: a bounded, nonzero fixed point with χ = σ_w²/2 = 1 forces exactly these values.
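A finite-width sanity check of this claim: propagate one input through a random ReLU network and watch q^l = mean(h²) across layers. This is a sketch under illustrative choices of width, depth, and function name, assuming numpy; at finite width q^l fluctuates around the mean-field prediction.

```python
import numpy as np

def forward_variance(sigma_w2, sigma_b2, width=2000, depth=30, seed=0):
    """Propagate one input through a random ReLU net and return the
    per-layer second moment of the pre-activations, q^l = mean(h^2).
    Weights ~ N(0, sigma_w^2 / fan_in), biases ~ N(0, sigma_b^2)."""
    rng = np.random.default_rng(seed)
    h = rng.standard_normal(width)  # input with q^0 ~ 1
    qs = []
    for _ in range(depth):
        W = rng.standard_normal((width, width)) * np.sqrt(sigma_w2 / width)
        b = rng.standard_normal(width) * np.sqrt(sigma_b2)
        h = W @ np.maximum(h, 0.0) + b
        qs.append(np.mean(h**2))
    return np.array(qs)

qs_critical = forward_variance(2.0, 0.0)  # He init: q^l hovers near q^0
qs_ordered  = forward_variance(1.0, 0.0)  # q^l halves every layer
qs_chaotic  = forward_variance(3.0, 0.0)  # q^l grows 1.5x every layer
```

Off the critical point the signal vanishes or explodes exponentially in depth (0.5^30 ≈ 10⁻⁹ vs 1.5^30 ≈ 10⁵ after 30 layers), which is the practical content of the phase diagram.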