Activation Functions
[Interactive readout: displays the selected activation's output and derivative at the chosen input.]
Visualize neural network activations layer by layer. See how different activation functions (ReLU, sigmoid, tanh, GELU, swish) transform signals, and watch how the network's internal state changes as you adjust the weights and biases.
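As a rough sketch of the math the demo visualizes, here are NumPy implementations of the five activations and their derivatives, evaluated at a sample input to mirror the Output/Derivative readout. The function names, the dictionary layout, and the sample input are illustrative choices, not part of the original page; GELU uses the common tanh approximation rather than the exact erf form.

```python
import numpy as np

def sigmoid(x):
    """Logistic sigmoid: squashes inputs into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_prime(x):
    s = sigmoid(x)
    return s * (1.0 - s)

def relu(x):
    """Rectified linear unit: passes positives, zeroes out negatives."""
    return np.maximum(0.0, x)

def relu_prime(x):
    # Derivative is 1 for x > 0, 0 for x < 0 (undefined at 0; 0 by convention).
    return np.where(x > 0, 1.0, 0.0)

def tanh(x):
    """Hyperbolic tangent: squashes inputs into (-1, 1)."""
    return np.tanh(x)

def tanh_prime(x):
    return 1.0 - np.tanh(x) ** 2

_C = np.sqrt(2.0 / np.pi)  # constant in the tanh-based GELU approximation

def gelu(x):
    """GELU, tanh approximation: 0.5*x*(1 + tanh(c*(x + 0.044715*x^3)))."""
    return 0.5 * x * (1.0 + np.tanh(_C * (x + 0.044715 * x ** 3)))

def gelu_prime(x):
    # Product rule on the tanh approximation above.
    u = _C * (x + 0.044715 * x ** 3)
    du = _C * (1.0 + 3.0 * 0.044715 * x ** 2)
    sech2 = 1.0 - np.tanh(u) ** 2
    return 0.5 * (1.0 + np.tanh(u)) + 0.5 * x * sech2 * du

def swish(x):
    """Swish (SiLU): x * sigmoid(x); smooth, non-monotonic near zero."""
    return x * sigmoid(x)

def swish_prime(x):
    s = sigmoid(x)
    return s * (1.0 + x * (1.0 - s))

ACTIVATIONS = {
    "relu": (relu, relu_prime),
    "sigmoid": (sigmoid, sigmoid_prime),
    "tanh": (tanh, tanh_prime),
    "gelu": (gelu, gelu_prime),
    "swish": (swish, swish_prime),
}

if __name__ == "__main__":
    # Emulate the demo's readout at an example input x = 0.5.
    x = 0.5
    for name, (f, fprime) in ACTIVATIONS.items():
        print(f"{name:>7}: output = {float(f(x)):+.4f}, "
              f"derivative = {float(fprime(x)):+.4f}")
```

Pairing each activation with its derivative is what makes the layer-by-layer view meaningful: the forward pass uses the function itself, while the derivative governs how gradients shrink or pass through that layer during backpropagation.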