
Neural Activation

Visualize neural network activations layer by layer. See how different activation functions (ReLU, sigmoid, tanh, GELU, Swish) transform signals, and watch the network's internal state change as you adjust weights and biases.

Activation Functions

[Interactive plots: the selected activation function's output and its derivative]
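The five activation functions plotted above, and their derivatives, can be sketched in plain Python. This is a minimal reference implementation, not the lab's own source; the GELU here uses the common tanh approximation, and Swish is shown in its SiLU form (x · sigmoid(x)):

```python
import math

def relu(x):
    return max(0.0, x)

def relu_prime(x):
    # Derivative is 0 for x < 0, 1 for x > 0 (undefined at 0; 0 by convention here)
    return 1.0 if x > 0 else 0.0

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    s = sigmoid(x)
    return s * (1.0 - s)

def tanh(x):
    return math.tanh(x)

def tanh_prime(x):
    return 1.0 - math.tanh(x) ** 2

def gelu(x):
    # tanh approximation: 0.5x(1 + tanh(sqrt(2/pi)(x + 0.044715 x^3)))
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))

def swish(x):
    # Swish/SiLU: x * sigmoid(x)
    return x * sigmoid(x)

def swish_prime(x):
    s = sigmoid(x)
    return s + x * s * (1.0 - s)
```

Evaluating these over a range of inputs reproduces the curves in the panel; for example, sigmoid's derivative peaks at 0.25 at x = 0, which is why deep sigmoid networks suffer from vanishing gradients.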

Network Visualization
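The layer-by-layer view above amounts to recording every intermediate activation during a forward pass. A minimal sketch of that idea, assuming a small fully connected network with sigmoid activations (the layer sizes and weights here are illustrative, not the lab's defaults):

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(x, weights, biases):
    """Propagate input vector x through the network, keeping every
    layer's activations so they can be visualized afterwards."""
    activations = [x]
    for W, b in zip(weights, biases):
        # z_j = sum_i W[j][i] * a_i + b_j for each neuron j in this layer
        z = [sum(w_ji * a_i for w_ji, a_i in zip(row, activations[-1])) + b_j
             for row, b_j in zip(W, b)]
        activations.append([sigmoid(v) for v in z])
    return activations

# Example: a 2-3-1 network with random weights and zero biases
random.seed(0)
weights = [
    [[random.uniform(-1, 1) for _ in range(2)] for _ in range(3)],  # 2 -> 3
    [[random.uniform(-1, 1) for _ in range(3)] for _ in range(1)],  # 3 -> 1
]
biases = [[0.0] * 3, [0.0]]

acts = forward([0.5, -0.2], weights, biases)
# acts[0] is the input, acts[1] the hidden layer, acts[2] the output
```

Tweaking an entry of `weights` or `biases` and re-running `forward` is exactly what the adjustable sliders do: the recorded `acts` list is the network's internal state at each layer.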