Gradient Descent
Loss landscape exploration · learning rate · momentum · Adam
[Interactive demo: sliders set the learning rate (default 0.05) and momentum β (default 0.90); buttons choose the optimizer (SGD, Momentum, Adam) and Reset or Run/Pause the animation. A status readout shows the current step, loss, and active optimizer.]
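The three optimizers in the demo differ only in how they turn a gradient into a parameter update. A minimal sketch of the standard update rules, using the demo's defaults (learning rate 0.05, momentum β = 0.90) and textbook Adam constants (b1 = 0.9, b2 = 0.999); the function names and signatures here are illustrative, not the demo's actual code:

```python
import numpy as np

def sgd_step(theta, grad, lr=0.05):
    # Plain gradient descent: move straight against the gradient.
    return theta - lr * grad

def momentum_step(theta, v, grad, lr=0.05, beta=0.90):
    # Heavy-ball momentum: accumulate a velocity so past gradients
    # keep pushing through flat regions and saddle points.
    v = beta * v + grad
    return theta - lr * v, v

def adam_step(theta, m, v, grad, t, lr=0.05, b1=0.9, b2=0.999, eps=1e-8):
    # Adam: bias-corrected first- and second-moment estimates give
    # a per-parameter adaptive step size (t is the 1-based step count).
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad**2
    m_hat = m / (1 - b1**t)
    v_hat = v / (1 - b2**t)
    return theta - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```

On the demo's surface, momentum tends to coast through the central saddle where plain SGD stalls, while Adam's adaptive scaling keeps step sizes usable in both steep and flat regions.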
The loss surface has a saddle point at center and a local minimum at upper-right; the global minimum is at lower-left.
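A surface with that qualitative shape is easy to build from a shallow bowl plus two Gaussian wells. The function below is a hypothetical stand-in for the demo's surface, not its actual formula: the deeper well near (-2, -2) acts as the global minimum, the shallower well near (2, 2) as the local minimum, and a saddle region sits between them near the origin.

```python
import numpy as np

def loss(p):
    # Assumed demo-like surface: shallow bowl + two Gaussian wells.
    x, y = p
    bowl = 0.05 * (x**2 + y**2)
    deep = -1.5 * np.exp(-((x + 2)**2 + (y + 2)**2))     # global minimum
    shallow = -0.8 * np.exp(-((x - 2)**2 + (y - 2)**2))  # local minimum
    return bowl + deep + shallow

def grad(p, h=1e-6):
    # Central-difference gradient; adequate for a small demo.
    g = np.zeros_like(p)
    for i in range(len(p)):
        e = np.zeros_like(p)
        e[i] = h
        g[i] = (loss(p + e) - loss(p - e)) / (2 * h)
    return g

def descend(p0, lr=0.1, steps=500):
    # Plain gradient descent from a chosen starting point.
    p = np.array(p0, dtype=float)
    for _ in range(steps):
        p -= lr * grad(p)
    return p
```

Starting descent in the upper-right basin traps the trajectory in the shallow local minimum, while starting in the lower-left basin reaches the deeper global minimum, mirroring what the demo animates.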