Visualization of Optimizers.
The supported optimizers listed below, from GD to NAdam, follow the formulations in arxiv 1609.04747 (An overview of gradient descent optimization algorithms); a brief sketch of one update rule is given after the list.
- GD (Gradient Descent)
- SGD (Stochastic Gradient Descent)
- Momentum
- NAG (Nesterov accelerated gradient)
- Adagrad
- AdaDelta
- RMSProp
- Adam
- AdaMax
- NAdam
- RAdam (arxiv 1908.03265 - On the Variance of the Adaptive Learning Rate and Beyond)
If you would like more optimizers added, please open a request on the Issues tab.
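As an illustration only (not the repository's implementation), the sketch below compares plain gradient descent with Adam on a toy 2-D quadratic, following the update rules in arxiv 1609.04747; the objective, learning rate, and variable names are assumptions made for the example.

```matlab
% Illustrative sketch: one GD trajectory and one Adam trajectory on a toy objective.
f     = @(w) 0.5*(w(1)^2 + 10*w(2)^2);   % toy 2-D quadratic (assumed for the example)
gradf = @(w) [w(1); 10*w(2)];            % its gradient

w_gd = [2; 2];  lr = 0.05;               % gradient descent state
w_ad = [2; 2];  m = zeros(2,1); v = zeros(2,1);   % Adam state
beta1 = 0.9; beta2 = 0.999; epsilon = 1e-8;

for t = 1:100
    % Gradient Descent: w <- w - lr * grad
    w_gd = w_gd - lr * gradf(w_gd);

    % Adam: bias-corrected first and second moment estimates
    g    = gradf(w_ad);
    m    = beta1*m + (1 - beta1)*g;
    v    = beta2*v + (1 - beta2)*(g.^2);
    mhat = m / (1 - beta1^t);
    vhat = v / (1 - beta2^t);
    w_ad = w_ad - lr * mhat ./ (sqrt(vhat) + epsilon);
end

fprintf('GD:   f = %.6f\nAdam: f = %.6f\n', f(w_gd), f(w_ad));
```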
A GUI for testing is also available: open newoptimizer.fig with MATLAB GUIDE and run it.
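For example, from the MATLAB command window (assuming the repository folder is on the MATLAB path and that the usual GUIDE companion file newoptimizer.m is present):

```matlab
% Open the figure in GUIDE for inspection or editing
guide('newoptimizer.fig')

% Launch the GUI directly (assumes the companion newoptimizer.m exists)
newoptimizer
```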