Optimizer-PyTorch
A package of optimizers implemented with PyTorch.
Optimizer List
SGD: Stochastic Gradient Descent
- https://github.com/pytorch/pytorch/blob/master/torch/optim/sgd.py
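SGD ships with PyTorch itself, so it can be used directly; a minimal training step with the built-in `torch.optim.SGD` (toy model and data, for illustration only):

```python
import torch
from torch import nn

model = nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

# one training step on a toy batch
x, y = torch.randn(32, 10), torch.randn(32, 1)
loss = nn.functional.mse_loss(model(x), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```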
Adam: A Method for Stochastic Optimization
- https://arxiv.org/abs/1412.6980
- https://openreview.net/forum?id=ryQu7f-RZ
- https://github.com/pytorch/pytorch/blob/master/torch/optim/adam.py
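For reference, a single-tensor sketch of the update in Algorithm 1 of the paper (the shipped `torch.optim.Adam` additionally handles parameter groups, weight decay, and AMSGrad):

```python
import torch

def adam_step(p, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update (Algorithm 1) for a single tensor; t is 1-based."""
    m.mul_(beta1).add_(grad, alpha=1 - beta1)            # first-moment estimate
    v.mul_(beta2).addcmul_(grad, grad, value=1 - beta2)  # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                         # bias correction
    v_hat = v / (1 - beta2 ** t)
    p.add_(m_hat / (v_hat.sqrt() + eps), alpha=-lr)      # parameter update

# per-parameter state starts at zero:
# m, v = torch.zeros_like(p), torch.zeros_like(p)
```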
AdaBound: Adaptive Gradient Methods with Dynamic Bound of Learning Rate
- https://arxiv.org/abs/1902.09843
- https://openreview.net/forum?id=Bkg3g2R9FX
- https://github.com/Luolc/AdaBound
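Usage along the lines of the Luolc/AdaBound README (pip package `adabound`; parameter names may change, so verify against the repo):

```python
import adabound  # pip install adabound -- API per the Luolc/AdaBound README
from torch import nn

model = nn.Linear(10, 1)
# final_lr is the SGD-like rate the adaptive bounds converge to
optimizer = adabound.AdaBound(model.parameters(), lr=1e-3, final_lr=0.1)
```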
RAdam: On the Variance of the Adaptive Learning Rate and Beyond
- https://arxiv.org/abs/1908.03265
- https://github.com/LiyuanLucasLiu/RAdam
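A hedged usage sketch, assuming `radam.py` from the linked repo is on the import path (check the repo for the current constructor signature):

```python
from torch import nn
from radam import RAdam  # radam.py from LiyuanLucasLiu/RAdam (assumed importable)

model = nn.Linear(10, 1)
optimizer = RAdam(model.parameters(), lr=1e-3)
```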
Lookahead Optimizer: k steps forward, 1 step back
- https://arxiv.org/abs/1907.08610
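Lookahead wraps an inner optimizer rather than replacing it: the fast weights take k ordinary steps, then the slow weights move a fraction alpha toward them and the fast weights are reset. A minimal sketch of that idea (not the authors' implementation):

```python
import torch

class Lookahead:
    """Sketch of the Lookahead idea: run any inner optimizer for k fast
    steps, then interpolate the slow weights toward the fast weights."""
    def __init__(self, base_optimizer, k=5, alpha=0.5):
        self.base, self.k, self.alpha, self.count = base_optimizer, k, alpha, 0
        self.slow = [[p.detach().clone() for p in g["params"]]
                     for g in base_optimizer.param_groups]

    def zero_grad(self):
        self.base.zero_grad()

    def step(self):
        self.base.step()              # one fast step
        self.count += 1
        if self.count % self.k == 0:  # every k steps: slow += alpha*(fast - slow)
            for group, slow_group in zip(self.base.param_groups, self.slow):
                for p, slow in zip(group["params"], slow_group):
                    slow.add_(p.detach() - slow, alpha=self.alpha)
                    p.data.copy_(slow)  # reset fast weights to slow weights

# e.g. optimizer = Lookahead(torch.optim.Adam(model.parameters()), k=5, alpha=0.5)
```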
Optimistic (optimistic gradient updates; see the sketch under OMD below)
- https://github.com/bruno-31/diff-game/blob/master/optimizers.py
OptimAdam: Adam with an optimistic gradient correction
- https://github.com/kojino/GAN-Convergence/blob/master/script/optimizer.py
OMD: Optimistic Mirror Descent
- https://github.com/GauthierGidel/Variational-Inequality-GAN/blob/master/optim/omd.py
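The Optimistic, OptimAdam, and OMD entries build on the same idea: reuse the previous gradient as a prediction of the next one. A minimal Euclidean sketch (the linked omd.py is the real implementation; mirror descent generalizes this beyond the Euclidean case):

```python
import torch

def omd_step(params, state, lr=1e-3):
    """Euclidean sketch of an optimistic update:
    w <- w - 2*lr*g_t + lr*g_{t-1}. The previous gradient serves as a
    prediction of the next one, which stabilizes min-max (GAN) training."""
    for i, p in enumerate(params):
        if p.grad is None:
            continue
        g_prev = state.get(i, torch.zeros_like(p))
        p.data.add_(lr * g_prev - 2.0 * lr * p.grad)
        state[i] = p.grad.detach().clone()

# usage: state = {}; call omd_step(list(model.parameters()), state) after each backward()
```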
ExtraGradient
- https://github.com/GauthierGidel/Variational-Inequality-GAN/blob/master/optim/extragradient.py
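Extragradient takes an extrapolation step, re-evaluates the gradient at the extrapolated point, and applies that gradient from the original point. A sketch costing two forward/backward passes per step (`loss_fn` is a closure that recomputes the loss from the current parameters):

```python
import torch

def extragradient_step(params, loss_fn, lr=1e-3):
    """Sketch of one extragradient step: extrapolate to w - lr*g(w),
    re-evaluate the gradient there, then update from the ORIGINAL point."""
    saved = [p.detach().clone() for p in params]

    # extrapolation step: w_half = w - lr * grad(w)
    grads = torch.autograd.grad(loss_fn(), params)
    with torch.no_grad():
        for p, g in zip(params, grads):
            p.add_(g, alpha=-lr)

    # update step: w_new = w_old - lr * grad(w_half)
    grads = torch.autograd.grad(loss_fn(), params)
    with torch.no_grad():
        for p, old, g in zip(params, saved, grads):
            p.copy_(old - lr * g)
```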
STORM: STOchastic Recursive Momentum
- Momentum-Based Variance Reduction in Non-Convex SGD
- http://papers.nips.cc/paper/9659-momentum-based-variance-reduction-in-non-convex-sgd
- https://github.com/google-research/google-research/blob/master/storm_optimizer/storm_optimizer.py (TensorFlow)
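A fixed-step sketch of the STORM estimator (the actual algorithm adapts both the learning rate and the momentum weight `a` from observed gradient norms; see the paper and the linked TensorFlow code). Note the correction term needs the current minibatch's gradient at the previous iterate, so each step costs two gradient evaluations:

```python
import torch

def storm_step(params, grads, grads_prev, state, lr=0.1, a=0.1):
    """Fixed-step sketch of the STORM update:
        d_t = g_t + (1 - a) * (d_{t-1} - g_prev_t)
        x   <- x - lr * d_t
    grads: current-batch gradients at the current iterate.
    grads_prev: the SAME batch's gradients at the previous iterate."""
    for i, (p, g, g_prev) in enumerate(zip(params, grads, grads_prev)):
        d = state.get(i)
        d = g.clone() if d is None else g + (1 - a) * (d - g_prev)
        p.data.add_(d, alpha=-lr)
        state[i] = d.detach()
```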
Others
- https://ruder.io/optimizing-gradient-descent/index.html
- https://github.com/lifeiteng/Optimizers
- http://stanford.edu/~boyd/
- http://www.athenasc.com/nonlinbook.html