- torch-optimizer (by jettify) -- a collection of optimizers for PyTorch.
- Unofficial implementation of "Switching from Adam to SGD" (SWATS) optimization in PyTorch (by Mrpatekful).
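
For orientation, below is a minimal sketch of how an optimizer from the torch-optimizer collection is typically dropped into a standard PyTorch training step. The package's import name is `torch_optimizer`; the specific optimizer class shown (RAdam) is one example from that collection, and the model, data, and hyperparameters here are placeholders, not anything prescribed by either project.

```python
import torch
import torch.nn as nn
import torch_optimizer as optim  # the torch-optimizer package

# Placeholder model and loss for illustration only.
model = nn.Linear(10, 1)
criterion = nn.MSELoss()

# Any optimizer class from the collection can be used here in place of
# torch.optim optimizers; RAdam is just one example.
optimizer = optim.RAdam(model.parameters(), lr=1e-3)

# Dummy batch to make the snippet self-contained.
x = torch.randn(32, 10)
y = torch.randn(32, 1)

optimizer.zero_grad()          # clear accumulated gradients
loss = criterion(model(x), y)  # forward pass and loss
loss.backward()                # backpropagate
optimizer.step()               # apply the optimizer update
```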