pytorch_optimizer
Plans for pytorch_optimizer v3
In pytorch-optimizer v3, loss functions will be added, so the optimizer, LR scheduler, and loss function will finally all be in one package.
Features
- [x] Support at least 60 optimizers
- [x] Support at least 10 objectives
- [x] Support bitsandbytes (& 4-bit optimizers)
Refactor
- [x] Organize utils
Docs
- [x] Organize documentation
- [x] Support contribution guide (implementation, test, etc...)
- [x] Add issue templates
- [x] Migrate to mkdocs
- [x] Create Q&A page
- [ ] Benchmark on ImageNet
Test
- [x] Organize test cases