Repositories tagged with the optimizers topic
awesome-optimizers
Neural network optimizers implemented from scratch in NumPy (Adam, Adadelta, RMSProp, SGD, etc.).
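For reference, the core update such from-scratch implementations revolve around fits in a few lines of NumPy. Below is a minimal sketch of one Adam step following the standard Kingma & Ba (2015) formulation; the function name and signature are illustrative, not this repo's API.

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update step on a NumPy array (illustrative sketch)."""
    m = beta1 * m + (1 - beta1) * grad       # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2  # second-moment (uncentered variance) estimate
    m_hat = m / (1 - beta1 ** t)             # bias correction for warm-up steps
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v

# Usage: minimize f(w) = w**2 from w = 5.
w = np.array([5.0])
m, v = np.zeros_like(w), np.zeros_like(w)
for t in range(1, 1001):
    grad = 2 * w  # analytic gradient of f
    w, m, v = adam_step(w, grad, m, v, t)
```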
flaxOptimizers
A collection of optimizers for Flax, some well known and some arcane.
annotated_deep_learning_paper_implementations
🧑‍🏫 60 Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), gan...
Gradient-Centralization-TensorFlow
Instantly improve the training performance of your TensorFlow models with just two lines of code!
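The underlying technique, Gradient Centralization, simply removes the mean from each slice of a weight gradient before the optimizer applies it. Here is a minimal NumPy sketch of that operation, assuming the convention where slices are taken along the first axis; it illustrates the idea only and is not this repo's API.

```python
import numpy as np

def centralize_gradient(grad):
    """Gradient Centralization (Yong et al., 2020), illustrative sketch:
    for weights with rank > 1, subtract the mean of the gradient taken
    over all axes except the first, giving each slice zero mean."""
    if grad.ndim > 1:
        axes = tuple(range(1, grad.ndim))
        grad = grad - grad.mean(axis=axes, keepdims=True)
    return grad  # bias vectors and other rank-1 tensors pass through unchanged
```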
keras-adamw
Keras/TF implementation of AdamW, SGDW, NadamW, warm restarts, and learning-rate multipliers.
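AdamW's key idea is decoupled weight decay: the decay term is applied directly to the parameters instead of being folded into the gradient that feeds the moment estimates. A minimal sketch of one such step, with illustrative names rather than this repo's API:

```python
import numpy as np

def adamw_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
               eps=1e-8, weight_decay=1e-2):
    """One AdamW step (Loshchilov & Hutter, 2019), illustrative sketch."""
    m = beta1 * m + (1 - beta1) * grad       # moments see the raw gradient only
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Decay is added to the update directly, decoupled from adaptive scaling.
    param = param - lr * (m_hat / (np.sqrt(v_hat) + eps) + weight_decay * param)
    return param, m, v
```

By contrast, plain Adam with L2 regularization would add `weight_decay * param` to `grad` before computing `m` and `v`, entangling the decay with the adaptive scaling.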
keras-radam
RAdam implemented in Keras & TensorFlow
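RAdam rectifies the variance of Adam's adaptive learning rate during the first steps of training, falling back to a plain momentum update until enough gradient history has accumulated. The rectification term can be sketched as follows (the threshold of 4 follows the paper; this is an illustration, not this repo's code):

```python
import math

def radam_rectifier(t, beta2=0.999):
    """Variance-rectification term from RAdam (Liu et al., 2020).
    Returns (use_adaptive, r_t): while rho_t <= 4 the adaptive step is
    unreliable, so the optimizer should use an un-rectified momentum update."""
    rho_inf = 2.0 / (1.0 - beta2) - 1.0
    rho_t = rho_inf - 2.0 * t * beta2 ** t / (1.0 - beta2 ** t)
    if rho_t <= 4.0:
        return False, 0.0
    r_t = math.sqrt((rho_t - 4.0) * (rho_t - 2.0) * rho_inf /
                    ((rho_inf - 4.0) * (rho_inf - 2.0) * rho_t))
    return True, r_t
```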
axon
Nx-powered neural networks for Elixir.
frostnet
FrostNet: Towards Quantization-Aware Network Architecture Search
Gradient-Centralization
Gradient Centralization, a new optimization technique for deep neural networks.
opytimizer
🐦 Opytimizer is a Python library consisting of meta-heuristic optimization algorithms.
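For a flavor of what such gradient-free meta-heuristics look like, here is a minimal, generic particle swarm optimizer in NumPy. It is a sketch of the technique only, not opytimizer's API; all names here are ours.

```python
import numpy as np

def pso(f, dim, n_particles=30, iters=200, lo=-5.0, hi=5.0,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimization over a box-constrained domain."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (n_particles, dim))   # particle positions
    v = np.zeros_like(x)                          # particle velocities
    pbest = x.copy()                              # per-particle best position
    pbest_val = np.apply_along_axis(f, 1, x)      # per-particle best value
    g = pbest[pbest_val.argmin()].copy()          # swarm-wide best position
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        # Pull each particle toward its own best and the swarm's best.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        val = np.apply_along_axis(f, 1, x)
        improved = val < pbest_val
        pbest[improved] = x[improved]
        pbest_val[improved] = val[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, float(pbest_val.min())

# Usage: pso(lambda z: float(np.sum(z ** 2)), dim=2) converges near the origin.
```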