Adds Eve Optimizer
The Eve optimizer was proposed by Hayashi et al. in 2018 (https://arxiv.org/abs/1611.01505). It is a simple framework applicable to adaptive gradient optimizers in general, but the paper applies it specifically to Adam, and that Adam-based algorithm is what is implemented in this PR. A possible future improvement would be to implement Eve as a wrapper that adds its global learning-rate scaling as a chainable scale_by-style transformation around any optimizer; a rough sketch of that idea follows below.
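A minimal sketch of what such a wrapper could look like, for discussion only: the names `scale_by_eve`, `ScaleByEveState`, and the `loss` keyword argument are placeholders rather than optax API, and the d_t update is simplified relative to the paper's full f_hat bookkeeping.

```python
from typing import NamedTuple

import jax
import jax.numpy as jnp
import optax


class ScaleByEveState(NamedTuple):
  """State for the hypothetical scale_by_eve transformation."""
  d: jax.Array       # smoothed global scaling factor d_t
  f_prev: jax.Array  # objective value from the previous step
  count: jax.Array   # step counter


def scale_by_eve(beta3: float = 0.999, c: float = 10.0) -> optax.GradientTransformation:
  """Rescales updates by 1/d_t, where d_t tracks the relative change of the loss."""

  def init_fn(params):
    del params
    return ScaleByEveState(
        d=jnp.ones([]),
        f_prev=jnp.zeros([]),
        count=jnp.zeros([], jnp.int32),
    )

  def update_fn(updates, state, params=None, *, loss):
    del params

    def first_step(_):
      # No previous loss yet: leave the updates unscaled (d_1 = 1).
      return jnp.ones([]), loss

    def later_step(_):
      # Relative change of the objective, clipped to [1/c, c] as in the paper,
      # then smoothed with beta3 (simplified w.r.t. the paper's f_hat tracking).
      ratio = jnp.abs(loss - state.f_prev) / jnp.minimum(loss, state.f_prev)
      ratio = jnp.clip(ratio, 1.0 / c, c)
      return beta3 * state.d + (1.0 - beta3) * ratio, loss

    d, f_prev = jax.lax.cond(state.count == 0, first_step, later_step, None)
    updates = jax.tree_util.tree_map(lambda u: u / d, updates)
    return updates, ScaleByEveState(d=d, f_prev=f_prev, count=state.count + 1)

  return optax.GradientTransformation(init_fn, update_fn)
```

Chaining this after optax.scale_by_adam and a negative learning-rate scale would recover the Adam-based variant from the paper; in an actual contribution the loss would need to be threaded through optax's extra-argument support so the transform remains compatible with optax.chain.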
Apologies for the long delay. Would you mind moving this to contrib/?
Also, would you consider implementing the wrapper version instead?
happy to help out with the wrapper version if there's still interest!
Hello,
Sorry for the delay. I am not working on this anymore, so feel free to close the PR or take up the addition, whichever you prefer.
Thanks, Wesley
@amosyou: feel free to open a new PR if you want to take this over!