
Adds Eve Optimizer

wglao opened this issue 2 years ago • 4 comments

The Eve optimizer was proposed by Koushik and Hayashi (https://arxiv.org/abs/1611.01505) as a simple framework applicable to any adaptive gradient optimizer, though the paper applies it specifically to Adam. That Adam-based algorithm is what this fork implements. However, there is room for future improvement: Eve could instead be implemented as a wrapper that adds its global learning-rate scaling as a chainable scale_by transformation on top of an arbitrary optimizer, as sketched below.
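
For concreteness, here is a minimal sketch of what such a chainable transformation could look like. This is not the code from this PR: the name `scale_by_eve`, the state fields, and the hyperparameters `b3` and `c` are assumptions loosely following Algorithm 2 of the paper, and the loss is fed in through optax's extra-args mechanism (the same `value` keyword convention used by `optax.contrib.reduce_on_plateau`), which assumes a recent optax version where `optax.chain` forwards extra args.

```python
from typing import NamedTuple

import jax
import jax.numpy as jnp
import optax


class ScaleByEveState(NamedTuple):
  f_prev: jax.Array  # objective value observed at the previous step
  d: jax.Array       # smoothed relative change of the objective
  count: jax.Array   # step counter


def scale_by_eve(b3: float = 0.999, c: float = 10.0):
  """Scales updates by 1/d_t, Eve's global feedback term (sketch)."""

  def init_fn(params):
    del params
    return ScaleByEveState(
        f_prev=jnp.zeros([]),
        d=jnp.ones([]),
        count=jnp.zeros([], jnp.int32),
    )

  def update_fn(updates, state, params=None, *, value, **extra_args):
    del params, extra_args
    # Relative change of the objective since the last step, clipped to
    # [1/c, c]. The paper assumes a lower-bounded objective; the small
    # epsilon guards against division by zero in this sketch.
    rel = jnp.abs(value - state.f_prev) / (
        jnp.minimum(value, state.f_prev) + 1e-12)
    rel = jnp.clip(rel, 1.0 / c, c)
    # Exponentially smooth d; on the very first step leave d at 1 so the
    # wrapped optimizer starts out unscaled.
    d = jnp.where(state.count > 0, b3 * state.d + (1.0 - b3) * rel, 1.0)
    updates = jax.tree_util.tree_map(lambda u: u / d, updates)
    return updates, ScaleByEveState(
        f_prev=value, d=d, count=state.count + 1)

  return optax.GradientTransformationExtraArgs(init_fn, update_fn)
```

Chained after `optax.scale_by_adam`, this mirrors the Adam-based variant from the paper; `params`, `grads`, and `loss` below are placeholders, and the current loss must be passed to `update()` via the `value` keyword:

```python
optimizer = optax.chain(
    optax.scale_by_adam(),
    scale_by_eve(),
    optax.scale(-1e-3),  # learning rate (note the sign flip)
)
opt_state = optimizer.init(params)
updates, opt_state = optimizer.update(grads, opt_state, params, value=loss)
params = optax.apply_updates(params, updates)
```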

wglao · Jan 24 '23

Apologies for the long delay. Would you mind moving this to contrib/? Also, would you consider implementing the wrapper version instead?

mtthss · Oct 10 '23

Happy to help out with the wrapper version if there's still interest!

amosyou · Feb 03 '24

Hello,

Sorry for the delay. I am not working on this anymore, so feel free to close the PR or take up the addition, whichever you prefer.

Thanks, Wesley

wglao · Feb 03 '24

@amosyou: feel free to open a new PR if you want to take this over!

fabianp · Feb 03 '24