Add AdaMod optimizer to keras-contrib
- What I did: I copied the Adam optimizer from the main Keras repo and modified it according to the AdaMod paper.
- How I did it: Mainly, I added an exponential moving average of the adaptive learning rate (the step size obtained after dividing by the second-moment term), controlled by a new `beta_3` parameter. This moving average is then used as an upper bound on the adaptive learning rate. The first- and second-moment bias corrections had to be separated into two statements because, per the AdaMod paper, the first-moment term is applied only after this upper bounding. See the AdaMod paper for further details, and the sketch after this list for the resulting update step.
- How you can verify it: I added a unit test following the same pattern as the other unit tests in the optimizers directory; a sketch of that pattern appears below as well.
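For reviewers, here is a minimal NumPy sketch of the per-parameter AdaMod update step described above. The real implementation expresses the same arithmetic with Keras backend ops inside `get_updates` (since it is derived from the Adam optimizer in the main Keras repo); the function name, signature, and default hyperparameters here are illustrative only:

```python
import numpy as np


def adamod_update(param, grad, m, v, s, t,
                  lr=0.001, beta_1=0.9, beta_2=0.999, beta_3=0.999,
                  epsilon=1e-8):
    """One AdaMod step for a single parameter tensor (NumPy sketch).

    m, v: first/second moment accumulators, as in Adam
    s:    exponential average of the adaptive learning rate (new in AdaMod)
    t:    1-based step count
    """
    # Standard Adam moment updates.
    m = beta_1 * m + (1.0 - beta_1) * grad
    v = beta_2 * v + (1.0 - beta_2) * np.square(grad)

    # The two bias corrections are applied in separate statements:
    # the second-moment correction feeds the per-parameter step size,
    # while the first-moment correction happens after the bounding.
    v_hat = v / (1.0 - beta_2 ** t)

    # Adaptive per-parameter learning rate, as in Adam.
    eta = lr / (np.sqrt(v_hat) + epsilon)

    # New in AdaMod: exponentially average eta with beta_3 and use the
    # average as an upper bound on the current adaptive learning rate.
    s = beta_3 * s + (1.0 - beta_3) * eta
    eta = np.minimum(eta, s)

    # First-moment bias correction, applied after the upper bounding.
    m_hat = m / (1.0 - beta_1 ** t)
    param = param - eta * m_hat

    return param, m, v, s
```

Separating the bias corrections is what makes the bounding well defined: `eta` must be the fully formed adaptive learning rate before it is averaged and clipped, and only then is the bias-corrected first moment applied.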
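As a hedged sketch of the verification step, assuming keras-contrib's shared `_test_optimizer` helper and that the new class is exported as `AdaMod` (both are assumptions; check the actual diff for the exact names):

```python
# Hypothetical test sketch -- helper and export names are assumptions.
from keras_contrib.tests import optimizers
from keras_contrib.optimizers import AdaMod  # assumed export name


def test_adamod():
    # _test_optimizer trains a small model on toy data and checks it
    # converges, matching the other optimizer tests in this repo.
    optimizers._test_optimizer(AdaMod())
```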
This pull request fixes #531