adam-optimizer topic

List adam-optimizer repositories

gradient-descent

5 stars, 2 forks

A research project on enhancing gradient optimization methods

Crowded-Valley---Results

174 stars, 19 forks

This repository contains the results for the paper: "Descending through a Crowded Valley - Benchmarking Deep Learning Optimizers"

CS231n

50 stars, 9 forks

PyTorch/TensorFlow solutions for Stanford's CS231n: "Convolutional Neural Networks for Visual Recognition"

RAdam

2.5k stars, 338 forks

On the Variance of the Adaptive Learning Rate and Beyond
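The repository provides its own PyTorch implementation of rectified Adam; as a hedged illustration only, recent PyTorch releases (1.10+) also ship a built-in torch.optim.RAdam that follows the same paper and drops into a standard training loop. The model, data, and loss below are placeholders, not taken from the repository.

import torch
import torch.nn as nn

# Placeholder model and data; any nn.Module works the same way.
model = nn.Linear(10, 1)
loss_fn = nn.MSELoss()
x, y = torch.randn(32, 10), torch.randn(32, 1)

# torch.optim.RAdam implements the rectified Adam update from
# "On the Variance of the Adaptive Learning Rate and Beyond".
optimizer = torch.optim.RAdam(model.parameters(), lr=1e-3, betas=(0.9, 0.999))

for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()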

deepnet

319 stars, 83 forks

Educational deep learning library in plain NumPy.
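For reference, the Adam update that educational libraries like this implement fits in a few lines of plain NumPy. This is a generic sketch of the textbook rule (Kingma & Ba), not code from the deepnet repository.

import numpy as np

def adam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: exponential moment estimates plus bias correction."""
    m = beta1 * m + (1 - beta1) * grad           # first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * grad ** 2      # second moment (uncentered variance)
    m_hat = m / (1 - beta1 ** t)                 # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)                 # bias-corrected second moment
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v

# Usage: minimize f(x) = x^2 from x = 5.
x = np.array([5.0])
m, v = np.zeros_like(x), np.zeros_like(x)
for t in range(1, 1001):
    x, m, v = adam_step(x, 2 * x, m, v, t, lr=0.05)
print(x)  # close to 0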

CS-F425_Deep-Learning

102 stars, 4 forks

CS F425 Deep Learning course at BITS Pilani (Goa Campus)

padam-tensorflow

51 stars, 7 forks

Reproducing the paper "PADAM: Closing The Generalization Gap of Adaptive Gradient Methods In Training Deep Neural Networks" for the ICLR 2019 Reproducibility Challenge
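Padam's central change, as described in the paper, is replacing Adam's square-root denominator with a partially adaptive power v_hat**p for p in (0, 1/2]. The NumPy sketch below illustrates just that change under simplifying assumptions (it omits the AMSGrad-style running max the paper uses) and is not code from this TensorFlow repository.

import numpy as np

def padam_step(param, grad, m, v, t, lr=0.1, beta1=0.9, beta2=0.999, p=0.125, eps=1e-8):
    """One simplified Padam update: Adam's moments, but the denominator is v_hat**p."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # p = 1/2 recovers (simplified) Adam; smaller p behaves more like SGD with momentum.
    param = param - lr * m_hat / (v_hat ** p + eps)
    return param, m, v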

SVHN-CNN

62 stars, 33 forks

Classifying the Google Street View House Numbers (SVHN) dataset with a CNN

AdasOptimizer

85 stars, 11 forks

ADAS is short for Adaptive Step Size. Unlike optimizers that just normalize the derivative, it fine-tunes the step size, truly making step size scheduling obsolete, achiev...

Hypergradient_variants

16 stars, 0 forks

Improved Hypergradient optimizers, providing better generalization and faster convergence.
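For context, the baseline these variants build on is hypergradient descent (Baydin et al.), which adapts the learning rate online using the dot product of consecutive gradients. The SGD-HD sketch below is a generic illustration of that idea, not code from this repository.

import numpy as np

def sgd_hd(grad_fn, x, alpha=0.01, beta=1e-4, steps=200):
    """SGD with hypergradient descent on the learning rate:
    alpha is nudged by the dot product of the current and previous gradients."""
    g_prev = np.zeros_like(x)
    for _ in range(steps):
        g = grad_fn(x)
        alpha = alpha + beta * np.dot(g, g_prev)   # hypergradient update of the step size
        x = x - alpha * g
        g_prev = g
    return x, alpha

# Usage: minimize a simple quadratic f(x) = x^2 summed over coordinates.
x_opt, alpha_final = sgd_hd(lambda x: 2 * x, np.array([3.0, -2.0]))
print(x_opt, alpha_final)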