

optimizer-visualization

Visualize gradient descent optimization algorithms in TensorFlow.

All methods start at the same location, specified by two variables. Both the x and y variables are updated by each of the following optimizers (a setup sketch follows the list):

Adadelta documentation

Adagrad documentation

Adam documentation

Ftrl documentation

GD documentation

Momentum documentation

RMSProp documentation
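
As a rough illustration of this setup, the sketch below starts each optimizer from the same two-variable location and minimizes the same loss with the TensorFlow 1.x `tf.train` API. The loss surface, starting point, step count, and learning rates here are placeholder assumptions for demonstration, not the values used in the figures.

```python
import tensorflow as tf  # TensorFlow 1.x `tf.train` API

def build(make_optimizer):
    # Every method starts at the same location, given by the two variables.
    x = tf.Variable(0.75, dtype=tf.float32)
    y = tf.Variable(1.0, dtype=tf.float32)
    # Illustrative non-convex loss surface (an assumption, not the repo's).
    loss = tf.sin(3.0 * x) * tf.cos(3.0 * y) + 0.1 * (tf.square(x) + tf.square(y))
    train_op = make_optimizer().minimize(loss, var_list=[x, y])
    return x, y, train_op

# Placeholder learning rates; in the figures each optimizer has its own.
builders = {
    'Adadelta': lambda: tf.train.AdadeltaOptimizer(learning_rate=50.0),
    'Adagrad':  lambda: tf.train.AdagradOptimizer(learning_rate=0.10),
    'Adam':     lambda: tf.train.AdamOptimizer(learning_rate=0.05),
    'Ftrl':     lambda: tf.train.FtrlOptimizer(learning_rate=0.10),
    'GD':       lambda: tf.train.GradientDescentOptimizer(learning_rate=0.05),
    'Momentum': lambda: tf.train.MomentumOptimizer(learning_rate=0.01, momentum=0.9),
    'RMSProp':  lambda: tf.train.RMSPropOptimizer(learning_rate=0.02),
}

paths = {}
for name, make_optimizer in builders.items():
    tf.reset_default_graph()
    x, y, train_op = build(make_optimizer)
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        path = [sess.run([x, y])]
        for _ in range(100):
            sess.run(train_op)
            path.append(sess.run([x, y]))
        paths[name] = path  # (x, y) trajectory, one point per step
```

Each trajectory in `paths` can then be drawn over a contour plot of the loss to produce frames like the GIFs below.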

For an overview of these gradient descent optimization algorithms, visit this helpful resource.

Numbers in the figure legend indicate the learning rate used for each optimizer.

Note the optimizers' behavior when the gradient is steep.

Note the optimizers' behavior when the initial gradient is minuscule.
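
The contrast is easiest to see at the very first update step. In the toy numpy sketch below (an illustration, not code from this repo; the learning rate and gradient value are arbitrary), plain gradient descent takes a step proportional to a minuscule gradient and so barely moves, while Adam's per-parameter normalization yields a near full-sized step.

```python
import numpy as np

def gd_step(grad, lr=0.05):
    # Plain gradient descent: step size is proportional to the gradient.
    return -lr * grad

def adam_first_step(grad, lr=0.05, eps=1e-8):
    # On Adam's first step the bias corrections cancel, leaving
    # m_hat = grad and v_hat = grad**2, so the update is roughly
    # -lr * sign(grad) regardless of the gradient's magnitude.
    m_hat, v_hat = grad, grad * grad
    return -lr * m_hat / (np.sqrt(v_hat) + eps)

tiny_grad = 1e-6
print(gd_step(tiny_grad))          # ~ -5e-08: almost no progress
print(adam_first_step(tiny_grad))  # ~ -0.05: a full learning-rate step
```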

Inspired by the following GIFs:

From here