
Adaptive Gradient Clipping

Results: 6 autoclip issues

Has there been any research on how this strategy interacts with a learning rate schedule? Especially for something extreme like the one-cycle policy (super convergence). It seems like the history...

The `gradient_transformers` argument was removed from TensorFlow, rendering the algorithm unusable. Is there any alternative?
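Until a replacement hook exists, the clipping itself is straightforward to reimplement by hand: AutoClip scales gradients so their global norm stays below a chosen percentile of the norms observed so far. A framework-agnostic sketch in plain Python (flat lists of floats stand in for gradient tensors; the class and function names are illustrative, not from the repo):

```python
import math

def global_norm(grads):
    """L2 norm over all gradient values; grads is a list of flat float lists."""
    return math.sqrt(sum(g * g for grad in grads for g in grad))

def percentile(values, p):
    """Linear-interpolation percentile (same convention as numpy's default)."""
    xs = sorted(values)
    k = (len(xs) - 1) * p / 100.0
    lo, hi = math.floor(k), math.ceil(k)
    if lo == hi:
        return xs[int(k)]
    return xs[lo] * (hi - k) + xs[hi] * (k - lo)

class AutoClip:
    def __init__(self, p=10.0):
        self.p = p          # percentile used as the clip threshold
        self.history = []   # every gradient norm observed so far

    def __call__(self, grads):
        norm = global_norm(grads)
        self.history.append(norm)
        clip = percentile(self.history, self.p)
        # Scale down only when the current norm exceeds the threshold.
        scale = min(1.0, clip / (norm + 1e-12))
        return [[g * scale for g in grad] for grad in grads]
```

In a real training loop this would sit between computing and applying gradients; porting the same arithmetic to your framework's tensor ops is mechanical.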

First off, thank you for the wonderful paper and technique. It solves some needs I have been having with a couple of NLP projects, amongst others. I'm making this pull...

Hello, how are you? Thanks for contributing to this project. Could you provide an example of how to use it in PyTorch? Thanks.
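For what it's worth, here is a minimal PyTorch sketch of the idea (not the authors' code; the class name, the 10th-percentile default, and the use of `clip_grad_norm_` with `max_norm=inf` as a norm probe are my own choices):

```python
import numpy as np
import torch

class AutoClipper:
    """Clip the global gradient norm to a running percentile of past norms."""

    def __init__(self, percentile: float = 10.0):
        self.percentile = percentile
        self.grad_norm_history = []

    def __call__(self, parameters):
        params = [p for p in parameters if p.grad is not None]
        # max_norm=inf turns clip_grad_norm_ into a pure measurement:
        # it returns the current total norm without modifying gradients.
        total_norm = float(torch.nn.utils.clip_grad_norm_(params, float("inf")))
        self.grad_norm_history.append(total_norm)
        # Clip to the p-th percentile of every norm seen so far.
        clip_value = float(np.percentile(self.grad_norm_history, self.percentile))
        torch.nn.utils.clip_grad_norm_(params, clip_value)
        return clip_value
```

Call it between `loss.backward()` and `optimizer.step()`, e.g. `clipper(model.parameters())`.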

Hello, how are you? Thanks for contributing to this project. I made my own AutoClipper class based on your code. ![image](https://user-images.githubusercontent.com/47862419/143223240-53645f83-23e1-4c88-82ce-4cedc3c0cef9.png) Please check whether there are any problems. Here I...

```
    learning_rate=0.001, gradient_transformers=[AutoClipper(10)]
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/optimizer_v2/adam.py", line 115, in __init__
    super(Adam, self).__init__(name, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/optimizer_v2/optimizer_v2.py", line 303, in __init__
    "passed to optimizer: " + str(k))
TypeError: Unexpected keyword argument passed...
```