keras-normalized-optimizers
Not an error
Hello, this wrapper is a good idea.
We should also add the possibility of applying gradient masks during the update call, which would be useful for pruning neural networks.
Best, Tom.
Yes, that is quite possible. I was thinking of adding gradient clipping eventually.
The codebase is quite simple; could you try to add gradient masking?
@tchaton I've refactored the codebase to allow extending a base wrapper called OptimizerWrapper which will handle most of the heavy lifting.
Now, NormalizedOptimizer and ClippedOptimizer extend that, and we can add another class for gradient masking quite easily. The get_gradients call needs to be overridden to hold the gradient masking code, and get_config and from_config can easily be copy-pasted and modified.
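A minimal sketch of what such a gradient-masking wrapper could look like, assuming OptimizerWrapper stores the wrapped optimizer as `self.optimizer` and that subclasses only need to override `get_gradients` as described above. The class name `MaskedOptimizer`, the `masks` argument, and the import path are illustrative, not part of the repository.

```python
# Sketch only: names and import path are assumptions, not the repo's actual code.
from keras import optimizers
from keras import backend as K

from optimizer import OptimizerWrapper  # hypothetical import path


class MaskedOptimizer(OptimizerWrapper):
    def __init__(self, optimizer, masks=None, **kwargs):
        super(MaskedOptimizer, self).__init__(optimizer, **kwargs)
        # `masks` is a list of 0/1 backend tensors (e.g. created with K.constant)
        # matching the shapes of the trainable weights; None disables masking.
        self.masks = masks

    def get_gradients(self, loss, params):
        grads = self.optimizer.get_gradients(loss, params)
        if self.masks is not None:
            # Zero out the gradients of pruned weights so they never update.
            grads = [g * m for g, m in zip(grads, self.masks)]
        return grads

    def get_config(self):
        # Following the copy-paste-and-modify pattern described above; the exact
        # serialization keys in the repository may differ. The mask tensors
        # themselves are not serialized in this sketch.
        return {'optimizer_config': {'class_name': self.optimizer.__class__.__name__,
                                     'config': self.optimizer.get_config()}}

    @classmethod
    def from_config(cls, config):
        optimizer_config = config.pop('optimizer_config')
        optimizer = optimizers.deserialize(optimizer_config)
        return cls(optimizer, **config)
```

Usage would then look something like `model.compile(optimizer=MaskedOptimizer(optimizers.Adam(), masks=masks), loss='categorical_crossentropy')`, with `masks` built to match the model's trainable weights.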
Hey!
Awesome!!! Here is my Keras implementation of pruning: https://github.com/tchaton/Keras_Pruner. Take a look at it if you have time. I have created a custom Conv2D_Masked that is used to set kernel entries to 0: https://github.com/tchaton/Keras_Pruner/blob/master/nvidia_pruning/wrappers.py. It would be nice to have more support for this kind of thing.
I have tried this algorithm on several models and got very interesting results. In an extremely simple case (binary classes with 10k images), it was able to prune a 60k-parameter neural network by 98.5%, leaving only one kernel in the first layer while still allowing perfect classification. It is part of my research on explainable AI.
Best Regards, Thomas Chaton.
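A minimal sketch of the layer-level kernel-masking idea described above, in which the mask lives inside the layer and zeroes out pruned kernel entries in the forward pass. This is an illustration only, not the actual Conv2D_Masked implementation from Keras_Pruner; the class and method names (`MaskedConv2D`, `set_mask`) are assumptions.

```python
# Sketch only: illustrates kernel masking inside a layer, not Keras_Pruner's code.
import numpy as np
from keras import backend as K
from keras.layers import Conv2D


class MaskedConv2D(Conv2D):
    """Conv2D whose kernel is multiplied element-wise by a non-trainable 0/1 mask."""

    def build(self, input_shape):
        super(MaskedConv2D, self).build(input_shape)
        # Non-trainable mask with the same shape as the kernel,
        # initialized to all ones (nothing pruned yet).
        self.mask = K.variable(np.ones(K.int_shape(self.kernel)), name='kernel_mask')

    def set_mask(self, mask_value):
        # Update which kernel entries are pruned (0 = pruned, 1 = kept).
        K.set_value(self.mask, mask_value.astype(np.float32))

    def call(self, inputs):
        # Same forward pass as Conv2D, but using the masked kernel, so pruned
        # weights contribute nothing to the output.
        masked_kernel = self.kernel * self.mask
        outputs = K.conv2d(inputs, masked_kernel,
                           strides=self.strides,
                           padding=self.padding,
                           data_format=self.data_format,
                           dilation_rate=self.dilation_rate)
        if self.use_bias:
            outputs = K.bias_add(outputs, self.bias, data_format=self.data_format)
        if self.activation is not None:
            outputs = self.activation(outputs)
        return outputs
```

Note that in this layer-level sketch the masked weights still receive gradient updates unless the gradients are masked as well, which is where an optimizer-level wrapper like the one above would complement it.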
That looks like some great work, but this wrapper is over an Optimizer, not a Layer, so I don't think its support can be included in a similar manner.
Yeah, I know. I didn't think about doing a wrapper over the optimizer, but it appears to work better using a layer wrapper.
Hey!
I added you on Facebook. If you want to send me your resume, I could forward it for AI research positions ;)
Best, Tom