Make GN and IN match the dtype behavior of BN and LN in mixed_precision
This PR makes GroupNorm and InstanceNorm follow the same dtype behavior as Keras' BatchNorm and LayerNorm under mixed_precision.
Fixes #2550
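For context, the dtype pattern BatchNorm and LayerNorm use under mixed precision is to compute normalization statistics in float32 even when the layer's compute dtype is float16, then cast the result back. Below is a minimal numpy sketch of that pattern applied to a group normalization; the function name and shapes are illustrative, not the actual Keras/Addons implementation:

```python
import numpy as np

def group_norm_mixed_precision(x, groups, eps=1e-5):
    """Illustrative sketch: normalize a (possibly float16) input while
    computing mean/variance in float32, mirroring the mixed-precision
    behavior of Keras BatchNorm/LayerNorm. Not the actual PR code."""
    compute_dtype = x.dtype              # e.g. float16 under mixed_float16
    x32 = x.astype(np.float32)           # upcast for numerically stable stats
    n, c = x32.shape
    g = x32.reshape(n, groups, c // groups)
    mean = g.mean(axis=-1, keepdims=True)
    var = g.var(axis=-1, keepdims=True)
    y = (g - mean) / np.sqrt(var + eps)  # all arithmetic happens in float32
    return y.reshape(n, c).astype(compute_dtype)  # cast back to compute dtype

x = np.random.randn(2, 8).astype(np.float16)
y = group_norm_mixed_precision(x, groups=2)
assert y.dtype == np.float16  # output dtype matches the input's compute dtype
```

The key point is that the layer's output dtype still matches the policy's compute dtype, so downstream layers are unaffected, while the reduction itself avoids float16 overflow/underflow.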
Type of change
- [x] Bug fix
Checklist:
- [x] I've properly formatted my code according to the guidelines
- [x] By running Black + Flake8
- [x] By running pre-commit hooks
- [x] This PR addresses an already submitted issue for TensorFlow Addons
- [x] I have added tests that prove my fix is effective or that my feature works
How Has This Been Tested?
python -m pytest .\tensorflow_addons\layers\tests\normalizations_test.py
Added accuracy tests for the modified GN, using tf.keras.layers.LayerNormalization as the baseline.
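The baseline comparison works because group normalization with a single group normalizes over the whole feature axis, which is exactly what LayerNormalization does (ignoring the learned scale/offset). A minimal numpy sketch of that equivalence, with illustrative helper names rather than the test suite's actual code:

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    # Baseline: normalize each sample over its feature axis.
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def group_norm(x, groups, eps=1e-5):
    # Split channels into groups and normalize within each group.
    n, c = x.shape
    g = x.reshape(n, groups, c // groups)
    mean = g.mean(axis=-1, keepdims=True)
    var = g.var(axis=-1, keepdims=True)
    return ((g - mean) / np.sqrt(var + eps)).reshape(n, c)

x = np.random.randn(4, 16).astype(np.float32)
# With groups=1, group norm reduces to layer norm, so LN is a valid baseline.
assert np.allclose(group_norm(x, groups=1), layer_norm(x), atol=1e-5)
```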
@smokrow
You are the owner of some files modified in this pull request. Would you kindly review the changes whenever you have time? Thank you very much.
@shkarupa-alex Are you still interested in this?
Yes, I'm interested. This PR https://github.com/tensorflow/tensorflow/pull/52217 fixes my previous issue https://github.com/tensorflow/addons/issues/2550
I'm just waiting for it to be merged.
But it would be nice to have a workaround in the meantime.
Thank you for your contribution. We sincerely apologize for any delay in reviewing, but TensorFlow Addons is transitioning to a minimal maintenance and release mode. New features will not be added to this repository. For more information, please see our public messaging on this decision: TensorFlow Addons Wind Down
Please consider sending feature requests / contributions to other repositories in the TF community with charters similar to TFA's: Keras, Keras-CV, Keras-NLP