
Do not use reduce_sum before returning to loss wrapper.

Open manuelbre opened this issue 4 years ago • 2 comments

Description

Brief Description of the PR:

Handle reduction in the loss wrapper, not in this function. Currently, even when using `tfa.losses.SigmoidFocalCrossEntropy(reduction=tf.keras.losses.Reduction.NONE)`, the loss is still reduced by summing over the last axis. I would expect `tfa.losses.SigmoidFocalCrossEntropy(reduction=tf.keras.losses.Reduction.NONE)` to return a loss with the same shape as `y_pred`, which is currently not the case.
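The shape issue can be illustrated with a minimal NumPy sketch of the per-element sigmoid focal loss (this is a simplified stand-in, not the actual tfa implementation): the extra `reduce_sum` over the last axis is what prevents the elementwise shape from surviving when `Reduction.NONE` is requested.

```python
import numpy as np

def sigmoid_focal_loss_elementwise(y_true, y_pred, alpha=0.25, gamma=2.0):
    """Per-element sigmoid focal cross-entropy (NumPy sketch, not the tfa code).

    Returns a loss tensor with the same shape as y_pred.
    """
    eps = 1e-7
    p = np.clip(y_pred, eps, 1.0 - eps)
    # Binary cross-entropy per element.
    ce = -(y_true * np.log(p) + (1.0 - y_true) * np.log(1.0 - p))
    # Focal modulation and alpha balancing.
    p_t = y_true * p + (1.0 - y_true) * (1.0 - p)
    alpha_t = y_true * alpha + (1.0 - y_true) * (1.0 - alpha)
    return alpha_t * (1.0 - p_t) ** gamma * ce

y_true = np.array([[1.0, 0.0, 1.0], [0.0, 1.0, 0.0]])
y_pred = np.array([[0.9, 0.1, 0.8], [0.2, 0.7, 0.3]])

elementwise = sigmoid_focal_loss_elementwise(y_true, y_pred)
print(elementwise.shape)               # (2, 3) -- same shape as y_pred, as expected for NONE
print(np.sum(elementwise, -1).shape)   # (2,)   -- summing the last axis collapses the shape
```

With `Reduction.NONE` the caller should receive the `(2, 3)` elementwise result and decide on any reduction themselves; summing inside the loss function hard-codes the `(2,)` shape instead.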

Type of change

Checklist:

  • [x] I've properly formatted my code according to the guidelines
    • [ ] By running Black + Flake8
    • [ ] By running pre-commit hooks
  • [ ] This PR addresses an already submitted issue for TensorFlow Addons
  • [ ] I have made corresponding changes to the documentation
  • [ ] I have added tests that prove my fix is effective or that my feature works
  • [ ] This PR contains modifications to C++ custom-ops

How Has This Been Tested?

If you're adding a bugfix or new feature, please describe the tests that you ran to verify your changes:

manuelbre avatar Jul 30 '20 12:07 manuelbre

Thanks for your pull request. It looks like this may be your first contribution to a Google open source project (if not, look below for help). Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA).

:memo: Please visit https://cla.developers.google.com/ to sign.

Once you've signed (or fixed any issues), please reply here with @googlebot I signed it! and we'll verify it.


What to do if you already signed the CLA

Individual signers
Corporate signers

ℹ️ Googlers: Go here for more info.

googlebot avatar Jul 30 '20 12:07 googlebot

@aakashkumarnain @ssaishruthi

You are owners of some files modified in this pull request. Would you kindly review the changes whenever you have the time? Thank you very much.