deep-learning-with-python-notebooks
Use of K.update_add leads to NoneType in K.gradients
I tried the code related to Deep Dream and ran into this warning:
WARNING:tensorflow:Variable += will be deprecated. Use variable.assign_add if you want assignment to the variable value or 'x = x + y' if you want a new python Tensor object.
This warning is triggered by this line: `loss += coeff * K.sum(K.square(activation[:, 2: -2, 2: -2, :])) / scaling`
So I tried to use `K.update_add()` instead. But after that, `grads = K.gradients(loss, dream)[0]` gives NoneType. From searching around, it seems a non-differentiable loss can cause this, so I suspect `K.update_add()` is somehow responsible. When I switch back to `+=`, `K.gradients()` returns the correct thing.
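For what it's worth, here is a minimal sketch of the two patterns as I understand them (assuming Keras 2.x with the TensorFlow 1.x graph backend):

```python
from keras import backend as K

x = K.placeholder(shape=(3,))    # stand-in for `dream`
loss_var = K.variable(0.)

# Pattern 1: building the loss as a plain tensor keeps a graph path back
# to x, so the gradient exists.
loss = loss_var + K.sum(K.square(x))
print(K.gradients(loss, x))      # [<gradient tensor>]

# Pattern 2: K.update_add mutates the variable through an assign op;
# reading the variable afterwards has no graph dependence on x, so the
# gradient comes back as None.
K.update_add(loss_var, K.sum(K.square(x)))
print(K.gradients(loss_var, x))  # [None]
```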
Is this a bug in Keras?
Hey
I would suggest assigning `coeff * K.sum(K.square(activation[:, 2: -2, 2: -2, :])) / scaling` to a variable first and then writing the update as `loss += variable`.
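Something like this (just a readability refactor, assuming the notebook's `coeff`, `activation`, and `scaling` are in scope):

```python
# Readability refactor only; same semantics as the inline +=.
# `coeff`, `activation`, and `scaling` are assumed from the notebook.
contribution = coeff * K.sum(K.square(activation[:, 2:-2, 2:-2, :])) / scaling
loss += contribution
```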
Best, Eric
@EricLee0000
This is great for code readability and maintenance, but does it solve the real issue? This code was taken as-is from Chollet's book.
The `+=` operator has since been deprecated, so using the code as-is now produces an error rather than just a warning. It works if you use the `x = x + y` form suggested in the error/warning message, specifically:
`loss = loss + coeff * K.sum(K.square(activation[:, 2: -2, 2: -2, :])) / scaling`
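For context, here is how that fix slots into the notebook's loss-building loop (a sketch assuming the notebook's `layer_contributions` dict and `layer_dict` mapping are in scope):

```python
# Sketch of the Deep Dream loss construction with the fix applied.
# `layer_contributions` (layer name -> weight) and `layer_dict`
# (layer name -> layer) are assumed from the notebook.
loss = K.variable(0.)
for layer_name in layer_contributions:
    coeff = layer_contributions[layer_name]
    activation = layer_dict[layer_name].output
    # Normalize by the number of units in the activation tensor.
    scaling = K.prod(K.cast(K.shape(activation), 'float32'))
    # `loss = loss + ...` keeps `loss` an ordinary graph tensor, so
    # K.gradients(loss, dream) can backpropagate through it.
    loss = loss + coeff * K.sum(K.square(activation[:, 2:-2, 2:-2, :])) / scaling
```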
Note that there is a subtle semantic distinction between `x += y` and `x = x + y`, namely that the former updates the existing object assigned to `x` in place, whereas the latter creates a new object and assigns it to `x`.
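A quick plain-Python illustration of that distinction, using NumPy arrays as a stand-in:

```python
import numpy as np

a = np.zeros(3)
alias = a
a += 1            # in-place update: the alias sees the change
print(alias)      # [1. 1. 1.]

a = np.zeros(3)
alias = a
a = a + 1         # new object bound to `a`: the alias is untouched
print(alias)      # [0. 0. 0.]
```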