K.gradients: NotImplementedError
OK, I'm looking at mxnet_backend.py, and especially at how gradients are calculated, since I'm developing a custom optimizer. However, K.gradients does not implement any kind of call into mx:
def gradients(loss, variables):
    """Returns the gradients of `variables` (list of tensor variables)
    with regard to `loss`.
    """
    raise NotImplementedError
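For comparison, and if I'm reading the code right, the TensorFlow backend simply delegates this to tf.gradients, roughly like this (sketch from memory, exact keyword arguments may differ between versions):

import tensorflow as tf

def gradients(loss, variables):
    """Returns the gradients of `variables` with regard to `loss`
    by delegating to TensorFlow's symbolic differentiation."""
    return tf.gradients(loss, variables)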
So my question is: how are gradients actually calculated in a call like
grads = K.gradients(loss, params)
inside my optimizer? A sketch of what I mean is below.
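To be concrete, here is a minimal sketch of the kind of custom optimizer I mean. MyOptimizer is just a placeholder name, and I'm assuming the stock Keras 2 Optimizer API with get_updates(loss, params):

from keras import backend as K
from keras.optimizers import Optimizer

class MyOptimizer(Optimizer):
    """Toy optimizer: plain gradient descent, just to show where
    the K.gradients call happens."""

    def __init__(self, lr=0.01, **kwargs):
        super(MyOptimizer, self).__init__(**kwargs)
        self.lr = K.variable(lr, name='lr')

    def get_updates(self, loss, params):
        # This is the call that ends up in the backend's gradients(),
        # which in mxnet_backend.py just raises NotImplementedError.
        grads = K.gradients(loss, params)
        self.updates = [K.update_sub(p, self.lr * g)
                        for p, g in zip(params, grads)]
        return self.updates

As far as I can tell, the built-in SGD ends up in the same place via self.get_gradients, which wraps K.gradients.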
I mean, how the heck does even SGD work with the mx backend?
Thanks