
Derivative of output neurons w.r.t. input neurons in the loss

MaAl13 opened this issue 2 years ago · 3 comments

Hello, I want to define a custom loss function for a fully connected NN. In this loss I want to include the derivative of some output neurons with respect to some input neurons. Is this possible in any way?

MaAl13 · Aug 27 '22 07:08

Are you looking for the derivative or the Jacobian? Nesting AD is an option, so you can call ForwardDiff inside the loss.

DhairyaLGandhi · Aug 27 '22 10:08
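
A minimal sketch of the nesting idea, not from the thread: the toy network, the random data, and the Jacobian penalty in `loss` are made-up placeholders.

```julia
using Flux, ForwardDiff

# Toy fully connected network: 3 input neurons -> 2 output neurons.
model = Chain(Dense(3 => 4, tanh), Dense(4 => 2))

function loss(x, y)
    # Inner, forward-mode AD: J[i, j] = ∂output_i/∂input_j at x.
    J = ForwardDiff.jacobian(model, x)
    # Made-up loss: a data term plus a penalty on input sensitivities.
    Flux.Losses.mse(model(x), y) + sum(abs2, J)
end

x, y = rand(Float32, 3), rand(Float32, 2)
# Outer, reverse-mode AD (Zygote) over the model parameters.
gs = Flux.gradient(() -> loss(x, y), Flux.params(model))
```

One caveat: Zygote (Flux's default AD) has warned, at least in versions from around this time, that `ForwardDiff.jacobian(f, x)` called inside a Zygote gradient cannot track gradients with respect to parameters closed over by `f` (here, the model weights), so whether the penalty term actually contributes to the parameter gradient should be checked on your Flux/Zygote versions.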

Thank you so much for the response! I think at the moment it's only the derivative. So how would that look with ForwardDiff? How can I pass a specific input neuron to ForwardDiff?

MaAl13 · Aug 27 '22 13:08
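
A minimal sketch of one way to single out a specific input neuron, assuming the same toy setup as above: differentiate along a one-hot direction with `ForwardDiff.derivative`, or take one entry of the full Jacobian. The indices `i` and `j` are made-up placeholders.

```julia
using Flux, ForwardDiff

model = Chain(Dense(3 => 4, tanh), Dense(4 => 2))  # 3 inputs, 2 outputs
x = rand(Float32, 3)

# Derivative of output neuron i w.r.t. input neuron j, evaluated at x:
# differentiate t -> model(x + t * e_j)[i] at t = 0, where e_j is the
# j-th standard basis vector.
i, j = 2, 1
ej = zeros(Float32, length(x)); ej[j] = 1
dy_dx = ForwardDiff.derivative(t -> model(x .+ t .* ej)[i], 0f0)

# Equivalently, take the (i, j) entry of the full input-output Jacobian:
J = ForwardDiff.jacobian(model, x)   # 2×3 matrix
@assert J[i, j] ≈ dy_dx
```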

If you could provide the actual formulation of the loss function (or, much better, a working example in another language), we'd have something to work with. This type of question, however, should probably be asked on Discourse.

ToucheSir · Sep 07 '22 20:09