Flux.jl
Derivative of output neurons with respect to input neurons in the loss
Hello, I want to define a custom loss function for a fully connected NN. In this loss I want to include the derivative of some output neurons with respect to some input neurons. Is this possible in any way?
Are you looking for the derivative or the Jacobian? Nesting AD is an option, so you can call ForwardDiff inside the loss.
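A minimal sketch of that suggestion, assuming a small fully connected model; the layer sizes, the indices `j`/`i`, the penalty weight `λ`, and the helper name `doutdin` are all placeholders, not part of the original question. It computes the inner derivative with ForwardDiff and, for simplicity, nests ForwardDiff over ForwardDiff via `Flux.destructure` to get parameter gradients; reverse-over-forward nesting would usually be faster for larger models but needs more care.

```julia
using Flux, ForwardDiff

# Hypothetical fully connected network: 3 inputs, 2 outputs.
model = Chain(Dense(3 => 8, tanh), Dense(8 => 2))
flat, re = Flux.destructure(model)   # flat parameter vector + reconstructor

# Derivative of output neuron j with respect to input neuron i:
# take the ForwardDiff gradient over the whole input, then index it.
function doutdin(p, x, j, i)
    g = ForwardDiff.gradient(z -> re(p)(z)[j], x)
    return g[i]
end

# Example loss: squared error plus a penalty on d(output 1)/d(input 2).
# λ, the indices, and the penalty form are made up for this sketch.
loss(p, x, y; λ = 0.1f0) = sum(abs2, re(p)(x) .- y) + λ * doutdin(p, x, 1, 2)^2

x, y = rand(Float32, 3), rand(Float32, 2)
g = ForwardDiff.gradient(p -> loss(p, x, y), flat)   # forward-over-forward nesting
```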
Thank you so much for the response! For now I think it is only the derivative. So how would that look with ForwardDiff? How can I pass the explicit input neuron to ForwardDiff?
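To single out one input neuron, you can either take the gradient over the whole input and index it (as in `doutdin` above), or differentiate along a one-hot direction so that only that coordinate is perturbed. A hypothetical sketch of the second option, with the model and indices again as placeholders:

```julia
using Flux, ForwardDiff

model = Chain(Dense(3 => 8, tanh), Dense(8 => 2))

# d(output j)/d(input i) as a scalar directional derivative:
# only x[i] is perturbed, via the one-hot direction e.
function dydxi(m, x, j, i)
    e = [k == i ? one(eltype(x)) : zero(eltype(x)) for k in eachindex(x)]
    return ForwardDiff.derivative(t -> m(x .+ t .* e)[j], zero(eltype(x)))
end

x = rand(Float32, 3)
dydxi(model, x, 1, 2)   # derivative of output 1 w.r.t. input 2 at x
```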
If you could provide the actual formulation of the loss function (or, much better, a working example in another language), we'd have something to work with. This type of question, however, should probably be asked on Discourse.