ReverseDiff.jl
Potential bug: `all_results[1].value != fun(x)`
I have run into what I believe is a bug in ReverseDiff.
I have made a gist with a program that reproduces the strange behavior, along with the program's output. I'm simply trying to fit a one-layer neural network to some data; the function I differentiate is `lossfun`. When running the program, the value of `all_results[1].value`, which I assume should be equal to `lossfun(x)`, starts out matching it but diverges from `lossfun(x)` after a while.
Program that reproduces the bug, along with its output: https://gist.github.com/baggepinnen/5c413f9ca12d5853672fd4e2d8dbdaea
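For reference, here is a minimal sketch of the invariant I would expect to hold, using the `DiffResults` API that `ReverseDiff.gradient!` supports. The function `f` here is a simple stand-in for `lossfun`, not the actual loss from the gist:

```julia
using ReverseDiff, DiffResults

# Stand-in for lossfun (hypothetical; the real loss is in the gist).
f(x) = sum(abs2, x)

x = randn(5)

# Allocate a result buffer holding both the primal value and the gradient.
res = DiffResults.GradientResult(x)
ReverseDiff.gradient!(res, f, x)

# The recorded primal should match a direct evaluation of the function,
# and the gradient of sum(abs2, x) is 2x.
DiffResults.value(res) ≈ f(x)
DiffResults.gradient(res) ≈ 2 .* x
```

In my program the analogous comparison (`all_results[1].value` against `lossfun` evaluated at the current parameters) is the one that drifts apart over iterations.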