cseveren

8 comments by cseveren

Useful reference https://discourse.julialang.org/t/gradient-rise-was-obtained-by-optim-jl-package-optimization/69548/11

This happens even with the default option for these solvers of `allow_f_increases = false`. When `allow_f_increases = true` is set instead, the divergence away from the local minimum continues and takes much longer...
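
For reference, a minimal sketch of how that option is passed via `Optim.Options`, using a hypothetical Rosenbrock-style stand-in for the real, much more expensive objective from this thread:

```julia
using Optim

# Hypothetical stand-in objective; the real one in this thread is far more expensive.
rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
x0 = zeros(2)

# Default behavior: the run stops early if the objective increases between iterations.
res_default = optimize(rosenbrock, x0, LBFGS())

# allow_f_increases = true lets the solver keep iterating even when f goes up,
# which in the case described above let it keep moving away from the local minimum.
res_allow = optimize(rosenbrock, x0, LBFGS(),
                     Optim.Options(allow_f_increases = true))
```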

Thanks John -- Time for each iteration goes from ~40 secs to ~1200 secs. I imagine it's 1. and not 2., but I'm not sure how to differentiate between them. I have...

Thanks much for the help. Here's the output from the REPL showing the trace at each iteration, running with `allow_f_increases = true` and `@time` for timekeeping, starting from an initial point...
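
A minimal sketch of that kind of call, with a hypothetical cheap objective in place of the real one: `show_trace = true` prints one line per iteration and `@time` wraps the whole call.

```julia
using Optim

# Hypothetical cheap objective standing in for the real one.
f(x) = sum(abs2, x .- 1.0)
x0 = zeros(5)

opts = Optim.Options(allow_f_increases = true,
                     show_trace = true,   # print iteration, f(x), gradient norm as it runs
                     show_every = 1)

@time res = optimize(f, x0, LBFGS(), opts)
```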

Using the same inputs as the prior example, I started the optimization routine twice, stopping once with `maxiter=11` and once with `maxiter=12`, hoping to capture a big difference in the number of `f(x)` calls...
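
A sketch of that comparison with a hypothetical objective (note that in `Optim.Options` the iteration cap is the `iterations` keyword, referred to as `maxiter` above):

```julia
using Optim

# Hypothetical objective in place of the real one from the thread.
rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
x0 = zeros(2)

# Cap the two runs at 11 and 12 iterations and compare objective-call counts.
res11 = optimize(rosenbrock, x0, LBFGS(), Optim.Options(iterations = 11))
res12 = optimize(rosenbrock, x0, LBFGS(), Optim.Options(iterations = 12))

println("f calls through iter 11: ", Optim.f_calls(res11))
println("f calls through iter 12: ", Optim.f_calls(res12))
println("extra f calls in iteration 12: ",
        Optim.f_calls(res12) - Optim.f_calls(res11))
```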

Ok -- In my use case, following the third option above (not exactly; I made a couple of errors) sped up execution ~100x. It still drags toward the end because it's...

Much faster than my hacky solution! Took only 20-30 seconds.

Because this may not be an `Optim` issue, I also posted here: https://discourse.julialang.org/t/gradient-norm-does-not-change-in-optim-using-autodiff/94215
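
For context, the setup in that Discourse post is roughly of this form (a minimal sketch with a hypothetical objective): `autodiff = :forward` tells Optim to build the gradient with ForwardDiff rather than finite differences.

```julia
using Optim

# Hypothetical objective standing in for the actual model in the Discourse post.
f(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
x0 = zeros(2)

# autodiff = :forward => gradient via ForwardDiff instead of finite differences;
# show_trace prints the gradient norm each iteration, which is the quantity
# that appeared frozen in the linked post.
res = optimize(f, x0, LBFGS(), Optim.Options(show_trace = true);
               autodiff = :forward)
```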