Optim.jl

What does a constant `Gradient norm` indicate?

cseveren opened this issue • 1 comment

What is signified by the output Gradient norm being constant (stuck, not changing) across iterations when using NewtonTrustRegion? As an example, below is the output from the first two iterations of a minimization problem run using NewtonTrustRegion. My initial point is the output of a prior round of optimization, which also got stuck at this same Function value and Gradient norm.

Iter     Function value   Gradient norm 
     0     4.925179e+06     1.951972e+03
 * time: 0.00021505355834960938
     1     4.925179e+06     1.951972e+03
 * time: 0.0005660057067871094
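For context, the call producing this trace looks roughly like the sketch below. The objective and starting point here are just stand-ins for my actual setup, which is far more complex and data-heavy:

```julia
using Optim

# Stand-ins for my actual problem; x0 is the minimizer from a prior optimization round.
f(x) = sum(abs2, x .- 2.0)
x0 = fill(0.5, 10)

# Forward-mode AD supplies the gradient and Hessian that NewtonTrustRegion needs.
td = TwiceDifferentiable(f, x0; autodiff = :forward)

res = optimize(td, x0, NewtonTrustRegion(), Optim.Options(show_trace = true))
```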

Some more detail: if I switch to LBFGS, the optimization continues successfully (the Function value decreases), but of course gradient-only methods are slower, so ideally I would switch back to NewtonTrustRegion. Even if I let LBFGS run for a while so that it finds a moderately different candidate minimizer, the same stuck behavior with a constant Gradient norm re-emerges as soon as I switch back to NewtonTrustRegion (sketched below).
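Concretely, the switch looks roughly like this (continuing from the snippet above; names are placeholders):

```julia
# Run LBFGS for a while to reach a moderately different candidate minimizer...
res_lbfgs = optimize(td, x0, LBFGS(), Optim.Options(iterations = 500, show_trace = true))
x1 = Optim.minimizer(res_lbfgs)

# ...then hand that point back to NewtonTrustRegion, which is where the
# Gradient norm freezes again.
res_ntr = optimize(td, x1, NewtonTrustRegion(), Optim.Options(show_trace = true))
```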

I would provide code, but it is complicated, involves a lot of data, and the problem only occurs in some of the models I've run -- I'm really just hoping to get some intuition about which options or tuning parameters to adjust in order to bounce out of difficult spots. I have already tried allow_f_increases=true; that did not solve the issue.
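To make the question concrete, these are the knobs I'm aware of. The keyword names are taken from the NewtonTrustRegion docstring and Optim.Options in my installed version; the values are illustrative only, not a recommendation:

```julia
method = NewtonTrustRegion(initial_delta = 1.0,   # starting trust-region radius
                           delta_hat     = 100.0, # maximum trust-region radius
                           eta           = 0.1,   # step acceptance threshold
                           rho_lower     = 0.25,  # shrink radius below this ratio
                           rho_upper     = 0.75)  # grow radius above this ratio

opts = Optim.Options(allow_f_increases = true,    # already tried; did not help
                     g_tol = 1e-6,
                     iterations = 1_000,
                     show_trace = true)

res = optimize(td, x0, method, opts)
```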

Excellent package, many thanks.

cseveren avatar Jan 24 '23 23:01 cseveren

Because this may not be an Optim issue, I also posted here:

https://discourse.julialang.org/t/gradient-norm-does-not-change-in-optim-using-autodiff/94215
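In case it's useful, this is the kind of independent check I can run at the stuck point (a sketch; f and x1 stand in for my objective and the stuck iterate), to see whether the AD-computed derivatives themselves look suspicious:

```julia
using ForwardDiff, LinearAlgebra

# Recompute the gradient and Hessian independently of Optim's machinery.
g = ForwardDiff.gradient(f, x1)
H = ForwardDiff.hessian(f, x1)

@show norm(g)                # should match the 1.951972e+03 reported in the trace
@show cond(H)                # a huge condition number could explain stalled steps
@show eigmin(Symmetric(H))   # negative or near-zero curvature is another red flag
```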

cseveren avatar Feb 07 '23 14:02 cseveren

No progress would be my bet. If you can provide more information I can reopen :)

pkofod avatar Apr 29 '24 20:04 pkofod