Optimization.jl

Maxiters not respected

Open MaAl13 opened this issue 1 year ago • 6 comments

Hello,

I have recently been working through the documentation for parameter optimization of ODEs: https://sensitivity.sciml.ai/dev/ode_fitting/optimization_ode/

However, when I tried to decrease maxiters to 5 for multiple algorithms, none of them respected the maxiters that was set. How can I avoid this problem?

MaAl13 avatar Aug 02 '22 11:08 MaAl13
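[For context, the call pattern in question looks roughly like the following minimal sketch. It assumes the Rosenbrock setup used throughout the Optimization.jl docs rather than the exact ODE-fitting tutorial, and uses the OptimizationOptimJL wrapper for Optim.jl's BFGS:]

```julia
using Optimization, OptimizationOptimJL

# Standard Rosenbrock test function: minimum at u = [1, 1] for p = [1, 100]
rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2
u0 = zeros(2)
p = [1.0, 100.0]

optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, u0, p)

# The maxiters keyword is the unified Optimization.jl iteration cap;
# the issue reported here is that Optim.jl's BFGS does not strictly honor it.
sol = solve(prob, BFGS(), maxiters = 5)
```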

Which algorithms?

ChrisRackauckas avatar Aug 02 '22 11:08 ChrisRackauckas

PolyOpt and BFGS. If I set maxiters to 5, they produce more than 5 iterations in my case.

MaAl13 avatar Aug 02 '22 11:08 MaAl13

Both of those are BFGS from Optim.jl, and that's a known issue with Optim.jl.

ChrisRackauckas avatar Aug 02 '22 11:08 ChrisRackauckas

Alright, so which package would you then recommend instead?

MaAl13 avatar Aug 02 '22 11:08 MaAl13

Also would you prefer DiffEqParamEstim over Optimization.jl for parameter estimation with ODEs?

MaAl13 avatar Aug 02 '22 11:08 MaAl13

NLopt.jl tends to behave better here. PolyOpt should probably be changed to use it.

> Also would you prefer DiffEqParamEstim over Optimization.jl for parameter estimation with ODEs?

That's not a sensible comparison. DiffEqParamEstim is just a system that automates the generation of loss functions. You still have to choose an optimizer to optimize the loss function with, and the most sensible choice in 2022 would be Optimization.jl.

ChrisRackauckas avatar Aug 02 '22 14:08 ChrisRackauckas
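[Following the suggestion above, a minimal sketch of the same problem solved through the NLopt.jl wrapper, which tends to respect the iteration cap. This assumes the OptimizationNLopt package and the Rosenbrock setup from the Optimization.jl docs, not the exact tutorial code:]

```julia
using Optimization, OptimizationNLopt

rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2
u0 = zeros(2)
p = [1.0, 100.0]

optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, u0, p)

# NLopt's gradient-based LBFGS; the maxiters keyword maps to NLopt's
# evaluation limit and is enforced by the backend.
sol = solve(prob, NLopt.LD_LBFGS(), maxiters = 5)
```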