Optimization.jl
Maxiters not respected
Hello,
I have lately been working through the docs for parameter optimization of ODEs: https://sensitivity.sciml.ai/dev/ode_fitting/optimization_ode/
However, when I tried to decrease maxiters to 5 for multiple algorithms, none of them respected the limit that was set. How can I avoid this problem?
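For context, here is a minimal sketch of how maxiters is passed in Optimization.jl (using a Rosenbrock objective as a stand-in for the tutorial's ODE loss; the exact problem setup is an assumption, not from the thread):

```julia
using Optimization, OptimizationOptimJL

# Stand-in objective; the tutorial uses an ODE-fitting loss instead
rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, [0.0, 0.0], [1.0, 100.0])

# maxiters is forwarded to the underlying optimizer. With Optim.jl's BFGS
# it maps to Optim's outer-iteration count, not objective evaluations, so
# the callback can fire more than 5 times even with maxiters = 5.
sol = solve(prob, BFGS(), maxiters = 5)
```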
Which algorithms?
PolyOpt and BFGS. If I set maxiters to 5, they produce more than 5 iterations in my case.
Both of those are BFGS from Optim.jl, and that's a known issue with Optim.jl.
Alright, so which package would you then recommend instead?
Also would you prefer DiffEqParamEstim over Optimization.jl for parameter estimation with ODEs?
NLopt.jl tends to behave better here. PolyOpt should probably change to using that.
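Switching to an NLopt algorithm is only a one-line change in the solve call. A minimal sketch (again with a stand-in objective, which is my assumption, not part of the thread):

```julia
using Optimization, OptimizationNLopt

rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, [0.0, 0.0], [1.0, 100.0])

# NLopt's L-BFGS; NLopt enforces its evaluation/iteration limits
# more strictly than Optim.jl does
sol = solve(prob, NLopt.LD_LBFGS(), maxiters = 5)
```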
Also would you prefer DiffEqParamEstim over Optimization.jl for parameter estimation with ODEs?
That's not a sensible question: the two are not alternatives. DiffEqParamEstim is just a system which automates the generation of loss functions. You still have to choose an optimizer to minimize the loss function with, and the most sensible choice in 2022 would be Optimization.jl.
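In other words, the two packages compose. A hedged sketch of that division of labor, fitting a Lotka-Volterra model to synthetic data (the specific model, true parameters, and initial guess are illustrative assumptions):

```julia
using OrdinaryDiffEq, DiffEqParamEstim, Optimization, OptimizationOptimJL

# Lotka-Volterra with two unknown parameters (assumed example)
function lotka!(du, u, p, t)
    du[1] = p[1] * u[1] - p[2] * u[1] * u[2]
    du[2] = -3.0 * u[2] + u[1] * u[2]
end
prob = ODEProblem(lotka!, [1.0, 1.0], (0.0, 10.0), [1.5, 1.0])

# Synthetic data generated from the "true" parameters
t = collect(range(0, 10, length = 200))
data = Array(solve(prob, Tsit5(), saveat = t))

# DiffEqParamEstim generates the loss function...
obj = build_loss_objective(prob, Tsit5(), L2Loss(t, data),
                           Optimization.AutoForwardDiff())

# ...and Optimization.jl chooses and runs the optimizer on it
optprob = OptimizationProblem(obj, [1.2, 0.8])  # initial parameter guess
sol = solve(optprob, BFGS())
```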