
Cannot use `IntegrationProfiler` with implicit solvers

TorkelE opened this issue 11 months ago · 1 comment

This might well be something you shouldn't do (but if so, it might make sense to mention it in the docs). Either way, running this is fine:

using Optimization, ForwardDiff
rosenbrock(x,p) = (1.0 - x[1])^2 + 100.0*(x[2] - x[1]^2)^2
x0 = zeros(2)
optf = OptimizationFunction(rosenbrock, AutoForwardDiff())
optprob = OptimizationProblem(optf, x0)
sol = solve(optprob, Optimization.LBFGS())

using LikelihoodProfiler, Plots
optpars = sol.u
plprob = PLProblem(optprob, optpars, (-10.,10.); threshold = 4.0)

using OrdinaryDiffEq
method = IntegrationProfiler(integrator = Tsit5(), integrator_opts = (dtmax=0.3,), matrix_type = :hessian)
sol = profile(plprob, method)

Trying another explicit solver also works:

method = IntegrationProfiler(integrator = Vern7(), integrator_opts = (dtmax=0.3,), matrix_type = :hessian)
sol = profile(plprob, method)

But implicit ones fail:

method = IntegrationProfiler(integrator = Rodas5P(), integrator_opts = (dtmax=0.3,), matrix_type = :hessian)
sol = profile(plprob, method)
method = IntegrationProfiler(integrator = Rosenbrock23(), integrator_opts = (dtmax=0.3,), matrix_type = :hessian)
sol = profile(plprob, method)

with a

ERROR: First call to automatic differentiation for the Jacobian
failed. This means that the user `f` function is not compatible
with automatic differentiation. Methods to fix this include:

1. Turn off automatic differentiation (e.g. Rosenbrock23() becomes
   Rosenbrock23(autodiff=false)). More details can be found at
   https://docs.sciml.ai/DiffEqDocs/stable/features/performance_overloads/
2. Improving the compatibility of `f` with ForwardDiff.jl automatic
   differentiation (using tools like PreallocationTools.jl). More details
   can be found at https://docs.sciml.ai/DiffEqDocs/stable/basics/faq/#Autodifferentiation-and-Dual-Numbers
3. Defining analytical Jacobians. More details can be
   found at https://docs.sciml.ai/DiffEqDocs/stable/types/ode_types/#SciMLBase.ODEFunction

Note: turning off automatic differentiation tends to have a very minimal
performance impact (for this use case, because it's forward mode for a
square Jacobian. This is different from optimization gradient scenarios).
However, one should be careful as some methods are more sensitive to
accurate gradients than others. Specifically, Rodas methods like `Rodas4`
and `Rodas5P` require accurate Jacobians in order to have good convergence,
while many other methods like BDF (`QNDF`, `FBDF`), SDIRK (`KenCarp4`),
and Rosenbrock-W (`Rosenbrock23`) do not. Thus if using an algorithm which
is sensitive to autodiff and solving at a low tolerance, please change the
algorithm as well.

MethodError: no method matching Float64(::ForwardDiff.Dual{ForwardDiff.Tag{DiffEqBase.OrdinaryDiffEqTag, Float64}, Float64, 1})
The type `Float64` exists, but no method is defined for this combination of argument types when trying to construct it.
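For context, this kind of MethodError typically appears when ForwardDiff's Dual numbers are pushed through code that writes into a container hard-typed as `Float64`. A minimal sketch reproducing the same failure mode (hypothetical illustration, not LikelihoodProfiler's actual internals):

```julia
using ForwardDiff

# A function that stores an intermediate result in a Float64-typed cache.
function f_with_cache!(cache::Vector{Float64}, x)
    cache[1] = x[1]^2          # a Float64 slot cannot hold a Dual number
    return cache[1] + x[2]
end

cache = zeros(1)
g(x) = f_with_cache!(cache, x)

g([1.0, 2.0])                  # plain Float64 input works fine

# ForwardDiff feeds Dual numbers through g, so the assignment above throws:
# MethodError: no method matching Float64(::ForwardDiff.Dual{...})
err = try
    ForwardDiff.gradient(g, [1.0, 2.0])
    nothing
catch e
    e
end
err isa MethodError            # true
```

Setting `autodiff = false` makes the implicit solver compute Jacobians by finite differencing with plain `Float64` numbers, which is why the workaround below avoids the error.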

This is quite self-explanatory. Running

method = IntegrationProfiler(integrator = Rosenbrock23(autodiff = false), integrator_opts = (dtmax=0.3,), matrix_type = :hessian)
sol = profile(plprob, method)

works fine.

Is this all fine? I don't know much about this, but figured I should report it at least.

TorkelE · Mar 19 '25

Thank you! It should work; I need to look into it.

ivborissov · Mar 19 '25