Lots of "function cannot be proven read-only" failures suddenly popping up
Function argument passed to autodiff cannot be proven readonly.
If the the function argument cannot contain derivative data, instead call autodiff(Mode, Const(f), ...)
See https://enzyme.mit.edu/index.fcgi/julia/stable/faq/#Activity-of-temporary-storage for more information.
The potentially writing call is store {} addrspace(10)* %.fca.0.1.extract, {} addrspace(10)** %.fca.0.1.gep, align 8, !dbg !9, !noalias !21, using %.fca.0.1.gep = getelementptr inbounds { { {} addrspace(10)*, {} addrspace(10)*, {} addrspace(10)*, {} addrspace(10)*, double }, [1 x i8] }, { { {} addrspace(10)*, {} addrspace(10)*, {} addrspace(10)*, {} addrspace(10)*, double }, [1 x i8] }* %.innerparm, i64 0, i32 0, i32 1, !dbg !9
https://github.com/SciML/OrdinaryDiffEq.jl/actions/runs/14539831121/job/40795481724?pr=2677#step:6:1384
https://github.com/SciML/OrdinaryDiffEq.jl/actions/runs/14539831121/job/40795503066?pr=2677#step:6:1392
Seen in multiple CIs. Any clue as to where this is coming from? Was there a recent breaking change in Enzyme?
There hasn't been a release of Enzyme.jl in a while, so unlikely?
Do you have an MWE?
@gdalle since the error is coming from DI
@ChrisRackauckas did you ever run these tests with Enzyme called directly through its native API? Or was Enzyme not tested at all in OrdinaryDiffEq before the DI overhaul? Most importantly, was there ever a point in time where these tests passed with the DI version? If the answer to that last question is yes, and these tests only started failing today, then the culprit may be v0.13.36 of Enzyme (released today). So I wouldn't be so sure that the error is "coming from" DI, because the Enzyme bindings there haven't changed in a while. I presume something was altered in Enzyme regarding the way that functions are proven to be read-only?
DI offers a solution for this, which is to use AutoEnzyme(function_annotation=Enzyme.Const).
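For reference, a sketch of that backend configuration (assuming the ADTypes `AutoEnzyme` constructor; this is only valid when the function or closure itself carries no derivative data):

```julia
using ADTypes, Enzyme

# function_annotation = Enzyme.Const tells DI to pass the differentiated
# function as Const, so Enzyme skips the read-only proof. Only safe when
# the closure captures no differentiable state.
backend = AutoEnzyme(; mode = Enzyme.Reverse, function_annotation = Enzyme.Const)
```

Note that this is the opposite of what the closures later in this thread need, since those do capture arrays like du and u.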
Now that my PR on exception types has been merged here, once Enzyme tags a new release, DI can use these types to provide nicer error hints for situations like the current one. But it doesn't change the fact that if these failures appeared today, Enzyme's behavior possibly changed for real, and DI may just be the messenger.
@gdalle That patch release basically only contained your error type PR, so if only that patch errors, we can revert and re-patch.
@ChrisRackauckas can you try your stuff with the previous version of Enzyme?
In the interim I reverted your error PR and re-patched, since that is the only thing that touched anything vaguely related.
Once this is fixed, @gdalle feel free to re-open your PR (ideally with a test, if it is the cause here).
These tests were passing until yesterday. I can try re-running the tests and see what happens with the new patch.
Seems like something is still sad. @gdalle can you get a reproducer without DI?
Honestly this stuff is buried deep inside OrdinaryDiffEq, so the complexity doesn't come from DI itself. People used to the SciML stack will get it done much faster (also I'm on vacation this week). The truth is that, as more and more packages build upon DI, it will get harder and harder to boil everything down to a pure-Enzyme MWE. In this case, DI enables using Enzyme inside an ODE, so yes we see failures that we may not see outside of ODEs. Whether that's a good or a bad thing depends on your goals, but I'd say it's rather a good thing.
There were four PRs in the Enzyme v0.13.36 release, which seems to have caused the breakage. We've established it wasn't my error PR, which leaves only three. Do you have any clue which of those it could be, and why? Or could it be a change in the jll instead of the Julia bindings? Or another dep?
The three PRs in v0.13.36 are:
- #2350 (fairly straightforward, since it just filters out some pre-header instructions)
- #2352 (shown to be unrelated, and I would have been very surprised if it were the cause, since it was essentially NFC)
- #2342 (likely unrelated, since it fixes an error during compilation)
v0.13.35:
- Supports GPUCompiler 1.3
@ChrisRackauckas do you know which Enzyme version was used prior? It might be the case that there was an incompatibility somewhere else that blocked a couple of updates?
Yeah there similarly hasn't been a jll release for like a month (I just made one, after all this).
Maybe the GPUCompiler compat meant you had an old Enzyme?
If it helps, here's a CI run from 2 weeks ago:
https://github.com/SciML/OrdinaryDiffEq.jl/actions/runs/14367632037/job/40284221662
Looks like we had a lull for a week in that repo. The one from 2 weeks ago was fine (the Aqua tests fail for a different reason, just an unfinished PR branch), the Taylor one didn't run, and then the initdt one from 4 days ago
https://github.com/SciML/OrdinaryDiffEq.jl/actions/runs/14522100726/job/40745257902#step:6:1388
seems to be the first occurrence of it in our CI. So there's a good week in there, a longer time frame than I'd thought, but maybe that helps narrow down package version changes?
The failing one was:
[a0c0ee7d] DifferentiationInterface v0.6.52
[7da242da] Enzyme v0.13.35
[f151be2c] EnzymeCore v0.8.8
[7cc45869] Enzyme_jll v0.0.173+
while the one that passed was:
[a0c0ee7d] DifferentiationInterface v0.6.50
[f151be2c] EnzymeCore v0.8.8
[7da242da] Enzyme v0.13.35
[7cc45869] Enzyme_jll v0.0.173+0
So my guess then is that it must be DI related?
Testing this theory here: https://github.com/SciML/OrdinaryDiffEq.jl/pull/2680
Nothing weird is jumping out at me: https://github.com/JuliaDiff/DifferentiationInterface.jl/compare/DifferentiationInterface-v0.6.50...DifferentiationInterface-v0.6.52
Nothing between these two DI versions seems to have anything to do with Enzyme. Can the change of behavior be linked to some other part of the SciML stack, which would have made Enzyme's read-only analysis behave differently?
Update: setting DI back to v0.6.50 didn't work. https://github.com/SciML/OrdinaryDiffEq.jl/actions/runs/14579019896/job/40891421732?pr=2680#step:6:1381. I have no idea what it could be then. I don't even know what I'm looking for. Did a Julia patch come out that I don't know about or something?
Can you provide an MWE with two complete manifests, one in which it fails and one in which it succeeds? It's easier than digging through CI. Given the minor differences in DI and Enzyme though, I suspect this is a SciML issue
Okay, I figured out how to do it and what's kind of going on. using Enzyme in a different test triggers the ability to use Enzyme for vjps in the line search method of NonlinearSolve.jl. So a different change enabled this to now test more things.
But... it's not clear why this would fail in the first place?
using FastClosures, NonlinearSolve, Enzyme
t = 0.0
p = [0.04, 3e7, 1e4]
u₀ = [1.0, 0, 0]
du₀ = [-0.04, 0.04, 0.0]
tspan = (0.0, 100000.0)
differential_vars = [true, true, false]
function f(du, u, p, t)
    [-p[1] * u[1] + p[3] * u[2] * u[3] - du[1],
     +p[1] * u[1] - p[2] * u[2]^2 - p[3] * u[2] * u[3] - du[2],
     u[1] + u[2] + u[3] - 1.0]
end
prob_oop = DAEProblem{false}(f, du₀, u₀, tspan, p, differential_vars = differential_vars)
u = [1.0, 0, 0]
du = [-0.04, 0.04, 0.0]
nlequation = @closure (x, _) -> begin
    du_ = ifelse.(differential_vars, x, du)
    u_ = ifelse.(differential_vars, u, x)
    prob_oop.f(du_, u_, prob_oop.p, t)
end
nlfunc = NonlinearFunction(nlequation)
nlprob = NonlinearProblem(nlfunc, ifelse.(differential_vars, du, u))
nlsolve = FastShortcutNonlinearPolyalg()
nlsol = solve(nlprob, nlsolve)
When I pull this out of the package context, it works just fine. It seems like the pass for "Function argument passed to autodiff cannot be proven readonly." is module dependent? Running the same function in OrdinaryDiffEqNonlinearSolve causes it to not pass the test for Enzyme, but then in my REPL it's fine?
It seems like an undesirable property for the test to pass/fail based on context like that, so there still seems to be something to do there.
Even with the same exact ForwardDiff context it does not reproduce outside of the package:
using FastClosures, NonlinearSolve, Enzyme
using OrdinaryDiffEqCore
using ForwardDiff: Dual
t = 0.0
p = [0.04, 3e7, 1e4]
u₀ = [1.0, 0, 0]
du₀ = [-0.04, 0.04, 0.0]
tspan = (0.0, 100000.0)
differential_vars = [true, true, false]
p = ForwardDiff.Dual{ForwardDiff.Tag{var"#f#35"{DAEProblem{Vector{Float64}, Vector{Float64}, Tuple{Float64, Float64}, true, Vector{Float64}, DAEFunction{true, SciMLBase.FullSpecialize, typeof(f), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing}, Base.Pairs{Symbol, Union{}, Tuple{}, @NamedTuple{}}, Vector{Bool}}, BrownFullBasicInit{Float64, Nothing}, DFBDF{5, 0, AutoForwardDiff{nothing, Nothing}, Nothing, OrdinaryDiffEqNonlinearSolve.NLNewton{Rational{Int64}, Rational{Int64}, Rational{Int64}, Rational{Int64}}, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}(), true, nothing, Nothing, Nothing}}, Float64}, Float64, 3}[Dual{ForwardDiff.Tag{var"#f#35"{DAEProblem{Vector{Float64}, Vector{Float64}, Tuple{Float64, Float64}, true, Vector{Float64}, DAEFunction{true, SciMLBase.FullSpecialize, typeof(f), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing}, Base.Pairs{Symbol, Union{}, Tuple{}, @NamedTuple{}}, Vector{Bool}}, BrownFullBasicInit{Float64, Nothing}, DFBDF{5, 0, AutoForwardDiff{nothing, Nothing}, Nothing, OrdinaryDiffEqNonlinearSolve.NLNewton{Rational{Int64}, Rational{Int64}, Rational{Int64}, Rational{Int64}}, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}(), true, nothing, Nothing, Nothing}}, Float64}}(0.04,1.0,0.0,0.0), Dual{ForwardDiff.Tag{var"#f#35"{DAEProblem{Vector{Float64}, Vector{Float64}, Tuple{Float64, Float64}, true, Vector{Float64}, DAEFunction{true, SciMLBase.FullSpecialize, typeof(f), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing}, Base.Pairs{Symbol, Union{}, Tuple{}, @NamedTuple{}}, Vector{Bool}}, BrownFullBasicInit{Float64, Nothing}, DFBDF{5, 0, AutoForwardDiff{nothing, Nothing}, Nothing, 
OrdinaryDiffEqNonlinearSolve.NLNewton{Rational{Int64}, Rational{Int64}, Rational{Int64}, Rational{Int64}}, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}(), true, nothing, Nothing, Nothing}}, Float64}}(3.0e7,0.0,1.0,0.0), Dual{ForwardDiff.Tag{var"#f#35"{DAEProblem{Vector{Float64}, Vector{Float64}, Tuple{Float64, Float64}, true, Vector{Float64}, DAEFunction{true, SciMLBase.FullSpecialize, typeof(f), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing}, Base.Pairs{Symbol, Union{}, Tuple{}, @NamedTuple{}}, Vector{Bool}}, BrownFullBasicInit{Float64, Nothing}, DFBDF{5, 0, AutoForwardDiff{nothing, Nothing}, Nothing, OrdinaryDiffEqNonlinearSolve.NLNewton{Rational{Int64}, Rational{Int64}, Rational{Int64}, Rational{Int64}}, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}(), true, nothing, Nothing, Nothing}}, Float64}}(10000.0,0.0,0.0,1.0)]
function f(du, u, p, t)
    [-p[1] * u[1] + p[3] * u[2] * u[3] - du[1],
     +p[1] * u[1] - p[2] * u[2]^2 - p[3] * u[2] * u[3] - du[2],
     u[1] + u[2] + u[3] - 1.0]
end
prob_oop = DAEProblem{false}(f, eltype(p).(du₀), eltype(p).(u₀), eltype(p).(tspan), p, differential_vars = differential_vars)
u = eltype(p).([1.0, 0, 0])
du = eltype(p).([-0.04, 0.04, 0.0])
nlequation = @closure (x, _) -> begin
    du_ = ifelse.(prob_oop.differential_vars, x, du)
    u_ = ifelse.(prob_oop.differential_vars, u, x)
    prob_oop.f(du_, u_, prob_oop.p, t)
end
nlfunc = NonlinearFunction(nlequation)
nlprob = NonlinearProblem(nlfunc, ifelse.(differential_vars, du, u))
nlsolve = FastShortcutNonlinearPolyalg()
nlsol = solve(nlprob, nlsolve)
Update: it's just really finicky; homing in on an MWE:
using FastClosures, NonlinearSolve, Enzyme
using OrdinaryDiffEqCore
using ForwardDiff: Dual
t = 0.0
p = [0.04, 3e7, 1e4]
u₀ = [1.0, 0, 0]
du₀ = [-0.04, 0.04, 0.0]
tspan = (0.0, 100000.0)
differential_vars = [true, true, false]
function f(du, u, p, t)
    [-p[1] * u[1] + p[3] * u[2] * u[3] - du[1],
     +p[1] * u[1] - p[2] * u[2]^2 - p[3] * u[2] * u[3] - du[2],
     u[1] + u[2] + u[3] - 1.0]
end
prob_oop = DAEProblem{false}(f, du₀, u₀, tspan, p, differential_vars = differential_vars)
u = copy(u₀)
du = copy(du₀)
nlequation = @closure (x, _) -> begin
    du_ = ifelse.(prob_oop.differential_vars, x, du)
    u_ = ifelse.(prob_oop.differential_vars, u, x)
    prob_oop.f(du_, u_, prob_oop.p, t)
end
nlfunc = NonlinearFunction(nlequation; jac_prototype = nothing)
nlprob = NonlinearProblem(nlfunc, ifelse.(differential_vars, du, u))
alg = FastShortcutNonlinearPolyalg(
    eltype(nlprob.u0), Val(false), Val(false);
    u0_len = length(nlprob.u0)
)
nlsol = solve(nlprob, NewtonRaphson(linesearch = BackTracking()))
I think I can get it a step closer.
using FastClosures, NonlinearSolve, Enzyme
using OrdinaryDiffEqCore
using ForwardDiff: Dual
t = 0.0
p = [0.04, 3e7, 1e4]
u₀ = [1.0, 0, 0]
du₀ = [-0.04, 0.04, 0.0]
tspan = (0.0, 100000.0)
differential_vars = [true, true, false]
function f(du, u, p, t)
    [-p[1] * u[1] + p[3] * u[2] * u[3] - du[1],
     +p[1] * u[1] - p[2] * u[2]^2 - p[3] * u[2] * u[3] - du[2],
     u[1] + u[2] + u[3] - 1.0]
end
prob_oop = DAEProblem{false}(f, du₀, u₀, tspan, p, differential_vars = differential_vars)
u = copy(u₀)
du = copy(du₀)
nlequation = @closure (x, _) -> begin
    du_ = ifelse.(prob_oop.differential_vars, x, du)
    u_ = ifelse.(prob_oop.differential_vars, u, x)
    prob_oop.f(du_, u_, prob_oop.p, t)
end
nlfunc = NonlinearFunction(nlequation; jac_prototype = nothing)
nlprob = NonlinearProblem(nlfunc, ifelse.(differential_vars, du, u))
cache = init(nlprob, BackTracking(), du, u; autodiff = ADTypes.AutoEnzyme(; mode = Enzyme.Reverse))
solve!(cache, du, u)
Removed the nonlinear solver, now it's just in the line search.
Even simpler:
using FastClosures, NonlinearSolve, Enzyme
using OrdinaryDiffEqCore
using ForwardDiff: Dual
t = 0.0
p = [0.04, 3e7, 1e4]
u₀ = [1.0, 0, 0]
du₀ = [-0.04, 0.04, 0.0]
tspan = (0.0, 100000.0)
differential_vars = [true, true, false]
function f(du, u, p, t)
    [-p[1] * u[1] + p[3] * u[2] * u[3] - du[1],
     +p[1] * u[1] - p[2] * u[2]^2 - p[3] * u[2] * u[3] - du[2],
     u[1] + u[2] + u[3] - 1.0]
end
prob_oop = DAEProblem{false}(f, du₀, u₀, tspan, p, differential_vars = differential_vars)
u = copy(u₀)
du = copy(du₀)
fu = copy(u₀)
nlequation = @closure (x, _) -> begin
    du_ = ifelse.(prob_oop.differential_vars, x, du)
    u_ = ifelse.(prob_oop.differential_vars, u, x)
    prob_oop.f(du_, u_, prob_oop.p, t)
end
nlfunc = NonlinearFunction(nlequation; jac_prototype = nothing)
nlprob = NonlinearProblem(nlfunc, ifelse.(differential_vars, du, u))
autodiff = ADTypes.AutoEnzyme(; mode = Enzyme.Reverse)
using SciMLJacobianOperators
vjp_op = SciMLJacobianOperators.prepare_vjp(SciMLJacobianOperators.False, nlprob, nlprob.f, u, fu; autodiff = autodiff)
vjp = @closure (du, u, fu, p) -> dot(du, vjp_op(fu, u, p))
vjp(du, u, fu, p)
Okay, ignore above. Those are notes for me to remember how I got here.
I got an MWE! I got this boiled down to basically just DI and Enzyme. The MWE is:
using FastClosures, NonlinearSolve, Enzyme, LinearAlgebra, ADTypes
using SciMLBase
using DifferentiationInterface: DifferentiationInterface as DI
t = 0.0
p = [0.04, 3e7, 1e4]
u₀ = [1.0, 0, 0]
du₀ = [-0.04, 0.04, 0.0]
tspan = (0.0, 100000.0)
differential_vars = [true, true, false]
function f(du, u, p, t)
    [-p[1] * u[1] + p[3] * u[2] * u[3] - du[1],
     +p[1] * u[1] - p[2] * u[2]^2 - p[3] * u[2] * u[3] - du[2],
     u[1] + u[2] + u[3] - 1.0]
end
_f = DAEFunction{false}(f)
u = copy(u₀)
du = copy(du₀)
fu = copy(u₀)
nlequation = @closure (x, _) -> begin
    du_ = ifelse.(differential_vars, x, du)
    u_ = ifelse.(differential_vars, u, x)
    _f(du_, u_, p, t)
end
# Enzyme.Forward hits the same error
autodiffchoice = ADTypes.AutoEnzyme(; mode = Enzyme.Reverse)
di_extras = DI.prepare_pullback(nlequation, autodiffchoice, u, (fu,), DI.Constant(p))
vjp_op = @closure (v, u, p) -> begin
    return only(DI.pullback(
        nlequation, di_extras, autodiffchoice, u, (reshape(v, size(fu)),), DI.Constant(p)))
end
vjp_op(fu, u, p)
ERROR: Function argument passed to autodiff cannot be proven readonly.
If the the function argument cannot contain derivative data, instead call autodiff(Mode, Const(f), ...)
See https://enzyme.mit.edu/index.fcgi/julia/stable/faq/#Activity-of-temporary-storage for more information.
The potentially writing call is store double %137, double addrspace(13)* %139, align 8, !dbg !191, !tbaa !184, !alias.scope !85, !noalias !86, using %137 = select i1 %.not550, double %136, double %133, !dbg !187
Stacktrace:
[1] setindex!
@ .\array.jl:987 [inlined]
[2] macro expansion
@ C:\Users\accou\.julia\packages\Enzyme\txp7d\src\compiler\interpreter.jl:851 [inlined]
[3] macro expansion
@ .\simdloop.jl:77 [inlined]
[4] override_bc_materialize
@ C:\Users\accou\.julia\packages\Enzyme\txp7d\src\compiler\interpreter.jl:849 [inlined]
[5] #31
@ c:\Users\accou\OneDrive\Computer\Desktop\test.jl:64 [inlined]
[6] augmented_julia__31_32463_inner_3wrap
@ c:\Users\accou\OneDrive\Computer\Desktop\test.jl:0
Now, this MWE is wild because SciMLBase is required, only for _f = DAEFunction{false}(f). All that does is wrap the function so it has a trivial call: https://github.com/SciML/SciMLBase.jl/blob/v2.85.0/src/scimlfunctions.jl#L2597. That seems to be required for the MWE. If I change from using _f to f inside of nlequation, I have the following MWE:
using FastClosures, NonlinearSolve, Enzyme, LinearAlgebra, ADTypes
using DifferentiationInterface: DifferentiationInterface as DI
t = 0.0
p = [0.04, 3e7, 1e4]
u₀ = [1.0, 0, 0]
du₀ = [-0.04, 0.04, 0.0]
tspan = (0.0, 100000.0)
differential_vars = [true, true, false]
function f(du, u, p, t)
    [-p[1] * u[1] + p[3] * u[2] * u[3] - du[1],
     +p[1] * u[1] - p[2] * u[2]^2 - p[3] * u[2] * u[3] - du[2],
     u[1] + u[2] + u[3] - 1.0]
end
u = copy(u₀)
du = copy(du₀)
fu = copy(u₀)
nlequation = @closure (x, _) -> begin
    du_ = ifelse.(differential_vars, x, du)
    u_ = ifelse.(differential_vars, u, x)
    f(du_, u_, p, t)
end
# Enzyme.Forward hits the same segfault
autodiffchoice = ADTypes.AutoEnzyme(; mode = Enzyme.Reverse)
di_extras = DI.prepare_pullback(nlequation, autodiffchoice, u, (fu,), DI.Constant(p))
vjp_op = @closure (v, u, p) -> begin
    return only(DI.pullback(
        nlequation, di_extras, autodiffchoice, u, (reshape(v, size(fu)),), DI.Constant(p)))
end
vjp_op(fu, u, p)
and this causes a segfault! So a wrapper that does nothing changes the behavior from failing to prove the function read-only to outright segfaulting... something is really fishy here.
So then questions:
- Why does it segfault? I get no message in the terminal at all, so maybe someone else needs to run it on a different OS to see.
- Why does _f = DAEFunction{false}(f), with (f::DAEFunction)(args...) = f.f(args...), make it no longer segfault? That seems like it would be a no-op.
- Why can't it infer that the function is read-only in the first place? It seems like a rather trivial function.
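On the second question: the wrapper really is just a call-through. A hypothetical stand-in (not the actual SciMLBase definition) behaves identically to the bare function:

```julia
# Minimal stand-in for a call-through wrapper like DAEFunction:
# a struct holding a function whose call simply forwards all arguments.
struct Wrap{F}
    f::F
end
(w::Wrap)(args...) = w.f(args...)

g(du, u, p, t) = u .+ du .+ p .* t
w = Wrap(g)
w([1.0], [2.0], [3.0], 4.0) == g([1.0], [2.0], [3.0], 4.0)  # true
```

Behaviorally it's a no-op, yet it changes what Enzyme's analysis concludes, which is what makes this so strange.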
Does it err without DI?
Regarding it not reproducing in the REPL:
u = copy(u₀)
du = copy(du₀)
fu = copy(u₀)
nlequation = @closure (x, _) -> begin
    du_ = ifelse.(differential_vars, x, du)
    u_ = ifelse.(differential_vars, u, x)
    f(du_, u_, p, t)
end
has different semantics at top level than it has inside a scope, e.g. whether u gets captured or is turned into a GlobalRef.
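A quick way to see the difference (pure Julia, names illustrative): a closure over globals has no captured fields, while one built inside a function stores its captures, and that captured storage is what Enzyme's activity analysis has to reason about in package code:

```julia
du_glob = [1.0, 2.0]
top_level = x -> x .+ du_glob        # references the global via GlobalRef

make_nleq(du) = x -> x .+ du         # captures the local `du` as a field
from_function = make_nleq([1.0, 2.0])

fieldcount(typeof(top_level))      # 0: nothing captured
fieldcount(typeof(from_function))  # 1: `du` stored inside the closure
```

This is why a snippet pasted at the REPL's top level may not reproduce what happens inside OrdinaryDiffEqNonlinearSolve.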
Note that reproducing this outside of DI would be much easier if Enzyme provided utilities for VJPs with seeds different from 1. The purpose of #2309 was upstreaming that sugar, let me know if you need more from me there.
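In the meantime, the by-hand pattern for a VJP with a non-unit seed against Enzyme's native API can be sketched like this (a toy in-place function, names illustrative; the seed vector is placed in the output shadow before the reverse pass):

```julia
using Enzyme

# In-place toy function: y = 2x, so the Jacobian is 2I.
g!(y, x) = (y .= 2 .* x; nothing)

x = [1.0, 2.0, 3.0]
dx = zeros(3)            # receives the VJP result
y = zeros(3)
dy = [1.0, 0.5, 0.0]     # arbitrary seed v, not just ones

# The reverse pass accumulates J' * v into dx and consumes (zeroes) dy.
Enzyme.autodiff(Reverse, g!, Const, Duplicated(y, dy), Duplicated(x, dx))
# dx == [2.0, 1.0, 0.0], i.e. J' * v with J = 2I
```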
I suspect this would fail for Forward mode as well.
My colleague @efaulhaber yesterday found this:
using OrdinaryDiffEqRosenbrock, ADTypes
import Enzyme
function f(du, u, p, t)
    du .= u .+ u .^ 2
end
u0 = [1.0, 0.5]
tspan = (0.0, 1.0)
prob = ODEProblem(f, u0, tspan)
# Solve using an implicit method
sol = solve(prob, Rodas5(autodiff = AutoEnzyme()))
Function argument passed to autodiff cannot be proven readonly.
If the the function argument cannot contain derivative data, instead call autodiff(Mode, Const(f), ...)
See https://enzyme.mit.edu/index.fcgi/julia/stable/faq/#Activity-of-temporary-storage for more information.
The potentially writing call is call void %20(i64 %19, {} addrspace(10)* nonnull %1, {} addrspace(10)* nonnull %2, {} addrspace(10)* addrspacecast ({}* inttoptr (i64 137725054489888 to {}*) to {} addrspace(10)*), double %8) [ "jl_roots"({} addrspace(10)* %2, {} addrspace(10)* %1) ], !dbg !58, using %20 = inttoptr i64 %value_phi to void (i64, {} addrspace(10)*, {} addrspace(10)*, {} addrspace(10)*, double)*, !dbg !58
Stacktrace:
[1] jacobian!(J::Matrix{…}, f::Function, x::Vector{…}, fx::Vector{…}, integrator::OrdinaryDiffEqCore.ODEIntegrator{…}, jac_config::Tuple{…})
@ OrdinaryDiffEqDifferentiation ~/.julia/packages/OrdinaryDiffEqDifferentiation/M5647/src/derivative_wrappers.jl:223
and I would hope that DI uses Forward Mode for Jacobian?
Oh yes, sorry, my examples also fail the same way with Forward mode. So yes, that example seems to be hitting the same behavior. I don't know if it's the way that DI wraps things, how we are setting them up for DI, or if the Enzyme check here is brittle (or a mixture of all three), but it seems like however we are set up makes it easy to trigger, and it would be good to find out why.
@ChrisRackauckas the crux for me is
nlequation = @closure (x, _) -> begin
    du_ = ifelse.(differential_vars, x, du)
    u_ = ifelse.(differential_vars, u, x)
    f(du_, u_, p, t)
end
We need a shadow for du and u since they are coming from the "environment"/closure.
If you don't use @closure then you get:
ERROR: Constant memory is stored (or returned) to a differentiable variable.
As a result, Enzyme cannot provably ensure correctness and throws this error.
This might be due to the use of a constant variable as temporary storage for active memory (https://enzyme.mit.edu/julia/stable/faq/#Runtime-Activity).
Which is precisely what I would expect for global variables used like this.
Like:
using FastClosures, Enzyme
t = 0.0
p = [0.04, 3e7, 1e4]
u₀ = [1.0, 0, 0]
du₀ = [-0.04, 0.04, 0.0]
tspan = (0.0, 100000.0)
differential_vars = [true, true, false]
function f(du, u, p, t)
    [-p[1] * u[1] + p[3] * u[2] * u[3] - du[1],
     +p[1] * u[1] - p[2] * u[2]^2 - p[3] * u[2] * u[3] - du[2],
     u[1] + u[2] + u[3] - 1.0]
end
u = copy(u₀)
du = copy(du₀)
fu = copy(u₀)
nlequation = @closure (x) -> begin
    du_ = ifelse.(differential_vars, x, du)
    u_ = ifelse.(differential_vars, u, x)
    f(du_, u_, p, t)
end
Enzyme.jacobian(Forward, nlequation, u)
Errors with:
ERROR: Function argument passed to autodiff cannot be proven readonly.
If the the function argument cannot contain derivative data, instead call autodiff(Mode, Const(f), ...)
See https://enzyme.mit.edu/index.fcgi/julia/stable/faq/#Activity-of-temporary-storage for more information.
The potentially writing call is store double %136, double addrspace(13)* %138, align 8, !dbg !192, !tbaa !185, !alias.scope !86, !noalias !87, using %136 = select i1 %.not550, double %135, double %132, !dbg !188
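For completeness, when the closure legitimately captures differentiable arrays, the pattern described in the FAQ linked by the error message is to give the callable itself a shadow via Duplicated. A minimal sketch (assuming Enzyme.make_zero and function annotations; not a verified fix for this exact failure):

```julia
using Enzyme

differential_vars = [true, true, false]
f(du, u, p, t) = u .+ du .- p .* t

# Build the closure in a local scope so du/u/p/t are captured as fields
# (matching what happens inside a package) rather than global references.
nleq = let du = [-0.04, 0.04, 0.0], u = [1.0, 0.0, 0.0], p = [0.04, 3e7, 1e4], t = 0.0
    x -> begin
        du_ = ifelse.(differential_vars, x, du)
        u_ = ifelse.(differential_vars, u, x)
        f(du_, u_, p, t)
    end
end

x = [1.0, 0.0, 0.0]
dx = [1.0, 0.0, 0.0]
# Shadow for the captured arrays, so Enzyme may write tangent data into
# them instead of having to prove the closure read-only.
dnleq = Enzyme.make_zero(nleq)
Enzyme.autodiff(Forward, Duplicated(nleq, dnleq), Duplicated, Duplicated(x, dx))
```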