DifferentiationInterface.jl
Reactant prototype
First trial for #265
@wsmoses I ran into a weird world-age error in the tests, any clue?
Codecov Report
Attention: Patch coverage is 0% with 34 lines in your changes missing coverage. Please review.
Project coverage is 5.06%. Comparing base (c47e26c) to head (a5260d8).
| Files with missing lines | Patch % | Lines |
|---|---|---|
| .../ext/DifferentiationInterfaceReactantExt/onearg.jl | 0.00% | 31 Missing :warning: |
| ...ReactantExt/DifferentiationInterfaceReactantExt.jl | 0.00% | 3 Missing :warning: |
:exclamation: The number of reports uploaded differs between BASE (c47e26c) and HEAD (a5260d8): HEAD has 98 fewer uploads than BASE.

| Flag | BASE (c47e26c) | HEAD (a5260d8) |
|---|---|---|
| DIT | 20 | 1 |
| DI | 80 | 1 |
Additional details and impacted files
```diff
@@            Coverage Diff            @@
##             main    #325        +/- ##
=========================================
- Coverage   98.57%   5.06%    -93.51%
=========================================
  Files         107      93        -14
  Lines        4620    4465       -155
=========================================
- Hits         4554     226      -4328
- Misses         66    4239      +4173
```
| Flag | Coverage Δ |
|---|---|
| DI | 7.40% <0.00%> (-91.27%) :arrow_down: |
| DIT | 0.13% <ø> (-98.22%) :arrow_down: |
Flags with carried forward coverage won't be shown.
Hm, that seems to imply that you're not using `Reactant.compile`
You can look at the code, but I really think I am
https://github.com/gdalle/DifferentiationInterface.jl/blob/ed381bcdb5c4eb2a83239a9cbaf3044a1accdf56/DifferentiationInterface/ext/DifferentiationInterfaceReactantExt/DifferentiationInterfaceReactantExt.jl#L8
https://github.com/gdalle/DifferentiationInterface.jl/blob/ed381bcdb5c4eb2a83239a9cbaf3044a1accdf56/DifferentiationInterface/ext/DifferentiationInterfaceReactantExt/onearg.jl#L1-L17
I get you, but the error log clearly indicates it wasn't. If it were, there should be some sort of Cassette compile step in the logs.
```
Closest candidates are:
(::Reactant.var"#109#110")(::Any) (method too new to be called from this world context.)
@ Reactant ~/.julia/packages/Reactant/DFtbF/src/Reactant.jl:850
Stacktrace:
[1] macro expansion
@ ~/.julia/packages/Enzyme/G8o86/src/utils.jl:0 [inlined]
[2] codegen_world_age(ft::Type{Reactant.var"#109#110"}, tt::Type{Tuple{Reactant.ConcreteRArray{Float64, (3,), 1}}})
@ Enzyme ~/.julia/packages/Enzyme/G8o86/src/utils.jl:168
[3] autodiff
@ ~/.julia/packages/Enzyme/G8o86/src/Enzyme.jl:242 [inlined]
[4] autodiff
@ ~/.julia/packages/Enzyme/G8o86/src/Enzyme.jl:321 [inlined]
[5] gradient(rm::EnzymeCore.ReverseMode{false, EnzymeCore.FFIABI, false}, f::Reactant.var"#109#110", x::Reactant.ConcreteRArray{Float64, (3,), 1})
@ Enzyme ~/.julia/packages/Enzyme/G8o86/src/Enzyme.jl:1005
[6] gradient(f::Function, backend::AutoEnzyme{Nothing}, x::Reactant.ConcreteRArray{Float64, (3,), 1}, ::DifferentiationInterface.NoGradientExtras)
@ DifferentiationInterfaceEnzymeExt ~/work/DifferentiationInterface.jl/DifferentiationInterface.jl/DifferentiationInterface/ext/DifferentiationInterfaceEnzymeExt/reverse_onearg.jl:124
[7] gradient(f::Function, rebackend::DifferentiationInterfaceReactantExt.ReactantBackend{AutoEnzyme{Nothing}}, x::Vector{Float64}, extras::DifferentiationInterfaceReactantExt.ReactantGradientExtras{Reactant.var"#109#110", DifferentiationInterface.NoGradientExtras})
@ DifferentiationInterfaceReactantExt ~/work/DifferentiationInterface.jl/DifferentiationInterface.jl/DifferentiationInterface/ext/DifferentiationInterfaceReactantExt/onearg.jl:16
[8] gradient(f::typeof(sum), backend::DifferentiationInterfaceReactantExt.ReactantBackend{AutoEnzyme{Nothing}}, x::Vector{Float64})
@ DifferentiationInterface ~/work/DifferentiationInterface.jl/DifferentiationInterface.jl/DifferentiationInterface/src/first_order/gradient.jl:74
[9] macro expansion
@ /opt/hostedtoolcache/julia/1.10.4/x64/share/julia/stdlib/v1.10/Test/src/Test.jl:669 [inlined]
[10] top-level scope
@ ~/work/DifferentiationInterface.jl/DifferentiationInterface.jl/DifferentiationInterface/test/Double/Enzyme-Reactant/test.jl:508
```
Are you trying to autodiff a Reactant-compiled function by chance? Reactant needs to be on the outside of all the gradient calls/etc?

> Are you trying to autodiff a Reactant-compiled function by chance?

Yes, that's what the code above shows. I'm doing

```julia
DI.gradient(f_compiled, backend, x_reactant)
```

> Reactant needs to be on the outside of all the gradient calls/etc?

I assume the question mark at the end is unintended? And I should be doing the following instead?

```julia
compile(x -> DI.gradient(f, backend, x))
```

Yeah, sorry, ignore the question mark. Indeed, Reactant needs to compile the outermost function [e.g. compile the gradient call].
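Editor's note: to make the recommended "Reactant on the outside" pattern concrete, here is a hedged sketch. The `Reactant.compile(f, args)` signature, the `ConcreteRArray` constructor usage, and the `AutoEnzyme` backend name are assumptions based on this thread, not a verified current API:

```julia
using Reactant, Enzyme, ADTypes
import DifferentiationInterface as DI

f(x) = sum(abs2, x)                       # plain Julia function to differentiate
backend = ADTypes.AutoEnzyme()
x_reactant = Reactant.ConcreteRArray(rand(3))

# Wrong: compiling f first and then differentiating the compiled object
# puts Reactant on the inside, which triggers the errors shown above.
# f_compiled = Reactant.compile(f, (x_reactant,))
# DI.gradient(f_compiled, backend, x_reactant)

# Right: Reactant compiles the outermost call, i.e. the gradient itself.
grad = Reactant.compile(x -> DI.gradient(f, backend, x), (x_reactant,))
grad(x_reactant)
```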
Still getting a world age error, even though now I'm compiling the gradient closure
@gdalle hm the log still thinks that you're trying to pass a compiled fn into autodiff rather than the other way round:
```
MethodError: no method matching (::Reactant.var"#109#110")(::Reactant.ConcreteRArray{Float64, (3,), 1})
The applicable method may be too new: running in world age 31490, while current world is 31491.
Closest candidates are:
(::Reactant.var"#109#110")(::Any) (method too new to be called from this world context.)
@ Reactant ~/.julia/packages/Reactant/DFtbF/src/Reactant.jl:850
Stacktrace:
[1] gradient(f::Function, rebackend::DifferentiationInterfaceReactantExt.ReactantBackend{AutoEnzyme{Nothing}}, x::Vector{Float64}, extras::DifferentiationInterfaceReactantExt.ReactantGradientExtras{Reactant.var"#109#110"})
@ DifferentiationInterfaceReactantExt ~/work/DifferentiationInterface.jl/DifferentiationInterface.jl/DifferentiationInterface/ext/DifferentiationInterfaceReactantExt/onearg.jl:16
[2] gradient(f::typeof(sum), backend::DifferentiationInterfaceReactantExt.ReactantBackend{AutoEnzyme{Nothing}}, x::Vector{Float64})
@ DifferentiationInterface ~/work/DifferentiationInterface.jl/DifferentiationInterface.jl/DifferentiationInterface/src/first_order/gradient.jl:74
[3] macro expansion
@ /opt/hostedtoolcache/julia/1.10.4/x64/share/julia/stdlib/v1.10/Test/src/Test.jl:669 [inlined]
[4] top-level scope
@ ~/work/DifferentiationInterface.jl/DifferentiationInterface.jl/DifferentiationInterface/test/Double/Enzyme-Reactant/test.jl:508
```
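Editor's note: the "method too new to be called from this world context" failure is Julia's generic world-age error, reproducible without Reactant or Enzyme. A minimal sketch of the failure mode and the standard `Base.invokelatest` workaround:

```julia
# A method defined at runtime (here via eval) lives in a newer "world"
# than the already-compiled caller, so calling it directly fails.
function call_generated(expr)
    f = eval(expr)   # defines a new anonymous function in a newer world
    return f(2.0)    # MethodError: method too new for this world context
end

# The standard workaround is to force dispatch in the latest world:
function call_generated_fixed(expr)
    f = eval(expr)
    return Base.invokelatest(f, 2.0)
end

call_generated_fixed(:(x -> x^2))  # returns 4.0
```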
The entire code is here (gradient is automatically preceded by preparation) and I really don't see where I'm doing that:
https://github.com/gdalle/DifferentiationInterface.jl/blob/gd/reactant/DifferentiationInterface/ext/DifferentiationInterfaceReactantExt/onearg.jl
@gdalle Okay, I just released a Reactant bump which fixes this.
New kind of error:
`conversion to pointer not defined for Reactant.TracedRArray{Float64, (6,), 1}`
huh weird, open an issue with an MWE?
Tests are passing locally :partying_face: Now onto implementing more operators!