Enzyme.jl
Julia 1.11 Unsupported
I wanted to use split reverse mode in order to compute pullbacks with array outputs. The only doc I found is
so I tried it, but it fails on Julia 1.11:
julia> using Enzyme
julia> A = [2.2]; ∂A = zero(A)
1-element Vector{Float64}:
0.0
julia> v = 3.3
3.3
julia> function f(A, v)
           res = A[1] * v
           A[1] = 0
           res
       end
f (generic function with 1 method)
julia> forward, reverse = autodiff_thunk(ReverseSplitWithPrimal, Const{typeof(f)}, Active, Duplicated{typeof(A)}, Active{typeof(v)})
ERROR: MethodError: no method matching get_inference_world(::Enzyme.Compiler.Interpreter.EnzymeInterpreter)
The function `get_inference_world` exists, but no method is defined for this combination of argument types.
Closest candidates are:
get_inference_world(::REPL.REPLCompletions.REPLInterpreter)
@ REPL ~/.julia/juliaup/julia-1.11.0-alpha2+0.x64.linux.gnu/share/julia/stdlib/v1.11/REPL/src/REPLCompletions.jl:550
get_inference_world(::Core.Compiler.NativeInterpreter)
@ Core compiler/types.jl:402
Stacktrace:
⋮ internal @ Core.Compiler, Enzyme.Compiler, GPUCompiler, Core, Unknown
[11] autodiff_thunk(::EnzymeCore.ReverseModeSplit{true, true, 0, true, FFIABI}, ::Type{Const{typeof(f)}}, ::Type{Active}, ::Type, ::Vararg{Type})
@ Enzyme ~/.julia/packages/Enzyme/l4FS0/src/Enzyme.jl:524
Use `err` to retrieve the full stack trace.
julia> err
1-element ExceptionStack:
MethodError: no method matching get_inference_world(::Enzyme.Compiler.Interpreter.EnzymeInterpreter)
The function `get_inference_world` exists, but no method is defined for this combination of argument types.
Closest candidates are:
get_inference_world(::REPL.REPLCompletions.REPLInterpreter)
@ REPL ~/.julia/juliaup/julia-1.11.0-alpha2+0.x64.linux.gnu/share/julia/stdlib/v1.11/REPL/src/REPLCompletions.jl:550
get_inference_world(::Core.Compiler.NativeInterpreter)
@ Core compiler/types.jl:402
Stacktrace:
[1] Core.Compiler.InferenceState(result::Core.Compiler.InferenceResult, cache_mode::UInt8, interp::Enzyme.Compiler.Interpreter.EnzymeInterpreter)
@ Core.Compiler ./compiler/inferencestate.jl:493
[2] Core.Compiler.InferenceState(result::Core.Compiler.InferenceResult, cache_mode::Symbol, interp::Enzyme.Compiler.Interpreter.EnzymeInterpreter)
@ Core.Compiler ./compiler/inferencestate.jl:499
[3] typeinf(interp::Enzyme.Compiler.Interpreter.EnzymeInterpreter, result::Core.Compiler.InferenceResult, cache_mode::Symbol)
@ Core.Compiler ./compiler/typeinfer.jl:9
[4] typeinf_type(interp::Enzyme.Compiler.Interpreter.EnzymeInterpreter, mi::Core.MethodInstance)
@ Core.Compiler ./compiler/typeinfer.jl:1072
[5] typeinf_type(interp::Enzyme.Compiler.Interpreter.EnzymeInterpreter, method::Method, atype::Any, sparams::Core.SimpleVector)
@ Core.Compiler ./compiler/typeinfer.jl:1059
[6] (::Enzyme.Compiler.var"#532#533"{…})(ctx::LLVM.Context)
@ Enzyme.Compiler ~/.julia/packages/Enzyme/l4FS0/src/compiler.jl:5495
[7] JuliaContext(f::Enzyme.Compiler.var"#532#533"{…})
@ GPUCompiler ~/.julia/packages/GPUCompiler/U36Ed/src/driver.jl:47
[8] #s1883#531
@ ~/.julia/packages/Enzyme/l4FS0/src/compiler.jl:5482 [inlined]
[9]
@ Enzyme.Compiler ./none:0
[10] (::Core.GeneratedFunctionStub)(::UInt64, ::LineNumberNode, ::Any, ::Vararg{Any})
@ Core ./boot.jl:705
[11] autodiff_thunk(::EnzymeCore.ReverseModeSplit{true, true, 0, true, FFIABI}, ::Type{Const{typeof(f)}}, ::Type{Active}, ::Type, ::Vararg{Type})
@ Enzyme ~/.julia/packages/Enzyme/l4FS0/src/Enzyme.jl:524
[12] top-level scope
@ REPL[23]:1
Some type information was truncated. Use `show(err)` to see complete types.
(docs) pkg> st
Status `~/Work/GitHub/Julia/DifferentiationInterface.jl/docs/Project.toml`
[47edcb42] ADTypes v0.2.7
[d360d2e6] ChainRulesCore v1.23.0
[0ca39b1e] Chairmarks v1.2.0
[a93c6f00] DataFrames v1.6.1
[163ba53b] DiffResults v1.1.0
[a0c0ee7d] DifferentiationInterface v0.1.0 `..`
[9f5e2b26] Diffractor v0.2.5
[e30172f5] Documenter v1.3.0
[a078cd44] DocumenterMermaid v0.1.1
[7da242da] Enzyme v0.11.20
[eb9bf01b] FastDifferentiation v0.3.6
[1a297f60] FillArrays v1.9.3
[6a86dc24] FiniteDiff v2.23.0
[26cc04aa] FiniteDifferences v0.12.31
[f6369f11] ForwardDiff v0.10.36
[c3a54625] JET v0.8.29
[98d1487c] PolyesterForwardDiff v0.1.1
[37e2e3b7] ReverseDiff v1.15.1
[9f7883ad] Tracker v0.2.33
[e88e6eb3] Zygote v0.6.69
[d6f4376e] Markdown v1.11.0
[9a3f8284] Random v1.11.0
[8dfed614] Test v1.11.0
julia> versioninfo()
Julia Version 1.11.0-alpha2
Commit 9dfd28ab751 (2024-03-18 20:35 UTC)
Build Info:
Official https://julialang.org/ release
Platform Info:
OS: Linux (x86_64-linux-gnu)
CPU: 12 × Intel(R) Core(TM) i7-8850H CPU @ 2.60GHz
WORD_SIZE: 64
LLVM: libLLVM-16.0.6 (ORCJIT, skylake)
Threads: 1 default, 0 interactive, 1 GC (on 12 virtual cores)
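For reference, on a Julia version where the thunk does compile (e.g. 1.10), the rest of the split-mode sequence from the `autodiff_thunk` docstring goes roughly as follows. Treat this as a sketch: the exact return-value shapes are my reading of the docstring, not output I can currently reproduce on 1.11.
using Enzyme

A = [2.2]; ∂A = zero(A)
v = 3.3

function f(A, v)
    res = A[1] * v
    A[1] = 0
    res
end

forward, reverse = autodiff_thunk(ReverseSplitWithPrimal,
    Const{typeof(f)}, Active, Duplicated{typeof(A)}, Active{typeof(v)})

# Augmented forward pass: runs the primal and records a tape of the values
# (here the original A[1]) needed for the reverse sweep.
tape, result, shadow_result = forward(Const(f), Duplicated(A, ∂A), Active(v))

# Reverse pass: seed the scalar output adjoint with 1.0. Adjoints of Active
# arguments are returned; Duplicated adjoints accumulate in place into ∂A.
_, ∂v = reverse(Const(f), Duplicated(A, ∂A), Active(v), 1.0, tape)[1]

# Expected (per the docstring): result == 7.26, ∂v == 2.2, ∂A == [3.3].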
We need an update for the new GPUCompiler version.
I'm getting the following error here
ERROR: LoadError: UndefVarError: `PassBuilder` not defined in `Enzyme.Compiler`
Stacktrace:
[1] macro expansion
@ ~/.julia/packages/LLVM/5DlHM/src/base.jl:96 [inlined]
[2] (::Enzyme.Compiler.var"#prop_julia_addr#28416"{LLVM.TargetMachine})(f::LLVM.Function)
@ Enzyme.Compiler ~/.julia/packages/Enzyme/Pljwm/src/compiler/optimize.jl:75
[3] function_pass_callback(ptr::Ptr{Nothing}, data::Ptr{Nothing})
@ LLVM ~/.julia/packages/LLVM/5DlHM/src/pass.jl:49
[4] LLVMRunPassManager
@ ~/.julia/packages/LLVM/5DlHM/lib/16/libLLVM.jl:3351 [inlined]
[5] run!
@ ~/.julia/packages/LLVM/5DlHM/src/passmanager.jl:39 [inlined]
[6] (::Enzyme.Compiler.var"#28512#28513"{LLVM.Module, LLVM.TargetMachine})(pm::LLVM.ModulePassManager)
@ Enzyme.Compiler ~/.julia/packages/Enzyme/Pljwm/src/compiler/optimize.jl:2029
[7] LLVM.ModulePassManager(::Enzyme.Compiler.var"#28512#28513"{LLVM.Module, LLVM.TargetMachine}; kwargs::@Kwargs{})
@ LLVM ~/.julia/packages/LLVM/5DlHM/src/passmanager.jl:33
[8] ModulePassManager
@ ~/.julia/packages/LLVM/5DlHM/src/passmanager.jl:30 [inlined]
[9] optimize!
@ ~/.julia/packages/Enzyme/Pljwm/src/compiler/optimize.jl:1951 [inlined]
[10] codegen(output::Symbol, job::GPUCompiler.CompilerJob{Enzyme.Compiler.EnzymeTarget, Enzyme.Compiler.EnzymeCompilerParams}; libraries::Bool, deferred_codegen::Bool, optimize::Bool, toplevel::Bool, strip::Bool, validate::Bool, only_entry::Bool, parent_job::Nothing)
@ Enzyme.Compiler ~/.julia/packages/Enzyme/Pljwm/src/compiler.jl:5787
[11] codegen
@ ~/.julia/packages/Enzyme/Pljwm/src/compiler.jl:5194 [inlined]
[12] _thunk(job::GPUCompiler.CompilerJob{Enzyme.Compiler.EnzymeTarget, Enzyme.Compiler.EnzymeCompilerParams}, postopt::Bool)
@ Enzyme.Compiler ~/.julia/packages/Enzyme/Pljwm/src/compiler.jl:6682
[13] _thunk
@ ~/.julia/packages/Enzyme/Pljwm/src/compiler.jl:6682 [inlined]
[14] cached_compilation
@ ~/.julia/packages/Enzyme/Pljwm/src/compiler.jl:6720 [inlined]
[15] (::Enzyme.Compiler.var"#28633#28634"{Active, FFIABI, Const{typeof(loss_function)}, Enzyme.API.DEM_ReverseModeCombined, (false, false, false, false, false, false), true, false, Tuple{Const{Lux.Chain{@NamedTuple{layer_1::Lux.Dense{true, typeof(tanh_fast), typeof(glorot_uniform), typeof(WeightInitializers.zeros32)}, layer_2::Lux.Dense{true, typeof(identity), typeof(glorot_uniform), typeof(WeightInitializers.zeros32)}, layer_3::WrappedFunction{:direct_call, typeof(softmax)}}, Nothing}}, Const{Matrix{Float32}}, Const{OneHotMatrix{UInt32, Vector{UInt32}}}, Duplicated{@NamedTuple{layer_1::@NamedTuple{weight::Matrix{Float32}, bias::Matrix{Float32}}, layer_2::@NamedTuple{weight::Matrix{Float32}, bias::Matrix{Float32}}, layer_3::@NamedTuple{}}}, Const{@NamedTuple{layer_1::@NamedTuple{}, layer_2::@NamedTuple{}, layer_3::@NamedTuple{}}}}, 0x00000000000068d4, 1, Core.MethodInstance})(ctx::LLVM.Context)
@ Enzyme.Compiler ~/.julia/packages/Enzyme/Pljwm/src/compiler.jl:6795
[16] JuliaContext(f::Enzyme.Compiler.var"#28633#28634"{Active, FFIABI, Const{typeof(loss_function)}, Enzyme.API.DEM_ReverseModeCombined, (false, false, false, false, false, false), true, false, Tuple{Const{Lux.Chain{@NamedTuple{layer_1::Lux.Dense{true, typeof(tanh_fast), typeof(glorot_uniform), typeof(WeightInitializers.zeros32)}, layer_2::Lux.Dense{true, typeof(identity), typeof(glorot_uniform), typeof(WeightInitializers.zeros32)}, layer_3::WrappedFunction{:direct_call, typeof(softmax)}}, Nothing}}, Const{Matrix{Float32}}, Const{OneHotMatrix{UInt32, Vector{UInt32}}}, Duplicated{@NamedTuple{layer_1::@NamedTuple{weight::Matrix{Float32}, bias::Matrix{Float32}}, layer_2::@NamedTuple{weight::Matrix{Float32}, bias::Matrix{Float32}}, layer_3::@NamedTuple{}}}, Const{@NamedTuple{layer_1::@NamedTuple{}, layer_2::@NamedTuple{}, layer_3::@NamedTuple{}}}}, 0x00000000000068d4, 1, Core.MethodInstance}; kwargs::@Kwargs{})
@ GPUCompiler ~/.julia/packages/GPUCompiler/Y4hSX/src/driver.jl:52
[17] JuliaContext
@ ~/.julia/packages/GPUCompiler/Y4hSX/src/driver.jl:42 [inlined]
[18] thunkbase(mi::Core.MethodInstance, ::Val{0x00000000000068d4}, ::Type{Const{typeof(loss_function)}}, ::Type{Active}, tt::Type{Tuple{Const{Lux.Chain{@NamedTuple{layer_1::Lux.Dense{true, typeof(tanh_fast), typeof(glorot_uniform), typeof(WeightInitializers.zeros32)}, layer_2::Lux.Dense{true, typeof(identity), typeof(glorot_uniform), typeof(WeightInitializers.zeros32)}, layer_3::WrappedFunction{:direct_call, typeof(softmax)}}, Nothing}}, Const{Matrix{Float32}}, Const{OneHotMatrix{UInt32, Vector{UInt32}}}, Duplicated{@NamedTuple{layer_1::@NamedTuple{weight::Matrix{Float32}, bias::Matrix{Float32}}, layer_2::@NamedTuple{weight::Matrix{Float32}, bias::Matrix{Float32}}, layer_3::@NamedTuple{}}}, Const{@NamedTuple{layer_1::@NamedTuple{}, layer_2::@NamedTuple{}, layer_3::@NamedTuple{}}}}}, ::Val{Enzyme.API.DEM_ReverseModeCombined}, ::Val{1}, ::Val{(false, false, false, false, false, false)}, ::Val{true}, ::Val{false}, ::Type{FFIABI})
@ Enzyme.Compiler ~/.julia/packages/Enzyme/Pljwm/src/compiler.jl:6740
[19] #s2021#28635
@ ~/.julia/packages/Enzyme/Pljwm/src/compiler.jl:6826 [inlined]
[20] var"#s2021#28635"(FA::Any, A::Any, TT::Any, Mode::Any, ModifiedBetween::Any, width::Any, ReturnPrimal::Any, ShadowInit::Any, World::Any, ABI::Any, ::Any, ::Any, ::Any, ::Any, tt::Any, ::Any, ::Any, ::Any, ::Any, ::Any, ::Any)
@ Enzyme.Compiler ./none:0
[21] (::Core.GeneratedFunctionStub)(::UInt64, ::LineNumberNode, ::Any, ::Vararg{Any})
@ Core ./boot.jl:709
[22] autodiff
@ ~/.julia/packages/Enzyme/Pljwm/src/Enzyme.jl:309 [inlined]
[23] autodiff
@ ~/.julia/packages/Enzyme/Pljwm/src/Enzyme.jl:326 [inlined]
[24] gradient_loss_function(model::Lux.Chain{@NamedTuple{layer_1::Lux.Dense{true, typeof(tanh_fast), typeof(glorot_uniform), typeof(WeightInitializers.zeros32)}, layer_2::Lux.Dense{true, typeof(identity), typeof(glorot_uniform), typeof(WeightInitializers.zeros32)}, layer_3::WrappedFunction{:direct_call, typeof(softmax)}}, Nothing}, x::Matrix{Float32}, y::OneHotMatrix{UInt32, Vector{UInt32}}, ps::@NamedTuple{layer_1::@NamedTuple{weight::Matrix{Float32}, bias::Matrix{Float32}}, layer_2::@NamedTuple{weight::Matrix{Float32}, bias::Matrix{Float32}}, layer_3::@NamedTuple{}}, st::@NamedTuple{layer_1::@NamedTuple{}, layer_2::@NamedTuple{}, layer_3::@NamedTuple{}})
@ Main ~/work/Reactant.jl/Reactant.jl/test/nn_lux.jl:65
[25] top-level scope
@ ~/work/Reactant.jl/Reactant.jl/test/nn_lux.jl:78
[26] include(fname::String)
@ Main ./sysimg.jl:38
[27] top-level scope
@ ~/work/Reactant.jl/Reactant.jl/test/runtests.jl:49
[28] include(fname::String)
@ Main ./sysimg.jl:38
[29] top-level scope
@ none:6
in expression starting at /home/runner/work/Reactant.jl/Reactant.jl/test/nn_lux.jl:78
in expression starting at /home/runner/work/Reactant.jl/Reactant.jl/test/runtests.jl:49
Package Reactant errored during testing
Is this on main?
I am getting this on main
What is your Manifest? Did you re-resolve? Most likely this means you ended up with an old version of GPUCompiler.
I think Enzyme has an old GPUCompiler bound in its compat, which we should likely remove.
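For reference, a re-resolve plus a manifest-level status check would look roughly like the sketch below (not specific to any particular environment):
using Pkg
# Re-resolve and check which GPUCompiler version actually ended up in the Manifest;
# GPUCompiler is only an indirect dependency, so it needs the manifest mode to show up.
Pkg.update()
Pkg.status(["Enzyme", "GPUCompiler", "LLVM"]; mode = Pkg.PKGMODE_MANIFEST)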
With Enzyme#main, Enzyme_jll#main, and GPUCompiler#master, I am still getting a failure on 1.11:
using Enzyme
x = rand(Float32, 32);
Enzyme.gradient(Reverse, sum, x)
Error
Closest candidates are:
cpu_features!(::LLVM.Module)
@ GPUCompiler ~/.julia/packages/GPUCompiler/89rev/src/optim.jl:290
Stacktrace:
[1] (::Enzyme.Compiler.var"#28298#28299"{LLVM.Module, LLVM.TargetMachine})(pm::LLVM.ModulePassManager)
@ Enzyme.Compiler ~/.julia/packages/Enzyme/YDcYf/src/compiler/optimize.jl:1965
[2] LLVM.ModulePassManager(::Enzyme.Compiler.var"#28298#28299"{LLVM.Module, LLVM.TargetMachine}; kwargs::@Kwargs{})
@ LLVM ~/.julia/packages/LLVM/5DlHM/src/passmanager.jl:33
[3] ModulePassManager
@ ~/.julia/packages/LLVM/5DlHM/src/passmanager.jl:30
[4] optimize!(mod::LLVM.Module, tm::LLVM.TargetMachine)
@ Enzyme.Compiler ~/.julia/packages/Enzyme/YDcYf/src/compiler/optimize.jl:1951
[5] codegen(output::Symbol, job::GPUCompiler.CompilerJob{Enzyme.Compiler.EnzymeTarget, Enzyme.Compiler.EnzymeCompilerParams}; libraries::Bool, deferred_codegen::Bool, optimize::Bool, toplevel::Bool, strip::Bool, validate::Bool, only_entry::Bool, parent_job::Nothing)
@ Enzyme.Compiler ~/.julia/packages/Enzyme/YDcYf/src/compiler.jl:5807
[6] _thunk(job::GPUCompiler.CompilerJob{Enzyme.Compiler.EnzymeTarget, Enzyme.Compiler.EnzymeCompilerParams}, postopt::Bool)
@ Enzyme.Compiler ~/.julia/packages/Enzyme/YDcYf/src/compiler.jl:6710
[7] cached_compilation
@ ~/.julia/packages/Enzyme/YDcYf/src/compiler.jl:6748 [inlined]
[8] thunkbase(ctx::LLVM.Context, mi::Core.MethodInstance, ::Val{0x0000000000006819}, ::Type{Const{…}}, ::Type{Active}, tt::Type{Tuple{…}}, ::Val{Enzyme.API.DEM_ReverseModeCombined}, ::Val{1}, ::Val{(false, false)}, ::Val{false}, ::Val{false}, ::Type{FFIABI})
@ Enzyme.Compiler ~/.julia/packages/Enzyme/YDcYf/src/compiler.jl:6821
[9] #s2021#28415
@ ~/.julia/packages/Enzyme/YDcYf/src/compiler.jl:6873 [inlined]
[10] var"#s2021#28415"(FA::Any, A::Any, TT::Any, Mode::Any, ModifiedBetween::Any, width::Any, ReturnPrimal::Any, ShadowInit::Any, World::Any, ABI::Any, ::Any, ::Any, ::Any, ::Any, tt::Any, ::Any, ::Any, ::Any, ::Any, ::Any, ::Any)
@ Enzyme.Compiler ./none:0
[11] (::Core.GeneratedFunctionStub)(::UInt64, ::LineNumberNode, ::Any, ::Vararg{Any})
@ Core ./boot.jl:709
[12] autodiff
@ ~/.julia/packages/Enzyme/YDcYf/src/Enzyme.jl:309 [inlined]
[13] autodiff
@ ~/.julia/packages/Enzyme/YDcYf/src/Enzyme.jl:326 [inlined]
[14] gradient(rm::ReverseMode{false, FFIABI, false}, f::typeof(sum), x::Vector{Float32})
@ Enzyme ~/.julia/packages/Enzyme/YDcYf/src/Enzyme.jl:1027
[15] top-level scope
@ REPL[3]:1
[16] top-level scope
@ none:1
Some type information was truncated. Use `show(err)` to see complete types.
Manifest.toml
# This file is machine-generated - editing it directly is not advised
julia_version = "1.11.0-rc1"
manifest_format = "2.0"
project_hash = "9d9865fdd982a60cc61fa85e3720d1f44ecb386c"
[[deps.ArgTools]]
uuid = "0dad84c5-d112-42e6-8d28-ef12dabb789f"
version = "1.1.2"
[[deps.Artifacts]]
uuid = "56f22d72-fd6d-98f1-02f0-08ddc0907c33"
version = "1.11.0"
[[deps.Base64]]
uuid = "2a0f44e3-6c83-55bd-87e4-b1978d98bd5f"
version = "1.11.0"
[[deps.CEnum]]
git-tree-sha1 = "389ad5c84de1ae7cf0e28e381131c98ea87d54fc"
uuid = "fa961155-64e5-5f13-b03f-caf6b980ea82"
version = "0.5.0"
[[deps.CompilerSupportLibraries_jll]]
deps = ["Artifacts", "Libdl"]
uuid = "e66e0078-7015-5450-92f7-15fbd957f2ae"
version = "1.1.1+0"
[[deps.Dates]]
deps = ["Printf"]
uuid = "ade2ca70-3891-5945-98fb-dc099432e06a"
version = "1.11.0"
[[deps.Downloads]]
deps = ["ArgTools", "FileWatching", "LibCURL", "NetworkOptions"]
uuid = "f43a241f-c20a-4ad4-852c-f6b1247861c6"
version = "1.6.0"
[[deps.Enzyme]]
deps = ["CEnum", "EnzymeCore", "Enzyme_jll", "GPUCompiler", "LLVM", "Libdl", "LinearAlgebra", "ObjectFile", "Preferences", "Printf", "Random"]
git-tree-sha1 = "b11d3c1aa7166ef05331ee762e7e1108722af436"
repo-rev = "main"
repo-url = "https://github.com/EnzymeAD/Enzyme.jl.git"
uuid = "7da242da-08ed-463a-9acd-ee780be4f1d9"
version = "0.12.24"
[deps.Enzyme.extensions]
EnzymeChainRulesCoreExt = "ChainRulesCore"
EnzymeLogExpFunctionsExt = "LogExpFunctions"
EnzymeSpecialFunctionsExt = "SpecialFunctions"
EnzymeStaticArraysExt = "StaticArrays"
[deps.Enzyme.weakdeps]
ChainRulesCore = "d360d2e6-b24c-11e9-a2a3-2a2ae2dbcce4"
LogExpFunctions = "2ab3a3ac-af41-5b50-aa03-7779005ae688"
SpecialFunctions = "276daf66-3868-5448-9aa4-cd146d93841b"
StaticArrays = "90137ffa-7385-5640-81b9-e52037218182"
[[deps.EnzymeCore]]
git-tree-sha1 = "d445df66dd8761a4c27df950db89c6a3a0629fe7"
uuid = "f151be2c-9106-41f4-ab19-57ee4f262869"
version = "0.7.7"
[deps.EnzymeCore.extensions]
AdaptExt = "Adapt"
[deps.EnzymeCore.weakdeps]
Adapt = "79e6a3ab-5dfb-504d-930d-738a2a938a0e"
[[deps.Enzyme_jll]]
deps = ["Artifacts", "JLLWrappers", "LazyArtifacts", "Libdl", "TOML"]
git-tree-sha1 = "a509ac21f4f44df224bf7cb6901fa4236d015b5e"
repo-rev = "main"
repo-url = "https://github.com/JuliaBinaryWrappers/Enzyme_jll.jl.git"
uuid = "7cc45869-7501-5eee-bdea-0790c847d4ef"
version = "0.0.136+0"
[[deps.ExprTools]]
git-tree-sha1 = "27415f162e6028e81c72b82ef756bf321213b6ec"
uuid = "e2ba6199-217a-4e67-a87a-7c52f15ade04"
version = "0.1.10"
[[deps.FileWatching]]
uuid = "7b1f6079-737a-58dc-b8bc-7a2ca5c1b5ee"
version = "1.11.0"
[[deps.GPUCompiler]]
deps = ["ExprTools", "InteractiveUtils", "LLVM", "Libdl", "Logging", "PrecompileTools", "Preferences", "Scratch", "Serialization", "TOML", "TimerOutputs", "UUIDs"]
git-tree-sha1 = "36e1cbe62869fe2a95958d0d3fcd5dad47cac6fd"
repo-rev = "master"
repo-url = "https://github.com/JuliaGPU/GPUCompiler.jl.git"
uuid = "61eb1bfa-7361-4325-ad38-22787b887f55"
version = "0.26.7"
[[deps.InteractiveUtils]]
deps = ["Markdown"]
uuid = "b77e0a4c-d291-57a0-90e8-8db25a27a240"
version = "1.11.0"
[[deps.JLLWrappers]]
deps = ["Artifacts", "Preferences"]
git-tree-sha1 = "7e5d6779a1e09a36db2a7b6cff50942a0a7d0fca"
uuid = "692b3bcd-3c85-4b1f-b108-f13ce0eb3210"
version = "1.5.0"
[[deps.LLVM]]
deps = ["CEnum", "LLVMExtra_jll", "Libdl", "Preferences", "Printf", "Requires", "Unicode"]
git-tree-sha1 = "020abd49586480c1be84f57da0017b5d3db73f7c"
uuid = "929cbde3-209d-540e-8aea-75f648917ca0"
version = "8.0.0"
[deps.LLVM.extensions]
BFloat16sExt = "BFloat16s"
[deps.LLVM.weakdeps]
BFloat16s = "ab4f0b2a-ad5b-11e8-123f-65d77653426b"
[[deps.LLVMExtra_jll]]
deps = ["Artifacts", "JLLWrappers", "LazyArtifacts", "Libdl", "TOML"]
git-tree-sha1 = "c2636c264861edc6d305e6b4d528f09566d24c5e"
uuid = "dad2f222-ce93-54a1-a47d-0025e8a3acab"
version = "0.0.30+0"
[[deps.LazyArtifacts]]
deps = ["Artifacts", "Pkg"]
uuid = "4af54fe1-eca0-43a8-85a7-787d91b784e3"
version = "1.11.0"
[[deps.LibCURL]]
deps = ["LibCURL_jll", "MozillaCACerts_jll"]
uuid = "b27032c2-a3e7-50c8-80cd-2d36dbcbfd21"
version = "0.6.4"
[[deps.LibCURL_jll]]
deps = ["Artifacts", "LibSSH2_jll", "Libdl", "MbedTLS_jll", "Zlib_jll", "nghttp2_jll"]
uuid = "deac9b47-8bc7-5906-a0fe-35ac56dc84c0"
version = "8.6.0+0"
[[deps.LibGit2]]
deps = ["Base64", "LibGit2_jll", "NetworkOptions", "Printf", "SHA"]
uuid = "76f85450-5226-5b5a-8eaa-529ad045b433"
version = "1.11.0"
[[deps.LibGit2_jll]]
deps = ["Artifacts", "LibSSH2_jll", "Libdl", "MbedTLS_jll"]
uuid = "e37daf67-58a4-590a-8e99-b0245dd2ffc5"
version = "1.7.2+0"
[[deps.LibSSH2_jll]]
deps = ["Artifacts", "Libdl", "MbedTLS_jll"]
uuid = "29816b5a-b9ab-546f-933c-edad1886dfa8"
version = "1.11.0+1"
[[deps.Libdl]]
uuid = "8f399da3-3557-5675-b5ff-fb832c97cbdb"
version = "1.11.0"
[[deps.LinearAlgebra]]
deps = ["Libdl", "OpenBLAS_jll", "libblastrampoline_jll"]
uuid = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
version = "1.11.0"
[[deps.Logging]]
uuid = "56ddb016-857b-54e1-b83d-db4d58db5568"
version = "1.11.0"
[[deps.Markdown]]
deps = ["Base64"]
uuid = "d6f4376e-aef5-505a-96c1-9c027394607a"
version = "1.11.0"
[[deps.MbedTLS_jll]]
deps = ["Artifacts", "Libdl"]
uuid = "c8ffd9c3-330d-5841-b78e-0817d7145fa1"
version = "2.28.6+0"
[[deps.MozillaCACerts_jll]]
uuid = "14a3606d-f60d-562e-9121-12d972cd8159"
version = "2023.12.12"
[[deps.NetworkOptions]]
uuid = "ca575930-c2e3-43a9-ace4-1e988b2c1908"
version = "1.2.0"
[[deps.ObjectFile]]
deps = ["Reexport", "StructIO"]
git-tree-sha1 = "195e0a19842f678dd3473ceafbe9d82dfacc583c"
uuid = "d8793406-e978-5875-9003-1fc021f44a92"
version = "0.4.1"
[[deps.OpenBLAS_jll]]
deps = ["Artifacts", "CompilerSupportLibraries_jll", "Libdl"]
uuid = "4536629a-c528-5b80-bd46-f80d51c5b363"
version = "0.3.27+1"
[[deps.Pkg]]
deps = ["Artifacts", "Dates", "Downloads", "FileWatching", "LibGit2", "Libdl", "Logging", "Markdown", "Printf", "Random", "SHA", "TOML", "Tar", "UUIDs", "p7zip_jll"]
uuid = "44cfe95a-1eb2-52ea-b672-e2afdf69b78f"
version = "1.11.0"
[deps.Pkg.extensions]
REPLExt = "REPL"
[deps.Pkg.weakdeps]
REPL = "3fa0cd96-eef1-5676-8a61-b3b8758bbffb"
[[deps.PrecompileTools]]
deps = ["Preferences"]
git-tree-sha1 = "5aa36f7049a63a1528fe8f7c3f2113413ffd4e1f"
uuid = "aea7be01-6a6a-4083-8856-8a6e6704d82a"
version = "1.2.1"
[[deps.Preferences]]
deps = ["TOML"]
git-tree-sha1 = "9306f6085165d270f7e3db02af26a400d580f5c6"
uuid = "21216c6a-2e73-6563-6e65-726566657250"
version = "1.4.3"
[[deps.Printf]]
deps = ["Unicode"]
uuid = "de0858da-6303-5e67-8744-51eddeeeb8d7"
version = "1.11.0"
[[deps.Random]]
deps = ["SHA"]
uuid = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
version = "1.11.0"
[[deps.Reexport]]
git-tree-sha1 = "45e428421666073eab6f2da5c9d310d99bb12f9b"
uuid = "189a3867-3050-52da-a836-e630ba90ab69"
version = "1.2.2"
[[deps.Requires]]
deps = ["UUIDs"]
git-tree-sha1 = "838a3a4188e2ded87a4f9f184b4b0d78a1e91cb7"
uuid = "ae029012-a4dd-5104-9daa-d747884805df"
version = "1.3.0"
[[deps.SHA]]
uuid = "ea8e919c-243c-51af-8825-aaa63cd721ce"
version = "0.7.0"
[[deps.Scratch]]
deps = ["Dates"]
git-tree-sha1 = "3bac05bc7e74a75fd9cba4295cde4045d9fe2386"
uuid = "6c6a2e73-6563-6170-7368-637461726353"
version = "1.2.1"
[[deps.Serialization]]
uuid = "9e88b42a-f829-5b0c-bbe9-9e923198166b"
version = "1.11.0"
[[deps.StructIO]]
deps = ["Test"]
git-tree-sha1 = "010dc73c7146869c042b49adcdb6bf528c12e859"
uuid = "53d494c1-5632-5724-8f4c-31dff12d585f"
version = "0.3.0"
[[deps.TOML]]
deps = ["Dates"]
uuid = "fa267f1f-6049-4f14-aa54-33bafae1ed76"
version = "1.0.3"
[[deps.Tar]]
deps = ["ArgTools", "SHA"]
uuid = "a4e569a6-e804-4fa4-b0f3-eef7a1d5b13e"
version = "1.10.0"
[[deps.Test]]
deps = ["InteractiveUtils", "Logging", "Random", "Serialization"]
uuid = "8dfed614-e22c-5e08-85e1-65c5234f0b40"
version = "1.11.0"
[[deps.TimerOutputs]]
deps = ["ExprTools", "Printf"]
git-tree-sha1 = "5a13ae8a41237cff5ecf34f73eb1b8f42fff6531"
uuid = "a759f4b9-e2f1-59dc-863e-4aeb61b1ea8f"
version = "0.5.24"
[[deps.UUIDs]]
deps = ["Random", "SHA"]
uuid = "cf7118a7-6976-5b1a-9a39-7adc72f591a4"
version = "1.11.0"
[[deps.Unicode]]
uuid = "4ec0a83e-493e-50e2-b9ac-8f72acf5a8f5"
version = "1.11.0"
[[deps.Zlib_jll]]
deps = ["Libdl"]
uuid = "83775a58-1f1d-513f-b197-d71354ab007a"
version = "1.2.13+1"
[[deps.libblastrampoline_jll]]
deps = ["Artifacts", "Libdl"]
uuid = "8e850b90-86db-534c-a0d3-1478176c7d93"
version = "5.8.0+1"
[[deps.nghttp2_jll]]
deps = ["Artifacts", "Libdl"]
uuid = "8e850ede-7688-5339-a07c-302acd2aaf8d"
version = "1.59.0+0"
[[deps.p7zip_jll]]
deps = ["Artifacts", "Libdl"]
uuid = "3f19e933-33d8-53b3-aaab-bd5110c3b7a0"
version = "17.4.0+2"
Installing the latest release of GPUCompiler, I get a different error:
Error
ERROR: UndefVarError: `PassBuilder` not defined in `Enzyme.Compiler`
Suggestion: check for spelling errors or missing imports.
Stacktrace:
[1] macro expansion
@ ~/.julia/packages/LLVM/5DlHM/src/base.jl:96 [inlined]
[2] (::Enzyme.Compiler.var"#prop_julia_addr#28202"{LLVM.TargetMachine})(f::LLVM.Function)
@ Enzyme.Compiler ~/.julia/packages/Enzyme/YDcYf/src/compiler/optimize.jl:75
[3] function_pass_callback(ptr::Ptr{Nothing}, data::Ptr{Nothing})
@ LLVM ~/.julia/packages/LLVM/5DlHM/src/pass.jl:49
[4] LLVMRunPassManager
@ ~/.julia/packages/LLVM/5DlHM/lib/16/libLLVM.jl:3351 [inlined]
[5] run!
@ ~/.julia/packages/LLVM/5DlHM/src/passmanager.jl:39 [inlined]
[6] (::Enzyme.Compiler.var"#28298#28299"{LLVM.Module, LLVM.TargetMachine})(pm::LLVM.ModulePassManager)
@ Enzyme.Compiler ~/.julia/packages/Enzyme/YDcYf/src/compiler/optimize.jl:2029
[7] LLVM.ModulePassManager(::Enzyme.Compiler.var"#28298#28299"{LLVM.Module, LLVM.TargetMachine}; kwargs::@Kwargs{})
@ LLVM ~/.julia/packages/LLVM/5DlHM/src/passmanager.jl:33
[8] ModulePassManager
@ ~/.julia/packages/LLVM/5DlHM/src/passmanager.jl:30 [inlined]
[9] optimize!
@ ~/.julia/packages/Enzyme/YDcYf/src/compiler/optimize.jl:1951 [inlined]
[10] codegen(output::Symbol, job::GPUCompiler.CompilerJob{Enzyme.Compiler.EnzymeTarget, Enzyme.Compiler.EnzymeCompilerParams}; libraries::Bool, deferred_codegen::Bool, optimize::Bool, toplevel::Bool, strip::Bool, validate::Bool, only_entry::Bool, parent_job::Nothing)
@ Enzyme.Compiler ~/.julia/packages/Enzyme/YDcYf/src/compiler.jl:5807
[11] codegen
@ ~/.julia/packages/Enzyme/YDcYf/src/compiler.jl:5208 [inlined]
[12] _thunk(job::GPUCompiler.CompilerJob{Enzyme.Compiler.EnzymeTarget, Enzyme.Compiler.EnzymeCompilerParams}, postopt::Bool)
@ Enzyme.Compiler ~/.julia/packages/Enzyme/YDcYf/src/compiler.jl:6710
[13] _thunk
@ ~/.julia/packages/Enzyme/YDcYf/src/compiler.jl:6710 [inlined]
[14] cached_compilation
@ ~/.julia/packages/Enzyme/YDcYf/src/compiler.jl:6748 [inlined]
[15] thunkbase(ctx::LLVM.Context, mi::Core.MethodInstance, ::Val{0x0000000000006819}, ::Type{Const{…}}, ::Type{Active}, tt::Type{Tuple{…}}, ::Val{Enzyme.API.DEM_ReverseModeCombined}, ::Val{1}, ::Val{(false, false)}, ::Val{false}, ::Val{false}, ::Type{FFIABI})
@ Enzyme.Compiler ~/.julia/packages/Enzyme/YDcYf/src/compiler.jl:6821
[16] #s2021#28415
@ ~/.julia/packages/Enzyme/YDcYf/src/compiler.jl:6873 [inlined]
[17] var"#s2021#28415"(FA::Any, A::Any, TT::Any, Mode::Any, ModifiedBetween::Any, width::Any, ReturnPrimal::Any, ShadowInit::Any, World::Any, ABI::Any, ::Any, ::Any, ::Any, ::Any, tt::Any, ::Any, ::Any, ::Any, ::Any, ::Any, ::Any)
@ Enzyme.Compiler ./none:0
[18] (::Core.GeneratedFunctionStub)(::UInt64, ::LineNumberNode, ::Any, ::Vararg{Any})
@ Core ./boot.jl:709
[19] autodiff
@ ~/.julia/packages/Enzyme/YDcYf/src/Enzyme.jl:309 [inlined]
[20] autodiff
@ ~/.julia/packages/Enzyme/YDcYf/src/Enzyme.jl:326 [inlined]
[21] gradient(rm::ReverseMode{false, FFIABI, false}, f::typeof(sum), x::Vector{Float32})
@ Enzyme ~/.julia/packages/Enzyme/YDcYf/src/Enzyme.jl:1027
[22] top-level scope
@ REPL[3]:1
[23] top-level scope
@ none:1
Some type information was truncated. Use `show(err)` to see complete types.
That said, Enzyme definitely has an older GPUCompiler bound in its compat, which shouldn't be there. Currently, installing AMDGPU (which installs GPUCompiler 0.26.5) causes Enzyme to not precompile: https://buildkite.com/julialang/luxlib-dot-jl/builds/835#0190d6c1-141c-4f6b-ab1d-eec8c2e4f7bc/317-648
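A quick way to surface the bad resolve in such an environment is something like the sketch below (the version numbers are just the ones mentioned above; the resolver error should name the offending compat bound):
using Pkg
# Inspect what the resolver picked once AMDGPU and Enzyme share an environment.
Pkg.status(["Enzyme", "AMDGPU", "GPUCompiler"]; mode = Pkg.PKGMODE_MANIFEST)
# Explicitly requesting the newer GPUCompiler either fixes the downgrade or makes
# the resolver report which compat bound is holding it back at 0.26.5.
Pkg.add(name = "GPUCompiler", version = "0.26.7")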
Same here with the latest release candidate of Julia 1.11:
julia> using Enzyme #v0.12.26
julia> function gradByEnzyme(f, inVal)
           dp = zero(inVal)
           Enzyme.autodiff(Reverse, f, Active, Duplicated(inVal, dp))
           dp
       end
gradByEnzyme (generic function with 1 method)
julia> gradByEnzyme(x->sum(x .^ 2), [1., 2., 3.])
ERROR: UndefVarError: `PassBuilder` not defined in `Enzyme.Compiler`
Suggestion: check for spelling errors or missing imports.
Stacktrace:
[1] macro expansion
@ C:\Users\frank\.julia\packages\LLVM\5DlHM\src\base.jl:96 [inlined]
[2] (::Enzyme.Compiler.var"#prop_julia_addr#28202"{LLVM.TargetMachine})(f::LLVM.Function)
@ Enzyme.Compiler C:\Users\frank\.julia\packages\Enzyme\r8mFE\src\compiler\optimize.jl:75
[3] function_pass_callback(ptr::Ptr{Nothing}, data::Ptr{Nothing})
@ LLVM C:\Users\frank\.julia\packages\LLVM\5DlHM\src\pass.jl:49
[4] LLVMRunPassManager
@ C:\Users\frank\.julia\packages\LLVM\5DlHM\lib\16\libLLVM.jl:3351 [inlined]
[5] run!
@ C:\Users\frank\.julia\packages\LLVM\5DlHM\src\passmanager.jl:39 [inlined]
[6] (::Enzyme.Compiler.var"#28298#28299"{LLVM.Module, LLVM.TargetMachine})(pm::LLVM.ModulePassManager)
@ Enzyme.Compiler C:\Users\frank\.julia\packages\Enzyme\r8mFE\src\compiler\optimize.jl:2033
[7] LLVM.ModulePassManager(::Enzyme.Compiler.var"#28298#28299"{LLVM.Module, LLVM.TargetMachine}; kwargs::@Kwargs{})
@ LLVM C:\Users\frank\.julia\packages\LLVM\5DlHM\src\passmanager.jl:33
[8] ModulePassManager
@ C:\Users\frank\.julia\packages\LLVM\5DlHM\src\passmanager.jl:30 [inlined]
[9] optimize!(mod::LLVM.Module, tm::LLVM.TargetMachine)
@ Enzyme.Compiler C:\Users\frank\.julia\packages\Enzyme\r8mFE\src\compiler\optimize.jl:1955
[10] codegen(output::Symbol, job::GPUCompiler.CompilerJob{…}; libraries::Bool, deferred_codegen::Bool, optimize::Bool, toplevel::Bool, strip::Bool, validate::Bool, only_entry::Bool, parent_job::Nothing)
@ Enzyme.Compiler C:\Users\frank\.julia\packages\Enzyme\r8mFE\src\compiler.jl:5968
[11] codegen
@ C:\Users\frank\.julia\packages\Enzyme\r8mFE\src\compiler.jl:5371 [inlined]
[12] _thunk(job::GPUCompiler.CompilerJob{Enzyme.Compiler.EnzymeTarget, Enzyme.Compiler.EnzymeCompilerParams}, postopt::Bool)
@ Enzyme.Compiler C:\Users\frank\.julia\packages\Enzyme\r8mFE\src\compiler.jl:6871
[13] _thunk
@ C:\Users\frank\.julia\packages\Enzyme\r8mFE\src\compiler.jl:6871 [inlined]
[14] cached_compilation
@ C:\Users\frank\.julia\packages\Enzyme\r8mFE\src\compiler.jl:6909 [inlined]
[15] thunkbase(ctx::LLVM.Context, mi::Core.MethodInstance, ::Val{…}, ::Type{…}, ::Type{…}, tt::Type{…}, ::Val{…}, ::Val{…}, ::Val{…}, ::Val{…}, ::Val{…}, ::Type{…})
@ Enzyme.Compiler C:\Users\frank\.julia\packages\Enzyme\r8mFE\src\compiler.jl:6982
[16] #s2043#28415
@ C:\Users\frank\.julia\packages\Enzyme\r8mFE\src\compiler.jl:7034 [inlined]
[17]
@ Enzyme.Compiler .\none:0
[18] (::Core.GeneratedFunctionStub)(::UInt64, ::LineNumberNode, ::Any, ::Vararg{Any})
@ Core .\boot.jl:706
[19] autodiff(::ReverseMode{false, FFIABI, false}, f::Const{var"#1#2"}, ::Type{Active}, args::Duplicated{Vector{Float64}})
@ Enzyme C:\Users\frank\.julia\packages\Enzyme\r8mFE\src\Enzyme.jl:309
[20] autodiff
@ C:\Users\frank\.julia\packages\Enzyme\r8mFE\src\Enzyme.jl:326 [inlined]
[21] gradByEnzyme(f::Function, inVal::Vector{Float64})
@ Main .\REPL[2]:3
[22] top-level scope
@ REPL[3]:1
Some type information was truncated. Use `show(err)` to see complete types.
System info:
Julia Version 1.11.0-rc2
Commit 34c3a63147 (2024-07-29 06:24 UTC)
Build Info:
Official https://julialang.org/ release
Platform Info:
OS: Windows (x86_64-w64-mingw32)
CPU: 18 × 12th Gen Intel(R) Core(TM) i9-12900HK
WORD_SIZE: 64
LLVM: libLLVM-16.0.6 (ORCJIT, alderlake)
Threads: 1 default, 0 interactive, 1 GC (on 18 virtual cores)
The present issue is that LLVM.jl dropped support for APIs which we need in order to support 1.11.
x-ref https://github.com/maleadt/LLVM.jl/issues/435
cc @frankwswang @avik-pal @mofeing @vchuravy
I was wondering whether there are any updates on 1.11 support, as the release draws nearer.
Various codes work now with no precompilation failures, but not all. Specifically, support for the new gc_loaded intrinsic needs to be added, but I don't understand its semantics yet and need help from @gbaraldi and/or @vtjnash to add it.
If you understand it well enough to explain and/or support it, be my guest! But since it's GC-related and I don't want to accidentally cause segfaults, it remains an error for now.
I don't know the first thing about that; I just wanted to check whether I could re-activate the DI tests for Enzyme on v1.11. Guess I'll give it a try!