MethodError: The applicable method may be too new: running in world age 33714, while current world is 33900
Hello,
I have been facing an issue.
Depending on the UDE I am solving, I get the following error:
┌ Warning: EnzymeVJP tried and failed in the automated AD choice algorithm with the following error. (To turn off this printing, add `verbose = false` to the `solve` call)
└ @ SciMLSensitivity ~/.julia/packages/SciMLSensitivity/NhfkF/src/concrete_solve.jl:21
MethodError: no method matching asprogress(::Base.CoreLogging.LogLevel, ::String, ::Module, ::Symbol, ::Symbol, ::String, ::Int64)
The applicable method may be too new: running in world age 33714, while current world is 33900.
Closest candidates are:
asprogress(::Any, ::Any, ::Any, ::Any, ::Any, ::Any, ::Any; progress, kwargs...) (method too new to be called from this world context.)
ProgressLogging ~/.julia/packages/ProgressLogging/6KXlp/src/ProgressLogging.jl:156
asprogress(::Any, ::ProgressLogging.Progress, ::Any...; _...) (method too new to be called from this world context.)
ProgressLogging ~/.julia/packages/ProgressLogging/6KXlp/src/ProgressLogging.jl:155
asprogress(::Any, ::ProgressLogging.ProgressString, ::Any...; _...) (method too new to be called from this world context.)
ProgressLogging ~/.julia/packages/ProgressLogging/6KXlp/src/ProgressLogging.jl:200
and I do not understand why.
What is this world-age error, and should I be worried?
I can turn it off with verbose = false, but I think it would be wiser to ask here first.
Best Regards
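For readers seeing this for the first time: a world-age MethodError means a method was defined after the calling code started running, so the caller's (older) world cannot see it yet. A minimal self-contained sketch, unrelated to the packages in this thread:

```julia
# Minimal sketch of how a world-age MethodError arises: a method defined
# at runtime via eval is "too new" for the frame that tries to call it.
function caller()
    @eval newmethod() = 42      # defines a method in a newer world
    try
        return newmethod()      # caller runs in the older world -> MethodError
    catch err
        err isa MethodError || rethrow()
        return Base.invokelatest(newmethod)  # re-dispatch in the latest world
    end
end
println(caller())  # prints 42
```

In this thread's case the error comes from inside a package's logging path rather than user code, which usually points at a stale or inconsistent package environment rather than anything to fix in the script itself.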
Can anyone help me with this?
MWE?
What is the meaning of MWE?
https://en.wikipedia.org/wiki/Minimal_reproducible_example
Here is a minimal reproducible example of this:
using OrdinaryDiffEq, ModelingToolkit, DataDrivenDiffEq, SciMLSensitivity, DataDrivenSparse
using Optimization, OptimizationOptimisers, OptimizationOptimJL
using LinearAlgebra, Statistics
using ComponentArrays, Lux, Zygote, Plots, StableRNGs
rng = StableRNG(1111)
function example!(du, u, p, t)
    alpha = p[1]
    du[1] = (-alpha*(0.9*u[1] - 0.45) + 0.45)*u[3]
    du[2] = -alpha*(0.9*u[2]*u[3] - 0.225*u[5]) + 0.225*u[5]
    du[3] = -alpha*(0.9*u[3]^2 + 0.225*u[4] - 0.225) - 0.45*u[1] - 0.225*u[4] + 0.225
    du[4] = -0.9*alpha*u[3]*u[4]
    du[5] = alpha*(0.225*u[2] - 0.9*u[3]*u[5]) - 0.225*u[2]
end
t_true = LinRange(0.0, 5.0, 300)
tspan = (0.0, 5.0)
u0 = [0.7, 0.1, -0.05, 0.2, 0.025]
p_ = [1.0]
prob = ODEProblem(example!, u0, tspan, p_)
solution = solve(prob, Vern7(), abstol = 1e-12, reltol = 1e-12, saveat = t_true)
Xₙ = Array(solution)
t = solution.t
U = Lux.Chain(Lux.Dense(5, 50, Lux.tanh),
              Lux.Dense(50, 5))
p, st = Lux.setup(rng, U)
# Define the hybrid model
function ude_dynamics!(du, u, p, t, p_true)
    û = U(u, p, st)[1]
    alpha = p_true[1]
    du[1] = alpha*(0.45*u[3] - 2*û[1]) + 0.45*u[3]
    du[2] = alpha*(0.225*u[5] - 2*û[2]) + 0.225*u[5]
    du[3] = -alpha*(0.225*u[4] + 2*û[3] - 0.225) - 0.45*u[1] - 0.225*u[4] + 0.225
    du[4] = -2*alpha*û[4]
    du[5] = alpha*(0.225*u[2] - 2*û[5]) - 0.225*u[2]
end
# Closure with the known parameter
nn_dynamics!(du, u, p, t) = ude_dynamics!(du, u, p, t, p_)
# Define the problem
prob_nn = ODEProblem(nn_dynamics!, Xₙ[:, 1], tspan, p)
function predict(θ, X = Xₙ[:, 1], T = t)
    _prob = remake(prob_nn, u0 = X, tspan = (T[1], T[end]), p = θ)
    Array(solve(_prob, Tsit5(), saveat = T,
                abstol = 1e-5, reltol = 1e-5))
end
function loss(θ)
    X̂ = predict(θ)
    mean(abs2, Xₙ .- X̂)
end
losses = Float64[]
callback = function (p, l)
    println("loss is called")
    push!(losses, l)
    if length(losses) % 1 == 0
        println("Current loss after $(length(losses)) iterations: $(losses[end])")
    end
    return false
end
adtype = Optimization.AutoZygote()
optf = Optimization.OptimizationFunction((x, p) -> loss(x), adtype)
optprob = Optimization.OptimizationProblem(optf, ComponentVector{Float64}(p))
res = Optimization.solve(optprob, ADAM(), callback = callback, maxiters = 10)
I cannot reproduce this.
Are you using a standard REPL? Are your packages all up to date?
Thanks for the input!
I am using the REPL in VS Code.
How can I install the latest stable releases of all the SciML packages?
Edit: I went into package mode (]) and typed up. Some of the packages were updated; however, I still get a similar error.
Edit 2: I opened Julia in the terminal and did the same thing (updated the packages). I got several updates, and now I have an output like the one in your image. I guess I have duplicated packages?
Additionally, I am constantly getting:
┌ Warning: EnzymeVJP tried and failed in the automated AD choice algorithm with the following error. (To turn off this printing, add `verbose = false` to the `solve` call)
└ @ SciMLSensitivity ~/.julia/packages/SciMLSensitivity/ETXYy/src/concrete_solve.jl:21
Enzyme execution failed.
Mismatched activity for: store i64 %9, i64 addrspace(10)* %8, align 8, !dbg !9, !tbaa !23 const val: %9 = load i64, i64 addrspace(11)* %7, align 8, !dbg !9, !tbaa !23
Type tree: {[-1]:Pointer, [-1,0]:Pointer, [-1,0,-1]:Float@double, [-1,8]:Integer, [-1,9]:Integer, [-1,10]:Integer, [-1,11]:Integer, [-1,12]:Integer, [-1,13]:Integer, [-1,14]:Integer, [-1,15]:Integer, [-1,16]:Integer, [-1,17]:Integer, [-1,18]:Integer, [-1,19]:Integer, [-1,20]:Integer, [-1,21]:Integer, [-1,22]:Integer, [-1,23]:Integer, [-1,24]:Integer, [-1,25]:Integer, [-1,26]:Integer, [-1,27]:Integer, [-1,28]:Integer, [-1,29]:Integer, [-1,30]:Integer, [-1,31]:Integer, [-1,32]:Integer, [-1,33]:Integer, [-1,34]:Integer, [-1,35]:Integer, [-1,36]:Integer, [-1,37]:Integer, [-1,38]:Integer, [-1,39]:Integer}
You may be using a constant variable as temporary storage for active memory (https://enzyme.mit.edu/julia/stable/#Activity-of-temporary-storage). If not, please open an issue, and either rewrite this variable to not be conditionally active or use Enzyme.API.runtimeActivity!(true) as a workaround for now
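For what it's worth, the pattern that message usually describes looks like the following sketch (the names here are hypothetical, not taken from the MWE): input-dependent ("active") values being written into storage that Enzyme assumes is constant, such as a preallocated global buffer.

```julia
# Hypothetical illustration of the "constant temporary storage" pattern
# the Enzyme message warns about; not code from this thread's MWE.
const cache = zeros(5)             # global buffer: Enzyme treats it as constant

function rhs_cached!(du, u, p, t)
    cache .= p[1] .* u             # active (input-dependent) values stored
    du .= cache .- 0.5 .* u        # in constant memory -> mismatched activity
    return nothing
end

# A version that avoids the issue by keeping the temporary local:
function rhs_local!(du, u, p, t)
    tmp = p[1] .* u                # fresh local temporary, activity tracked per call
    du .= tmp .- 0.5 .* u
    return nothing
end
```

As the message itself says, Enzyme.API.runtimeActivity!(true) is offered as a workaround when the storage really is only conditionally active.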
Is this relevant for the calculations, or can I just set verbose = false to suppress the output?
What does ]st show you?
That line outputs: Status ~/Desktop/myCase/Project.toml (empty project)
Did you add the packages? 😅
I can execute the script, so they must be somewhere (or should I do something?). [Sorry if this is a newbie question; I am still new to this whole thing.]
Edit:
So I added everything with:
Pkg.add(["Package1"..."PackageN"])
and updated everything (]up).
Now I am back to getting the
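In script form, the steps above can be done with Pkg directly (package names here are just examples from the MWE). Activating a project environment in the working folder also avoids the "empty project" confusion, since ]st and the running code then refer to the same environment:

```julia
using Pkg

# Assumed workflow mirroring the steps above: use a project environment
# in the current folder so Pkg.status() reflects what the script loads.
Pkg.activate(".")
Pkg.add(["OrdinaryDiffEq", "SciMLSensitivity", "Optimization"])  # etc., as in the MWE
Pkg.update()
Pkg.status()
```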
┌ Warning: EnzymeVJP tried and failed in the automated AD choice algorithm with the following error. (To turn off this printing, add `verbose = false` to the `solve` call)
└ @ SciMLSensitivity ~/.julia/packages/SciMLSensitivity/NhfkF/src/concrete_solve.jl:21
MethodError: no method matching asprogress(::Base.CoreLogging.LogLevel, ::String, ::Module, ::Symbol, ::Symbol, ::String, ::Int64)
The applicable method may be too new: running in world age 33718, while current world is 34053.
When I do ]st I get:
[b0b7db55] ComponentArrays v0.14.0
[2445eb08] DataDrivenDiffEq v1.2.0
[5b588203] DataDrivenSparse v0.1.2
[b2108857] Lux v0.5.0
[961ee093] ModelingToolkit v8.63.0
[7f7a1694] Optimization v3.15.2
[36348300] OptimizationOptimJL v0.1.9
[42dfb2eb] OptimizationOptimisers v0.1.5
[1dea7af3] OrdinaryDiffEq v6.53.4
[91a5bcdd] Plots v1.38.17
[1ed8b502] SciMLSensitivity v7.35.1
[860ef19b] StableRNGs v1.0.0
[e88e6eb3] Zygote v0.6.62
[37e2e46d] LinearAlgebra
[10745b16] Statistics v1.9.0
Try this in the REPL? Directly in the v1.9.2 REPL? How did you install Julia? Is it the standard binary from juliaup?
Hi,
So I am using Windows Subsystem for Linux (WSL).
I went to https://julialang.org/downloads/ and downloaded the 64-bit (glibc) [Generic Linux on x86] build.
I created a folder called Julia in my home directory and unpacked the files there.
Afterwards, I added the folder to my PATH: export PATH="/home/myComputer/Julia/julia-1.9.2/bin:$PATH"
With this, I had Julia available in the terminal.
Next, I opened VS Code, went to the Extensions page [Ctrl + Shift + X] and typed Julia.
I installed Julia v1.47.2 (Julia Language Support).
I went to my .jl script and opened the REPL [Alt + J followed by Alt + O],
added the required packages [previous post],
and executed the script (highlight everything and press Shift + Enter).
As I am writing this post I get the message:
┌ Warning: EnzymeVJP tried and failed in the automated AD choice algorithm with the following error. (To turn off this printing, add `verbose = false` to the `solve` call)
└ @ SciMLSensitivity ~/.julia/packages/SciMLSensitivity/NhfkF/src/concrete_solve.jl:21
MethodError: no method matching asprogress(::Base.CoreLogging.LogLevel, ::String, ::Module, ::Symbol, ::Symbol, ::String, ::Int64)
The applicable method may be too new: running in world age 33725, while current world is 33870.
Closest candidates are:
asprogress(::Any, ::Any, ::Any, ::Any, ::Any, ::Any, ::Any; progress, kwargs...) (method too new to be called from this world context.)
ProgressLogging ~/.julia/packages/ProgressLogging/6KXlp/src/ProgressLogging.jl:156
asprogress(::Any, ::ProgressLogging.Progress, ::Any...; _...) (method too new to be called from this world context.)
ProgressLogging ~/.julia/packages/ProgressLogging/6KXlp/src/ProgressLogging.jl:155
asprogress(::Any, ::ProgressLogging.ProgressString, ::Any...; _...) (method too new to be called from this world context.)
ProgressLogging ~/.julia/packages/ProgressLogging/6KXlp/src/ProgressLogging.jl:200
@ChrisRackauckas What version of the packages are you using?
The error is linked to the warning ┌ Warning: EnzymeVJP tried and failed in the automated AD choice algorithm with the following error. (To turn off this printing, add `verbose = false` to the `solve` call), which is emitted at line 21 of concrete_solve.jl.
If I just set verbose = false, the warning will not trigger. If I do that, can I safely carry on with the calculations?
Best Regards
Wait, are you talking about the Enzyme warning and not an error?
When I set verbose = false, I no longer get the Enzyme warning followed by
The applicable method may be too new: running in world age 33725, while current world is 33870
What I am asking is: if I set verbose = false, will this have any implications for the calculations?
No. It has been silent for a very long time, it's just telling you that you're missing a potential performance optimization to fix compatibility with Enzyme.
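To make that concrete, here is a tiny standalone sketch (not the MWE above) of the two options: verbose = false to silence the warning, or an explicit sensealg so the automated AD choice, and with it the Enzyme attempt, never runs. InterpolatingAdjoint with ZygoteVJP is one possible choice, not a recommendation specific to this problem:

```julia
using OrdinaryDiffEq, SciMLSensitivity, Zygote

# Small linear ODE standing in for the UDE problem.
decay!(du, u, p, t) = (du .= -p[1] .* u)
prob = ODEProblem(decay!, [1.0], (0.0, 1.0), [0.5])

function loss(p)
    sol = solve(remake(prob, p = p), Tsit5();
                saveat = 0.1,
                verbose = false,       # suppresses the AD-choice warning
                sensealg = InterpolatingAdjoint(autojacvec = ZygoteVJP()))
    sum(Array(sol))                    # explicit sensealg: Enzyme is never tried
end

g = Zygote.gradient(loss, [0.5])[1]    # gradient still flows through the solve
```

Either way the computed solution and gradients are unaffected; the warning is purely informational.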