
Optimization.LBFGS() cannot compute the gradient

enigne opened this issue 9 months ago • 11 comments

Hi, I'm using Optimization.jl in my package DJUICE.jl to optimize a cost function. The example is here. After optimization, the solution is the same as my initial guess. I computed the gradient directly using Enzyme here.

But when comparing with the evaluation from Optimization.jl, `sol.cache.f.grad(∂J_∂α, prob.u0, prob.p)`, I got error messages.
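For reference, a minimal sketch of the shape of this setup (the cost function and parameter below are placeholders standing in for `DJUICE.costfunction` and the `DJUICE.FemModel`, which are not reproduced here):

```julia
using Optimization, Enzyme

# Hypothetical stand-ins for DJUICE.costfunction and the FemModel parameter
cost(α, p) = sum(abs2, α .- p.target)
p = (; target = ones(10))

optf = OptimizationFunction(cost, Optimization.AutoEnzyme())
prob = OptimizationProblem(optf, zeros(10), p)
sol = solve(prob, Optimization.LBFGS())

# The gradient evaluation being compared against the direct Enzyme call:
∂J_∂α = zeros(10)
sol.cache.f.grad(∂J_∂α, prob.u0, prob.p)
```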

enigne • May 14 '24 18:05

https://github.com/SciML/OptimizationBase.jl/pull/43 is the solution. Maybe try that branch and see?
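One way to try it (assuming the branch behind that PR; the Manifest snippet below shows it resolving to `ChrisRackauckas-patch-1`):

```julia
using Pkg
# Install OptimizationBase from the PR branch instead of the registry release
Pkg.add(url = "https://github.com/SciML/OptimizationBase.jl",
        rev = "ChrisRackauckas-patch-1")
```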

ChrisRackauckas • May 14 '24 18:05

Thanks, Chris. I got an error:

ERROR: LoadError: Function to differentiate `MethodInstance for OptimizationEnzymeExt.firstapply(::typeof(DJUICE.costfunction), ::Vector{Float64}, ::DJUICE.FemModel, ::DJUICE.FemModel)` is guaranteed to return an error and doesn't make sense to autodiff. Giving up
Stacktrace:
 [1] error(s::String)
   @ Base ./error.jl:35
 [2] macro expansion
   @ ~/.julia/packages/Enzyme/srACB/src/compiler.jl:5845 [inlined]
 [3] macro expansion
   @ ./none:0 [inlined]
 [4] thunk(::Val{…}, ::Type{…}, ::Type{…}, tt::Type{…}, ::Val{…}, ::Val{…}, ::Val{…}, ::Val{…}, ::Val{…}, ::Type{…})
   @ Enzyme.Compiler ./none:0
 [5] autodiff
   @ ~/.julia/packages/Enzyme/srACB/src/Enzyme.jl:234 [inlined]
 [6] (::OptimizationEnzymeExt.var"#grad#50"{OptimizationFunction{…}, DJUICE.FemModel})(res::Vector{Float64}, θ::Vector{Float64}, args::DJUICE.FemModel)
   @ OptimizationEnzymeExt ~/.julia/packages/OptimizationBase/kgHps/ext/OptimizationEnzymeExt.jl:165
 [7] top-level scope
   @ ~/Dartmouth/DJUICE/test/testoptimization.jl:46
 [8] include(fname::String)
   @ Base.MainInclude ./client.jl:489
 [9] top-level scope
   @ REPL[4]:1
in expression starting at /Users/gongcheng/Dartmouth/DJUICE/test/testoptimization.jl:46

The following are the packages I'm using:

  [7da242da] Enzyme v0.12.6
  [7f7a1694] Optimization v3.25.0
  [bca83a33] OptimizationBase v0.0.7 `https://github.com/SciML/OptimizationBase.jl#ChrisRackauckas-patch-1`
  [36348300] OptimizationOptimJL v0.2.3

enigne • May 14 '24 18:05

Well, that's progress. Why `DJUICE.FemModel` is doubled is a good question.

ChrisRackauckas • May 14 '24 18:05

Interesting. `Enzyme.autodiff(Enzyme.Reverse, DJUICE.costfunction, Duplicated(α, ∂J_∂α), Duplicated(femmodel, dfemmodel))` works.

`sol.cache.f.f(prob.u0, prob.p)` also works for me.

This error only occurs when I call `sol.cache.f.grad(∂J_∂α, prob.u0, prob.p)`.

And the solution from the optimization did not change.
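One visible contrast between the working call and the failing path (a guess from the outside, not a confirmed diagnosis): the direct call gives `femmodel` a shadow via `Duplicated`, whereas the generated gradient closure appears to hold the parameter constant.

```julia
# Works: femmodel carries a shadow that Enzyme can write derivatives into
Enzyme.autodiff(Enzyme.Reverse, DJUICE.costfunction,
                Duplicated(α, ∂J_∂α), Duplicated(femmodel, dfemmodel))

# Fails: roughly what the generated closure does, with the parameter held
# constant (hypothetical reconstruction, not OptimizationEnzymeExt's exact code)
Enzyme.autodiff(Enzyme.Reverse, DJUICE.costfunction,
                Duplicated(α, ∂J_∂α), Const(femmodel))
```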

enigne • May 14 '24 18:05

I think the next step here in improving Enzyme support is finishing up the DifferentiationInterface integration. We're working with @gdalle on this, I'm thinking it may not take more than 2 more weeks. Once that's done, DI will be used as the AD system within Optimization.jl. That makes isolating this bug simpler, since it narrows it down to how DI handles Enzyme, which should be improved. If there is still an error after that, it becomes an Enzyme+DI issue, which is something we can solve there.
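For anyone following along, DI exposes a backend-agnostic gradient API, so the Optimization.jl internals would reduce to calls of roughly this shape (a standalone sketch, not Optimization.jl's actual wrapper code):

```julia
using DifferentiationInterface  # re-exports the ADTypes backend types
import Enzyme                   # loading Enzyme activates the DI extension

backend = AutoEnzyme()
f(x) = sum(abs2, x)
x = rand(5)

grad = gradient(f, backend, x)   # out-of-place gradient
gradient!(f, grad, backend, x)   # in-place, reusing `grad`
```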

ChrisRackauckas • May 26 '24 15:05

Can you try with newer versions? Without an MWE, this won't be possible to work on.

Vaibhavdixit02 • Oct 04 '24 17:10

> I think the next step here in improving Enzyme support is finishing up the DifferentiationInterface integration. We're working with @gdalle on this, I'm thinking it may not take more than 2 more weeks.

So that aged well :rofl:

gdalle • Oct 04 '24 17:10

Hi @Vaibhavdixit02, I tested the newer version with the example in DJUICE.jl/test/testoptimization.jl and got the error below. However, as you can see in this example, Enzyme autodiff works without an error.

ERROR: LoadError: Constant memory is stored (or returned) to a differentiable variable.
As a result, Enzyme cannot provably ensure correctness and throws this error.
This might be due to the use of a constant variable as temporary storage for active memory (https://enzyme.mit.edu/julia/stable/faq/#Activity-of-temporary-storage).
If Enzyme should be able to prove this use non-differentable, open an issue!
To work around this issue, either:
 a) rewrite this variable to not be conditionally active (fastest, but requires a code change), or
 b) set Enzyme.API.runtimeActivity!(true) immediately after loading Enzyme (which maintains correctness, but may slightly reduce performance).
Mismatched activity for:   %value_phi21 = phi {} addrspace(10)* [ %arrayref, %pass ], [ %48, %L72 ] const val:   %arrayref = load {} addrspace(10)*, {} addrspace(10)* addrspace(13)* %47, align 8, !dbg !438, !tbaa !441, !alias.scope !350, !noalias !353
Type tree: {[-1]:Pointer, [-1,4]:Integer, [-1,5]:Integer, [-1,6]:Integer, [-1,7]:Integer, [-1,8]:Pointer, [-1,8,0]:Pointer, [-1,8,8]:Integer, [-1,8,9]:Integer, [-1,8,10]:Integer, [-1,8,11]:Integer, [-1,8,12]:Integer, [-1,8,13]:Integer, [-1,8,14]:Integer, [-1,8,15]:Integer}
 llvalue=  %arrayref = load {} addrspace(10)*, {} addrspace(10)* addrspace(13)* %47, align 8, !dbg !438, !tbaa !441, !alias.scope !350, !noalias !353

Stacktrace:
 [1] getproperty
   @ ./Base.jl:37
 [2] SetTriaInput
   @ ~/Dartmouth/DJUICE/src/core/inputs.jl:182

Stacktrace:
  [1] SetTriaInput
    @ ~/Dartmouth/DJUICE/src/core/inputs.jl:180
  [2] InputUpdateFromVector
    @ ~/Dartmouth/DJUICE/src/core/elements.jl:469
  [3] InputUpdateFromVectorx
    @ ~/Dartmouth/DJUICE/src/core/modules.jl:287 [inlined]
  [4] CostFunctionx
    @ ~/Dartmouth/DJUICE/src/core/control.jl:36 [inlined]
  [5] augmented_julia_CostFunctionx_6947wrap
    @ ~/Dartmouth/DJUICE/src/core/control.jl:0
  [6] macro expansion
    @ ~/.julia/packages/Enzyme/TiboG/src/compiler.jl:7187 [inlined]
  [7] enzyme_call
    @ ~/.julia/packages/Enzyme/TiboG/src/compiler.jl:6794 [inlined]
  [8] AugmentedForwardThunk
    @ ~/.julia/packages/Enzyme/TiboG/src/compiler.jl:6682 [inlined]
  [9] runtime_generic_augfwd(activity::Type{…}, width::Val{…}, ModifiedBetween::Val{…}, RT::Val{…}, f::typeof(DJUICE.CostFunctionx), df::Nothing, primal_1::DJUICE.FemModel, shadow_1_1::Nothing, primal_2::Vector{…}, shadow_2_1::Vector{…}, primal_3::DJUICE.IssmEnum, shadow_3_1::Nothing, primal_4::DJUICE.IssmEnum, shadow_4_1::Nothing, primal_5::Vector{…}, shadow_5_1::Nothing, primal_6::Val{…}, shadow_6_1::Nothing)
    @ Enzyme.Compiler ~/.julia/packages/Enzyme/TiboG/src/rules/jitrules.jl:338
 [10] costfunction
    @ ~/Dartmouth/DJUICE/src/core/control.jl:79
 [11] firstapply
    @ ~/.julia/packages/Optimization/fPVIW/ext/OptimizationEnzymeExt.jl:10 [inlined]
 [12] augmented_julia_firstapply_4098wrap
    @ ~/.julia/packages/Optimization/fPVIW/ext/OptimizationEnzymeExt.jl:0
 [13] macro expansion
    @ ~/.julia/packages/Enzyme/TiboG/src/compiler.jl:7187 [inlined]
 [14] enzyme_call
    @ ~/.julia/packages/Enzyme/TiboG/src/compiler.jl:6794 [inlined]
 [15] AugmentedForwardThunk
    @ ~/.julia/packages/Enzyme/TiboG/src/compiler.jl:6682 [inlined]
 [16] autodiff
    @ ~/.julia/packages/Enzyme/TiboG/src/Enzyme.jl:264 [inlined]
 [17] (::OptimizationEnzymeExt.var"#grad#50"{OptimizationFunction{…}, DJUICE.FemModel})(::Vector{Float64}, ::Vector{Float64})
    @ OptimizationEnzymeExt ~/.julia/packages/Optimization/fPVIW/ext/OptimizationEnzymeExt.jl:165
 [18] (::OptimizationOptimJL.var"#8#14"{OptimizationCache{…}, OptimizationOptimJL.var"#7#13"{…}})(G::Vector{Float64}, θ::Vector{Float64})
    @ OptimizationOptimJL ~/.julia/packages/OptimizationOptimJL/r5j6D/src/OptimizationOptimJL.jl:160
 [19] value_gradient!!(obj::TwiceDifferentiable{Float64, Vector{Float64}, Matrix{Float64}, Vector{Float64}}, x::Vector{Float64})
    @ NLSolversBase ~/.julia/packages/NLSolversBase/kavn7/src/interface.jl:82
 [20] initial_state(method::LBFGS{…}, options::Optim.Options{…}, d::TwiceDifferentiable{…}, initial_x::Vector{…})
    @ Optim ~/.julia/packages/Optim/V8ZEC/src/multivariate/solvers/first_order/l_bfgs.jl:164
 [21] optimize(d::TwiceDifferentiable{…}, initial_x::Vector{…}, method::LBFGS{…}, options::Optim.Options{…})
    @ Optim ~/.julia/packages/Optim/V8ZEC/src/multivariate/optimize/optimize.jl:36
 [22] __solve(cache::OptimizationCache{OptimizationFunction{…}, Optimization.ReInitCache{…}, Nothing, Nothing, Nothing, Nothing, Nothing, LBFGS{…}, Base.Iterators.Cycle{…}, Bool, OptimizationOptimJL.var"#3#5"})
    @ OptimizationOptimJL ~/.julia/packages/OptimizationOptimJL/r5j6D/src/OptimizationOptimJL.jl:209
 [23] solve!(cache::OptimizationCache{OptimizationFunction{…}, Optimization.ReInitCache{…}, Nothing, Nothing, Nothing, Nothing, Nothing, LBFGS{…}, Base.Iterators.Cycle{…}, Bool, OptimizationOptimJL.var"#3#5"})
    @ SciMLBase ~/.julia/packages/SciMLBase/ynHlA/src/solve.jl:177
 [24] solve(::OptimizationProblem{…}, ::LBFGS{…}; kwargs::@Kwargs{})
    @ SciMLBase ~/.julia/packages/SciMLBase/ynHlA/src/solve.jl:94
 [25] solve(::OptimizationProblem{…}, ::LBFGS{…})
    @ SciMLBase ~/.julia/packages/SciMLBase/ynHlA/src/solve.jl:91
 [26] top-level scope
    @ ~/Dartmouth/DJUICE/test/testoptimization.jl:40
 [27] include(fname::String)
    @ Base.MainInclude ./client.jl:489
 [28] top-level scope
    @ REPL[52]:1
in expression starting at /Users/gongcheng/Dartmouth/DJUICE/test/testoptimization.jl:40
Some type information was truncated. Use `show(err)` to see complete types.
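For the record, workaround (b) from that message is a one-liner for this Enzyme version (v0.12-era API; the later error message further down shows its replacement, `set_runtime_activity`):

```julia
using Enzyme
# Must be set immediately after loading Enzyme; maintains correctness
# but may slightly reduce performance
Enzyme.API.runtimeActivity!(true)
```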

enigne • Oct 04 '24 17:10

Was this completed?

ChrisRackauckas • Oct 12 '24 20:10

Does your function contain caches?
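(The question refers to the pattern in Enzyme's activity-of-temporary-storage FAQ linked from the error above; a minimal sketch of that pattern, unrelated to DJUICE's actual code:)

```julia
using Enzyme

# A preallocated buffer used as temporary storage for active values:
f(x, tmp) = (tmp .= x ./ 2; sum(tmp))

x, dx, tmp = rand(3), zeros(3), zeros(3)

# Passing the cache as Const stores active data into constant memory and
# triggers the same class of error seen above:
# autodiff(Reverse, f, Active, Duplicated(x, dx), Const(tmp))

# Giving the cache a shadow fixes it:
autodiff(Reverse, f, Active, Duplicated(x, dx), Duplicated(tmp, zero(tmp)))
```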

gdalle • Oct 12 '24 20:10

Well, I get the same error with the pure Enzyme call:

julia> autodiff(Enzyme.Reverse, DJUICE.costfunction, Active, Duplicated(α, ∂J_∂α), Duplicated(femmodel,dfemmodel))
ERROR: Constant memory is stored (or returned) to a differentiable variable.
As a result, Enzyme cannot provably ensure correctness and throws this error.
This might be due to the use of a constant variable as temporary storage for active memory (https://enzyme.mit.edu/julia/stable/faq/#Runtime-Activity).
If Enzyme should be able to prove this use non-differentable, open an issue!
To work around this issue, either:
 a) rewrite this variable to not be conditionally active (fastest, but requires a code change), or
 b) set the Enzyme mode to turn on runtime activity (e.g. autodiff(set_runtime_activity(Reverse), ...) ). This will maintain correctness, but may slightly reduce performance.
Mismatched activity for:   store {} addrspace(10)* %17, {} addrspace(10)* addrspace(10)* %.fca.0.gep, align 8, !dbg !45, !noalias !52 const val:   %17 = call fastcc nonnull {} addrspace(10)* @julia_FindParam_6345({} addrspace(10)* %getfield3) #30, !dbg !44
Type tree: {[-1]:Pointer, [-1,0]:Pointer, [-1,0,0]:Pointer, [-1,8]:Integer, [-1,9]:Integer, [-1,10]:Integer, [-1,11]:Integer, [-1,12]:Integer, [-1,13]:Integer, [-1,14]:Integer, [-1,15]:Integer, [-1,16]:Integer, [-1,17]:Integer, [-1,18]:Integer, [-1,19]:Integer, [-1,20]:Integer, [-1,21]:Integer, [-1,22]:Integer, [-1,23]:Integer, [-1,24]:Integer, [-1,25]:Integer, [-1,26]:Integer, [-1,27]:Integer, [-1,28]:Integer, [-1,29]:Integer, [-1,30]:Integer, [-1,31]:Integer, [-1,32]:Integer, [-1,33]:Integer, [-1,34]:Integer, [-1,35]:Integer, [-1,36]:Integer, [-1,37]:Integer, [-1,38]:Integer, [-1,39]:Integer}
 llvalue=  %17 = call fastcc nonnull {} addrspace(10)* @julia_FindParam_6345({} addrspace(10)* %getfield3) #30, !dbg !44

Stacktrace:
 [1] collect_similar
   @ ./array.jl:763
 [2] map
   @ ./abstractarray.jl:3285
 [3] costfunction
   @ ~/dJUICE.jl/src/core/control.jl:74

Stacktrace:
  [1] collect_similar
    @ ./array.jl:763 [inlined]
  [2] map
    @ ./abstractarray.jl:3285 [inlined]
  [3] costfunction
    @ ~/dJUICE.jl/src/core/control.jl:74 [inlined]
  [4] augmented_julia_costfunction_6319wrap
    @ ~/dJUICE.jl/src/core/control.jl:0
  [5] macro expansion
    @ ~/.julia/packages/Enzyme/Vjlrr/src/compiler.jl:8839 [inlined]
  [6] enzyme_call
    @ ~/.julia/packages/Enzyme/Vjlrr/src/compiler.jl:8405 [inlined]
  [7] AugmentedForwardThunk
    @ ~/.julia/packages/Enzyme/Vjlrr/src/compiler.jl:8242 [inlined]
  [8] autodiff
    @ ~/.julia/packages/Enzyme/Vjlrr/src/Enzyme.jl:384 [inlined]
  [9] autodiff(::ReverseMode{…}, ::typeof(DJUICE.costfunction), ::Type{…}, ::Duplicated{…}, ::Duplicated{…})
    @ Enzyme ~/.julia/packages/Enzyme/Vjlrr/src/Enzyme.jl:512
 [10] top-level scope
    @ ~/dJUICE.jl/test/testoptimization.jl:34
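
The workaround the message suggests would then read (same objects as above; per the error text, it maintains correctness at a possible performance cost):

```julia
# Enable runtime activity for this autodiff call (Enzyme v0.13-era API)
autodiff(set_runtime_activity(Enzyme.Reverse), DJUICE.costfunction, Active,
         Duplicated(α, ∂J_∂α), Duplicated(femmodel, dfemmodel))
```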

You do have Enzyme bound to v0.12 in DJUICE.jl, and I bumped it to the latest versions to ensure I am matching the same setup.

Vaibhavdixit02 • Oct 12 '24 22:10