Olivier Cots
Here is a benchmark comparing with JuMP: https://github.com/ocots/JumpVSADNLP/

With `ADNLPModels` v0.6.2 we get:

```bash
julia> include("bench.jl")
Hessian of Lagrangian
  ADNLP  49.336 ms (337613 allocations: 78.45 MiB)
  JuMP   66.160...
```
@jbcaillau New results from @tmigot, on the `main` branch of `ADNLPModels`:

```bash
Hessian of Lagrangian
  ADNLP  4.755 ms (4370 allocations: 8.58 MiB)
  JuMP   82.407 μs (137 allocations: 24.08 KiB)
Jacobian...
```
Thanks! We'll take a look at it so that we construct the same Jacobian and Hessian.
Actually, even for this simple problem, there is a bug:

```julia
using OptimalControl
using NLPModelsIpopt

ocp = @def begin
    t ∈ [0, 1], time
    x ∈ R², state
    u ∈...
```
This, for instance, compiles, but there is still an issue:

```julia
module Goddard3D

using OptimalControl
using NLPModelsIpopt
#using OrdinaryDiffEq # to get the Flow function from OptimalControl
#using NonlinearSolve
#...
```
@jbcaillau, @PierreMartinon Any idea?
It seems to be because `dot(f(x(t)), f(x(t)))` is 0 at some points, and automatic differentiation has a problem there: the square root of that quantity is not differentiable at 0, even though the value itself is just a constant!
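The failure mode can be reproduced outside the stack: the norm `sqrt(dot(f, f))` is not differentiable at points where `f` vanishes, so forward-mode AD hits a `0/0` and propagates `NaN`. A minimal sketch with a toy dual-number type (all names below are illustrative, not from ForwardDiff.jl):

```python
import math

# Toy forward-mode dual number (value, derivative) -- a simplified model
# of what forward-mode AD tools do under the hood.
class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, other):
        return Dual(self.val + other.val, self.der + other.der)
    def __mul__(self, other):
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)

def dsqrt(u):
    # chain rule: d sqrt(u) = u' / (2 sqrt(u)), which is 0/0 when u == 0
    v = math.sqrt(u.val)
    return Dual(v, u.der / (2.0 * v) if v != 0.0 else float("nan"))

def norm2(x):
    # |x| = sqrt(dot(x, x)), the same pattern as dot(f(x(t)), f(x(t)))
    s = x[0] * x[0]
    for xi in x[1:]:
        s = s + xi * xi
    return dsqrt(s)

# At a regular point the derivative w.r.t. x[0] is fine: 3/5 = 0.6
d_regular = norm2([Dual(3.0, 1.0), Dual(4.0, 0.0)]).der

# At x = 0, where dot(x, x) is the constant 0, the derivative is NaN
d_at_zero = norm2([Dual(0.0, 1.0), Dual(0.0, 0.0)]).der
```

A common workaround is to regularise the expression, e.g. `sqrt(dot(f, f) + ε)` for a small `ε > 0`, or to reformulate the problem so the norm is never differentiated at a zero of `f`.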
@jbcaillau @PierreMartinon
See:
- https://docs.github.com/en/get-started/exploring-projects-on-github/contributing-to-a-project
- https://github.com/firstcontributions/first-contributions
- https://docs.github.com/fr/communities/setting-up-your-project-for-healthy-contributions/setting-guidelines-for-repository-contributors
It would be nice to make a template.