OptimizationProblems.jl

add `ADNLSModel` constructor in `arglina`

Open tmigot opened this issue 3 years ago • 2 comments

This is an example of how we could use ADNLSModel for the least-squares objective.

Currently, the tests break because the JuMP models implemented so far generally don't include the 1/2 factor in front of the objective. #162
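
For context, here is a minimal sketch of the mismatch (the toy residual below is invented for illustration and is not taken from the test suite): an ADNLSModel evaluates its objective as 1/2 ‖F(x)‖², whereas the PureJuMP version writes the plain sum of squares, so the two objectives differ by a factor of 1/2.

using ADNLPModels, NLPModels

# Toy residual, for illustration only.
F(x) = [x[1] - 1; 2 * x[1]]
nls = ADNLSModel(F, [0.0], 2)

obj(nls, [0.0])      # 1/2 * ‖F(x)‖² = 0.5
sum(F([0.0]) .^ 2)   # plain sum of squares, as in the JuMP formulation: 1.0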

@abelsiqueira @dpo Any opinion on this?

tmigot · Jul 11 '22 12:07

Codecov Report

Patch coverage: 100.00% and no project coverage change.

Comparison is base (040a34c) 99.83% compared to head (897c4b7) 99.83%.

:exclamation: Current head 897c4b7 differs from pull request most recent head 8cdf744. Consider uploading reports for the commit 8cdf744 to get more accurate results

Additional details and impacted files
@@           Coverage Diff           @@
##             main     #200   +/-   ##
=======================================
  Coverage   99.83%   99.83%           
=======================================
  Files         793      793           
  Lines        7260     7275   +15     
=======================================
+ Hits         7248     7263   +15     
  Misses         12       12           
| Impacted Files | Coverage Δ |
|---|---|
| src/PureJuMP/arglina.jl | 100.00% <ø> (ø) |
| src/ADNLPProblems/arglina.jl | 100.00% <100.00%> (ø) |
| src/Meta/arglina.jl | 100.00% <100.00%> (ø) |


codecov[bot] · Jul 14 '22 08:07

@dpo Here is an example of the benefit of having both the in-place and out-of-place residuals under the same name, so that we can change the backend effortlessly.

using ADNLPModels, NLPModels, ReverseDiff

# `default_nvar` is a constant of OptimizationProblems.jl; it is set here so the snippet runs on its own.
default_nvar = 100

function arglina(; n::Int = default_nvar, type::Val{T} = Val(Float64), kwargs...) where {T}
  # In-place residual.
  function F(r, x)
    m = 2 * n
    for i = 1:n
      r[i] = x[i] - T(2 / m) * sum(x[j] for j = 1:n) - 1
      r[i + n] = -T(2 / m) * sum(x[j] for j = 1:n) - 1
    end
    return r
  end
  # Out-of-place residual: same name as above, dispatched on the number of arguments.
  function F(x)
    r = similar(x, 2 * n)
    return F(r, x)
  end
  x0 = ones(T, n)
  return ADNLPModels.ADNLSModel(F, x0, 2 * n, name = "arglina"; kwargs...)
end

nlp = arglina(n = 10)
F = nlp.F
output = typeof(nlp.meta.x0)(undef, nlp.nls_meta.nequ)
input = nlp.meta.x0

# Use ForwardDiff on x -> nlp.F(x)
jac_residual(nlp, input)
@show @allocated jac_residual(nlp, input) # 6528

# Use ReverseDiff on (r, x) -> nlp.F(r, x)
cfJ = ReverseDiff.JacobianTape(nlp.F, output, input)
ReverseDiff.jacobian!(cfJ, input)
@show @allocated ReverseDiff.jacobian!(cfJ, input) # 1808 !!

# Use ReverseDiff on (r, x) -> nlp.F(r, x) and pre-allocate the result
result = zeros(20, 10)
ReverseDiff.jacobian!(result, cfJ, input)
@show @allocated ReverseDiff.jacobian!(result, cfJ, input) # 0

tmigot · Aug 11 '22 17:08

Hi @tmigot. This is quite old and I forget what you were trying to explain. I don't see anything in the example above that would be complicated if the in-place function were called F!. What am I missing?

dpo · Jan 26 '23 14:01

The issue is that the function returns

> ADNLPModels.ADNLSModel(F, x0, 2 * n, name = "arglina"; kwargs...)

so that wouldn't work if we had both an F and an F!.
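
For illustration, here is a minimal sketch of the separate-name alternative (hypothetical code, not from the PR): only one callable can be passed to the constructor, so the in-place method no longer travels with the model.

using ADNLPModels

# Hypothetical variant with two distinct names (not part of the PR).
function arglina_separate(; n::Int = 100, kwargs...)
  function F!(r, x)                 # in-place residual
    m = 2 * n
    for i = 1:n
      r[i] = x[i] - (2 / m) * sum(x) - 1
      r[i + n] = -(2 / m) * sum(x) - 1
    end
    return r
  end
  F(x) = F!(similar(x, 2 * n), x)   # out-of-place wrapper
  # Only one of the two can be stored in the model; passing F means nlp.F has
  # just the one-argument method, so ReverseDiff.JacobianTape(nlp.F, output, input)
  # from the example above no longer applies.
  return ADNLPModels.ADNLSModel(F, ones(n), 2 * n, name = "arglina"; kwargs...)
end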

tmigot · Jan 26 '23 14:01

I wouldn't merge it anyway, because right now calling obj for an NLS model allocates, while we have been making an effort to reduce allocations in https://github.com/JuliaSmoothOptimizers/OptimizationProblems.jl/pull/241
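
As a rough illustration (a sketch, not the benchmark from #241; the exact number depends on package versions), the allocation is visible directly with @allocated:

using ADNLPModels, NLPModels

nlp = arglina(n = 10)            # constructor from the example above
x = nlp.meta.x0
obj(nlp, x)                      # 1/2 * ‖F(x)‖²; builds the residual internally
@show @allocated obj(nlp, x)     # nonzero: a residual vector is allocated on each call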

tmigot · Jan 26 '23 14:01