NLPModelsJuMP.jl
Quadratic constraints
@tmigot
Codecov Report
Merging #102 (3d7001c) into main (bb093af) will increase coverage by 0.07%. The diff coverage is 96.21%.
:exclamation: Current head 3d7001c differs from pull request most recent head e141a9c. Consider uploading reports for the commit e141a9c to get more accurate results.
@@            Coverage Diff             @@
##             main     #102      +/-   ##
==========================================
+ Coverage   95.00%   95.07%   +0.07%
==========================================
  Files           3        3
  Lines         580      690     +110
==========================================
+ Hits          551      656     +105
- Misses         29       34       +5
| Impacted Files | Coverage Δ | |
|---|---|---|
| src/utils.jl | 97.23% <93.24%> (-1.67%) | :arrow_down: |
| src/moi_nlp_model.jl | 100.00% <100.00%> (ø) | |
| src/moi_nls_model.jl | 88.36% <100.00%> (ø) | |
Continue to review the full report at Codecov.
Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update bb093af...e141a9c. Read the comment docs.
Hi @amontoison, I made some progress with this one. A list of things that we can discuss:
- [x] Improve implementation of `jac_nln_coord!` (see the sketch after this list): https://github.com/amontoison/NLPModelsJuMP.jl/blob/3db790058ce86b88582e7ab740d40d1bd93ee369/src/moi_nlp_model.jl#L160
- [x] Improve implementation of `hprod!`: https://github.com/amontoison/NLPModelsJuMP.jl/blob/3db790058ce86b88582e7ab740d40d1bd93ee369/src/moi_nlp_model.jl#L407
- [x] Add more tests (note that I used `hs61`, which is implemented as an ADNLPModel in https://github.com/JuliaSmoothOptimizers/OptimizationProblems.jl/blob/main/src/ADNLPProblems/hs61.jl)
- [ ] Copy-paste to the `NLS` variant.
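For context, a quadratic constraint has an affine Jacobian and a constant Hessian, which is exactly the structure `jac_nln_coord!` and `hprod!` can exploit. Here is a minimal sketch with a hypothetical two-variable model (not one of the test problems) showing a quadratic constraint going through the NLPModels API:

using JuMP, NLPModels, NLPModelsJuMP

model = Model()
@variable(model, x[1:2])
@objective(model, Min, (x[1] - 1)^2 + (x[2] - 2)^2)
@constraint(model, x[1]^2 + x[1] * x[2] <= 1.0)  # quadratic constraint

nlp = MathOptNLPModel(model)
x0 = [0.5, 0.5]
cons(nlp, x0)                           # constraint value
jac(nlp, x0)                            # affine in x for a quadratic constraint
hess(nlp, x0, [1.0], obj_weight = 0.0)  # constant: the Hessian of the constraint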
Right now my tests are very basic, just checking that every API call returns something.
using NLPModels, NLPModelsJuMP, OptimizationProblems

jump = OptimizationProblems.PureJuMP.hs61()  # JuMP model of hs61 (PureJuMP location assumed)
nlp = MathOptNLPModel(jump)
x1 = rand(nlp.meta.nvar)
obj(nlp, x1)                        # objective value
cons(nlp, x1)                       # constraint values
jac_structure(nlp)                  # Jacobian sparsity pattern
jac_coord(nlp, x1)                  # Jacobian values in coordinate format
hess(nlp, x1)                       # Hessian of the objective
hess(nlp, x1, rand(nlp.meta.ncon))  # Hessian of the Lagrangian
hprod(nlp, x1, x1)                  # Hessian-vector product
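A natural next step, sketched below under the assumption that the JuMP and AD formulations of `hs61` live in the `PureJuMP` and `ADNLPProblems` submodules of OptimizationProblems.jl, is to check that both backends agree at the same point rather than only checking that the calls return:

using ADNLPModels, NLPModels, NLPModelsJuMP, OptimizationProblems

nlp = MathOptNLPModel(OptimizationProblems.PureJuMP.hs61())
adnlp = OptimizationProblems.ADNLPProblems.hs61()  # AD reference model

x1 = rand(nlp.meta.nvar)
y1 = rand(nlp.meta.ncon)
@assert obj(nlp, x1) ≈ obj(adnlp, x1)
@assert cons(nlp, x1) ≈ cons(adnlp, x1)
@assert jac(nlp, x1) ≈ jac(adnlp, x1)
@assert hess(nlp, x1, y1) ≈ hess(adnlp, x1, y1)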
We can also use the following script to detect the problems with quadratic constraints in OptimizationProblems.jl. Since a quadratic function has a constant Hessian, the script compares the Hessian of each nonlinear constraint at random points against a reference value at `x0`:
using ADNLPModels, OptimizationProblems, NLPModels

"""
    test_quadratic_constraints(sample_size = 10)

Return the list of problems with quadratic constraints, as tuples
`(name, indices)` of the problem name and the indices of the constraints
detected as quadratic.
"""
function test_quadratic_constraints(sample_size = 10)
  meta = OptimizationProblems.meta[!, :]
  con_pb = meta[meta.ncon .> 0, :name]  # problems with constraints
  sample_size = max(sample_size, 2)
  list = []
  for pb in con_pb
    nlp = OptimizationProblems.ADNLPProblems.eval(Symbol(pb))()
    # Bounding box used to draw random evaluation points
    std = similar(nlp.meta.x0)
    blvar = similar(nlp.meta.lvar)
    buvar = similar(nlp.meta.uvar)
    for j = 1:nlp.meta.nvar
      blvar[j] = nlp.meta.lvar[j] == -Inf ? -10.0 : nlp.meta.lvar[j]
      buvar[j] = nlp.meta.uvar[j] == Inf ? 10.0 : nlp.meta.uvar[j]
      std[j] = max(abs(blvar[j]), abs(buvar[j]))
    end
    Iref = collect(1:nlp.meta.ncon)
    for k = 1:nlp.meta.ncon
      if k in nlp.meta.lin  # skip linear constraints
        Iref[k] = -1
        continue
      end
      # A quadratic constraint has a constant Hessian: compare the Hessian
      # of constraint k alone (obj_weight = 0, y = e_k) at random points
      # with a reference value at x0.
      y0 = zeros(nlp.meta.ncon)
      y0[k] = 1.0
      ref = hess(nlp, nlp.meta.x0, y0, obj_weight = 0.0)
      for i = 1:sample_size
        x = min.(max.((2 * rand(nlp.meta.nvar) .- 1) .* std, blvar), buvar)
        Hx = hess(nlp, x, y0, obj_weight = 0.0)
        if Hx != ref
          Iref[k] = -1
          break
        end
      end
    end
    if any(x -> x > 0, Iref)
      push!(list, (nlp.meta.name, Iref[findall(x -> x > 0, Iref)]))
    end
  end
  return list
end

test_quadratic_constraints()
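Each entry of the returned list is a tuple `(name, indices)` with the constraints detected as quadratic. A sketch, assuming (hypothetically) that each detected name matches a constructor in `OptimizationProblems.PureJuMP`, of pushing every hit through NLPModelsJuMP:

using NLPModels, NLPModelsJuMP, OptimizationProblems

for (name, indices) in test_quadratic_constraints()
  # Build the JuMP version of the problem and wrap it as an NLPModel
  jump = OptimizationProblems.PureJuMP.eval(Symbol(name))()
  nlp = MathOptNLPModel(jump)
  x = rand(nlp.meta.nvar)
  # Exercise the derivative API that the quadratic constraints go through
  cons(nlp, x)
  jac_coord(nlp, x)
  hess_coord(nlp, x, rand(nlp.meta.ncon))
end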
@amontoison I am checking against the other problems in OptimizationProblems.jl, but it looks good to me.
I am done testing OptimizationProblems.jl; I added 45 problems with quadratic constraints (https://github.com/JuliaSmoothOptimizers/OptimizationProblems.jl/tree/add-quadratic) and found no errors! So, it's good for me. Up to you.
Superseded by #181