NLPModelsJuMP.jl

Consider using multi-objective from JuMP for NLS

Open abelsiqueira opened this issue 2 years ago • 4 comments

See package MultiObjectiveAlgorithms.jl
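For context, a rough sketch of what that suggestion could look like (illustrative only; the model, variable, and residuals below are made up, and this assumes a JuMP version that supports vector-valued objectives):

using JuMP

model = Model()
@variable(model, x)

# One objective component per residual Fᵢ(x); a multi-objective-aware backend
# (for example through MultiObjectiveAlgorithms.jl) receives the residuals
# directly instead of a scalar sum of squares.
@objective(model, Min, [x + 1, x - 1])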

abelsiqueira avatar Oct 05 '23 09:10 abelsiqueira

We discussed this during JuMP-dev 2024 with @blegat.

amontoison avatar Jul 21 '24 20:07 amontoison

We converged on the following suggestion. Make the F argument here optional https://github.com/JuliaSmoothOptimizers/NLPModelsJuMP.jl/blob/9b8128c3cee936ee55120bced35817875ea2027d/src/moi_nls_model.jl#L26 and the MOI wrapper won't set this F. If F is not given, it can be recovered from the objective provided the objective is a ScalarNonlinearFunction whose root node is + and whose children are each ^ with 2 as the second argument. If the objective is not nonlinear, or is not of that form, a clear error message will explain the issue and recommend using JuMP.@force_nonlinear. See the examples below, followed by a rough sketch of this structural check:

julia> @objective(model, Min, (x^2 + 1)^2 + (x^3 + 1)^2)
((x² + 1) ^ 2.0) + (((x ^ 3) + 1.0) ^ 2.0)

julia> @objective(model, Min, (x + 1)^2 + (x + 1)^2) # Not what we want
2 x² + 4 x + 2

julia> @objective(model, Min, @force_nonlinear((x + 1)^2 + (x + 1)^2)) # `@force_nonlinear` saves the day
((x + 1) ^ 2) + ((x + 1) ^ 2)

julia> y = [x + 1, x - 1]
2-element Vector{AffExpr}:
 x + 1
 x - 1

julia> @objective(model, Min, sum(y[i]^2 for i in eachindex(y))) # Not what we want
2 x² + 0 x + 2

julia> @objective(model, Min, sum(@force_nonlinear(y[i]^2) for i in eachindex(y))) # `@force_nonlinear` saves the day
((x + 1) ^ 2) + ((x - 1) ^ 2)
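A rough sketch of the structural check described above (not an actual implementation; try_recover_residuals is a hypothetical helper name, and this assumes the objective is available as a MOI.ScalarNonlinearFunction with its head and args fields):

using MathOptInterface
const MOI = MathOptInterface

# Return the residual terms Fᵢ(x) if the objective has the form F₁(x)^2 + F₂(x)^2 + ...,
# i.e. a root node :+ whose children are all :^ with 2 as the second argument;
# return nothing otherwise so the caller can emit a helpful error message.
function try_recover_residuals(obj::MOI.ScalarNonlinearFunction)
    obj.head == :+ || return nothing
    residuals = Any[]
    for term in obj.args
        term isa MOI.ScalarNonlinearFunction || return nothing
        (term.head == :^ && length(term.args) == 2 && term.args[2] == 2) || return nothing
        push!(residuals, term.args[1])  # keep the base Fᵢ(x)
    end
    return residuals
end

try_recover_residuals(obj) = nothing  # any other objective type is not a recognizable sum of squares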

What do you think?

blegat avatar Jul 22 '24 15:07 blegat

The issue with a multi-objective formulation is that it wouldn't be solver-independent: to compare with a solver that is not a least-squares solver you would need the sum-of-squares objective, while a least-squares solver would need the multi-objective form. With the suggestion from the comment above, the same model can be used for both.

blegat avatar Jul 22 '24 15:07 blegat

I like your idea @blegat, it will be easy to recover the term F(x) from ||F(x)||_2. It's what we want for RipQP.jl and CaNNOLeS.jl.

For example, one special application of RipQP is constrained linear least-squares problems, where we want to recover the terms A, B, b, c to exploit the structure internally and solve a more relevant optimization problem:

[image: constrained linear least-squares problem formulation]

If MOI pre-digests the objective, we will only be able to recover A'A and A'b, which is not what we want, and A'A could be dense if even one column of the sparse matrix A is dense!
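To illustrate with a made-up toy model following the @force_nonlinear pattern from the comment above: keeping the residuals A*x - b explicit preserves A and b, whereas the expanded quadratic only exposes A'A and A'b.

using JuMP, SparseArrays

# Hypothetical toy data for a constrained linear least-squares problem.
A = sparse([1.0 0.0; 0.0 2.0; 3.0 1.0])
b = [1.0, 2.0, 3.0]

model = Model()
@variable(model, x[1:2] >= 0)            # bound constraints
@constraint(model, x[1] + x[2] <= 4.0)   # linear constraint

r = A * x - b                            # Vector{AffExpr}: the residuals F(x) = A*x - b
# @force_nonlinear keeps each r[i]^2 as an explicit square, so the objective stays
# a recognizable sum of squares and A, b remain recoverable from the residuals.
@objective(model, Min, sum(@force_nonlinear(r[i]^2) for i in eachindex(r)))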

amontoison avatar Jul 22 '24 16:07 amontoison