
Support feasibility sense

Open · blegat opened this issue

Came up in https://github.com/jump-dev/SumOfSquares.jl/pull/283

blegat commented Mar 23 '23

A smaller example is:

julia> using JuMP, Ipopt

julia> import ParametricOptInterface as POI

julia> function main()
           model = Model(() -> POI.Optimizer(Ipopt.Optimizer()))
           @variable(model, z in Parameter(0))
           @variable(model, x)
           @NLobjective(model, Min, x)
           optimize!(model)
       end
main (generic function with 1 method)

julia> main()
This is Ipopt version 3.14.14, running with linear solver MUMPS 5.6.2.

Number of nonzeros in equality constraint Jacobian...:        0
Number of nonzeros in inequality constraint Jacobian.:        0
Number of nonzeros in Lagrangian Hessian.............:        0

ERROR: BoundsError: attempt to access 1-element Vector{Float64} at index [2]
Stacktrace:
  [1] getindex
    @ ./essentials.jl:13 [inlined]
  [2] _forward_eval(f::MathOptInterface.Nonlinear.ReverseAD._FunctionStorage, d::MathOptInterface.Nonlinear.ReverseAD.NLPEvaluator, x::Vector{…})

This looks like an index mismatch caused by the legacy @NL interface shuffling the variable indices. The examples in the tests worked because they added the parameters after the decision variables (accidentally? intentionally?).

I don't think there is a simple fix for this: the NLPBlock interface is pretty rigid in assuming a fixed ordering of the variables.
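
For illustration, here is a sketch of the reordering that matches what the tests do (untested here; it only happens to work because the parameter is declared last, so the legacy API's variable ordering lines up):

function main_reordered()
    model = Model(() -> POI.Optimizer(Ipopt.Optimizer()))
    @variable(model, x)                  # decision variable first
    @variable(model, z in Parameter(0))  # parameter last, as in the tests
    @NLobjective(model, Min, x)
    return optimize!(model)
end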

But you can switch to the new nonlinear interface:

julia> function main_expr()
           model = Model(() -> POI.Optimizer(Ipopt.Optimizer()))
           set_silent(model)
           @variable(model, z in Parameter(10.0))
           @variable(model, x)
           @variable(model, y)
           @constraint(model, x >= z)
           @constraint(model, x + y >= z)
           @objective(model, Min, x^2 + y^2)
           optimize!(model)
           @show objective_value(model)
           set_parameter_value(z, 2.0)
           optimize!(model)
           @show objective_value(model)
           return
       end
main_expr (generic function with 1 method)

julia> main_expr()
objective_value(model) = 99.99999800270126
objective_value(model) = 3.999999926847757

But if you're using Ipopt, it supports Parameter directly:

julia> function main_ipopt()
           model = Model(Ipopt.Optimizer)
           set_silent(model)
           @variable(model, z in Parameter(10.0))
           @variable(model, x)
           @variable(model, y)
           @constraint(model, x >= z)
           @constraint(model, x + y >= z)
           @objective(model, Min, x^2 + y^2)
           optimize!(model)
           @show objective_value(model)
           set_parameter_value(z, 2.0)
           optimize!(model)
           @show objective_value(model)
           return
       end
main_ipopt (generic function with 1 method)

julia> main_ipopt()
objective_value(model) = 99.99999980286509
objective_value(model) = 3.999999962453093

odow commented Feb 28 '24

Perhaps POI should error if it detects an NLPBlock.
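
Something like this hypothetical guard, for illustration only (this is not actual POI code, and the field names are made up):

import MathOptInterface as MOI

# Hypothetical guard, sketched against an assumed POI.Optimizer layout:
# reject the legacy NLPBlock outright when parameters are present.
function MOI.set(model::Optimizer, attr::MOI.NLPBlock, block::MOI.NLPBlockData)
    if !isempty(model.parameters)  # assumed field name, for illustration
        error(
            "The legacy nonlinear API (@NL macros / NLPBlock) is not " *
            "supported together with parameters. Use the new nonlinear " *
            "interface instead.",
        )
    end
    return MOI.set(model.optimizer, attr, block)  # assumed inner optimizer field
end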

odow commented Feb 28 '24

That makes sense.

Without using POI, I also found another error when combining the NLPBlock, MOI.Parameter, and Ipopt (which could come from the same cause). Using parameters on the right-hand side gives a drastically different solution, even when the parameter value is exactly 1.0 times the original RHS. With the new API this error also disappears.

The minimal working example is a bit convoluted but simple to run. We just need PowerModels 0.19 (which still used the old NLP API):

using JuMP
using PowerModels # v0.19.0 for old NLP API
using PGLib
using Ipopt

optimizer = Ipopt.Optimizer

matpower_case_name = "6468_rte" # also try "pglib_opf_case5_pjm"
network_data = make_basic_network(pglib(matpower_case_name))

function jump(network_data)
    # The problem to iterate over
    model = JuMP.Model(optimizer)

    pm = instantiate_model(
        network_data,
        ACPPowerModel,
        PowerModels.build_opf;
        setting=Dict("output" => Dict("branch_flows" => true, "duals" => true)),
        jump_model=model,
    )

    JuMP.optimize!(model)

    @info "Status" JuMP.termination_status(model)
    return JuMP.objective_value(model)
end

function jump_with_parameter(network_data)
    # The problem to iterate over
    model = JuMP.Model(optimizer)

    # Scale each load by a parameter (initially 1.0)
    num_loads = length(network_data["load"])

    @variable(model, load_scaler[i=1:num_loads] in MOI.Parameter(1.0))

    for (str_i, l) in network_data["load"]
        i = parse(Int, str_i)
        l["pd"] = load_scaler[i] * l["pd"]
        l["qd"] = load_scaler[i] * l["qd"]
    end

    pm = instantiate_model(
        network_data,
        ACPPowerModel,
        PowerModels.build_opf;
        setting=Dict("output" => Dict("branch_flows" => true, "duals" => true)),
        jump_model=model,
    )

    JuMP.optimize!(model)

    @info "Status" JuMP.termination_status(model)
    return JuMP.objective_value(model)
end

objective_normal_jump = jump(network_data) # 2.05e6

objective_parameter_jump = jump_with_parameter(network_data) # 5.83e5

andrewrosemberg commented Feb 28 '24

Ah. We probably need to throw an error if parameters are combined with the old nonlinear API. I don't know if that is tested.

odow commented Feb 28 '24

See https://github.com/jump-dev/Ipopt.jl/pull/406. You just can't mix the two.

odow commented Feb 28 '24

I am trying to get the dual of the parameter when using Ipopt but failing miserably.

Modifying @odow's example a little bit:

function main_ipopt()
    model = Model(Ipopt.Optimizer)
    set_silent(model)
    @variable(model, z in Parameter(10.0))
    @variable(model, x)
    @variable(model, y)
    @constraint(model, x >= z)
    @constraint(model, sin(x) + y >= z) # modified constraint
    @objective(model, Min, x^2 + y^2)
    optimize!(model)
    @show objective_value(model)
    set_parameter_value(z, 2.0)
    optimize!(model)
    @show objective_value(model)
    @show dual(z)
    return
end

I get (as expected) the error:

ERROR: To query the dual variables associated with a variable bound, first obtain a constraint reference using one of `UpperBoundRef`, `LowerBoundRef`, or `FixRef`, and then call `dual` on the returned constraint reference.

But none of these work (e.g., dual(FixRef(z))).

In POI I normally run: MOI.get(JuMP.owner_model(z), POI.ParameterDual(), z)

But Ipopt is not currently working with POI on a non-linear problem. If I set the solver like:

ipopt = Ipopt.Optimizer()
cached =
    () -> MOI.Bridges.full_bridge_optimizer(
        MOI.Utilities.CachingOptimizer(
            MOI.Utilities.UniversalFallback(MOI.Utilities.Model{Float64}()),
            ipopt,
        ),
        Float64,
    )
POI_cached_optimizer() = POI.Optimizer(cached())
set_optimizer(model, () -> POI_cached_optimizer())

I get:

ERROR: MathOptInterface.UnsupportedConstraint{MathOptInterface.ScalarNonlinearFunction, MathOptInterface.EqualTo{Float64}}: `MathOptInterface.ScalarNonlinearFunction`-in-`MathOptInterface.EqualTo{Float64}` constraint is not supported by the model.

But I am a bit lost on where to start to solve this.

andrewrosemberg commented May 08 '24

Could and should I try implementing something like POI's ParameterDual in MOI?

https://github.com/jump-dev/ParametricOptInterface.jl/blob/4ec565acdac480a19e772deb3274636ceef2b3b7/src/duals.jl#L96
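
For context, a rough sketch of the shape such an attribute takes on the MOI side (names illustrative only; POI's actual definition is at the link above):

import MathOptInterface as MOI

# Hypothetical attribute, modeled on POI.ParameterDual: the dual value
# associated with a parameter, queried per variable after a solve.
struct ParameterDual <: MOI.AbstractVariableAttribute end

# Duals are only available after optimize! has been called.
MOI.is_set_by_optimize(::ParameterDual) = true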

andrewrosemberg commented May 08 '24

MathOptInterface does not support computing the dual of a Parameter. If you need this, use a fixed variable instead.

odow commented May 08 '24

I changed the parameters whose duals I need into constraints instead, but I think this breaks POI:

using JuMP
using Gurobi
import ParametricOptInterface as POI

model = JuMP.Model(() -> POI.Optimizer(Gurobi.Optimizer()))

@variable(model, x <= 1)
@variable(model, y)
@variable(model, z in MOI.Parameter(1.0))

@objective(model, Min, 10 * x + 1000 * y)

c = @constraint(model, x + y == 1 + z) # the 1 represents the parameter whose dual I want; z is just another parameter.

JuMP.optimize!(model)

println(model)
@info termination_status(model) value(x) value(y) value(z)

set_normalized_rhs(c, 0.0)

JuMP.optimize!(model)

println(model)
@info termination_status(model) value(x) value(y) value(z)

set_parameter_value(z, 2)

JuMP.optimize!(model)

println(model)
@info termination_status(model) value(x) value(y) value(z)

I am avoiding JuMP.fix and instead using just the RHS, which works better with my API.

andrewrosemberg commented May 10 '24

> I am avoiding JuMP.fix

To get the dual of a particular parameter, you really need to use @variable(model, z == 1) and reduced_cost(z).
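
For example, a minimal sketch adapting the earlier main_ipopt example to use a fixed variable instead of a Parameter (the function name is mine):

using JuMP, Ipopt

function main_fixed_variable()
    model = Model(Ipopt.Optimizer)
    set_silent(model)
    @variable(model, z == 10.0)  # fixed variable in place of Parameter(10.0)
    @variable(model, x)
    @variable(model, y)
    @constraint(model, x >= z)
    @constraint(model, x + y >= z)
    @objective(model, Min, x^2 + y^2)
    optimize!(model)
    @show objective_value(model)
    @show reduced_cost(z)  # the "dual" of the parameter
    fix(z, 2.0)            # analogue of set_parameter_value(z, 2.0)
    optimize!(model)
    @show objective_value(model)
    @show reduced_cost(z)
    return
end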

odow commented May 10 '24