Juniper.jl
Simple Presolver
It would be awesome if this could be a presolver API that connects to Juniper (or any other Julia-based solver). @ccoffrin what do you think?
Different solvers will have very different needs for pre-solving, so I am not quite sure if this can be abstracted outside of the specific solvers, but it is worth investigating once more than one solver has a good presolver.
One thing that I know for sure would be valuable to many solvers is an NLP-specific pre-solver that performs trivial replacements, for example replacing fixed variables with constant values. These simple changes often improve Ipopt's performance quite a bit.
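A minimal sketch of that fixed-variable replacement, assuming a plain JuMP model (the helper name is made up and this is not part of Juniper): variables whose lower and upper bounds coincide are marked as fixed so the NLP solver can treat them as constants; a real presolver would substitute the constant directly into the expressions.

using JuMP

function fix_constant_variables!(model::Model)
    for x in all_variables(model)
        # a variable whose bounds coincide is effectively a constant
        if !is_fixed(x) && has_lower_bound(x) && has_upper_bound(x) &&
           lower_bound(x) == upper_bound(x)
            fix(x, lower_bound(x); force = true)
        end
    end
    return model
end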
I've tried fixing variables in the root node using the rule: if x < 0.8 and x is binary, then fix x = 0. This helps for some blending problems but is actually worse in some cases, which I don't understand yet. I'll try to figure it out over the next few days.
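A rough sketch of that root-node heuristic, assuming the relaxation values are already available as a dictionary (hypothetical helper, not Juniper's internal code): every binary variable whose relaxation value falls below the threshold is fixed to 0 before branching.

using JuMP

function fix_small_binaries!(model::Model, relaxation_values::Dict{VariableRef,Float64}; threshold = 0.8)
    for (x, val) in relaxation_values
        if is_binary(x) && val < threshold
            fix(x, 0.0; force = true)  # heuristic only: this can cut off the true optimum
        end
    end
    return model
end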
On this topic, a nice example was provided in https://github.com/lanl-ansi/Juniper.jl/issues/156
Removing obviously redundant constraints is a needed pre-solve feature.
Example working with Julia v1.7, JuMP v0.23, Juniper v0.9, Ipopt v1.0
using JuMP, Ipopt, HiGHS, Juniper
import LinearAlgebra: dot
m = Model(Juniper.Optimizer)
set_optimizer_attribute(m, "nl_solver", optimizer_with_attributes(Ipopt.Optimizer, "print_level"=>0))
set_optimizer_attribute(m, "mip_solver", optimizer_with_attributes(HiGHS.Optimizer, "output_flag"=>false))
v = [10,20,12,23,42]
w = [12,45,12,22,21]
@variable(m, x[1:5], Bin)
@objective(m, Max, dot(v,x))
@constraint(m, sum(w[i]*x[i]^2 for i=1:5) <= 45)
@constraint(m, x[1] + x[2] == 1)
# adding a sufficient number of redundant constraints breaks the solver
for i = 1:10
    @constraint(m, x[1] + x[2] == 1)
end
optimize!(m)
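As an illustration of the redundant-constraint pre-solve requested above, here is a minimal sketch (not a Juniper feature; the helper name is made up) that deletes exact duplicates of linear equality constraints, keeping only the first occurrence of each.

using JuMP
import MathOptInterface as MOI

function drop_duplicate_equalities!(model::Model)
    seen = Tuple{AffExpr,Float64}[]
    for con in all_constraints(model, AffExpr, MOI.EqualTo{Float64})
        c = constraint_object(con)
        if any(JuMP.isequal_canonical(c.func, f) && c.set.value == rhs for (f, rhs) in seen)
            delete(model, con)  # exact duplicate of an earlier constraint
        else
            push!(seen, (c.func, c.set.value))
        end
    end
    return model
end

Calling drop_duplicate_equalities!(m) before optimize!(m) would strip the ten repeated constraints in the example above while keeping the original one.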