ParametricOptInterface.jl
[WIP] POI + DiffOpt = S2
@andrewrosemberg motivated me.
I love how well layers can play with each other.
This will not be merged (as part of POI src) as it does not make sense to add DiffOpt as a dep for POI.
This should be either:
1. An extension here (POI)
2. A separate package
3. An extension at DiffOpt

Semantically, option 3 makes a lot of sense, but this code uses too much of POI's internals. Option 2 has a similar issue: we would have to pin a POI version. Currently, I prefer option 1.
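For reference, option 1 could be packaged as a Julia (≥ 1.9) weak-dependency extension. A minimal sketch, with a hypothetical extension name, assuming the glue code currently in `src/diff.jl` moves into the extension module:

```julia
# Hypothetical layout for option 1 (the extension name is a placeholder).
# Project.toml would gain something like:
#
#   [weakdeps]
#   DiffOpt = "<DiffOpt UUID>"
#
#   [extensions]
#   ParametricOptInterfaceDiffOptExt = "DiffOpt"

module ParametricOptInterfaceDiffOptExt

import ParametricOptInterface as POI
import DiffOpt

# The forward_differentiate!/reverse_differentiate! glue for POI.Optimizer
# (currently in src/diff.jl) would be defined here.

end # module
```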
Still requires:
- [x] Reverse mode
- [x] More tests (objectives, vector affine, more constraints...)
- [ ] Deal with cached data (a reset_input_sensitivities in DiffOpt would be handy)
- [x] Deal with type stability (add barriers)
- [ ] Move it to the right place
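Regarding the type-stability item above: the "barriers" are presumably Julia function barriers. A minimal generic sketch of the pattern (not POI code, just an illustration):

```julia
# Function-barrier pattern: pull data out of an abstractly-typed container,
# then hand it to an inner function that Julia can specialize on the
# concrete type, keeping the hot loop type-stable.
unstable_container() = Any[1.0, 2.0, 3.0]   # eltype Any => type-unstable

# Inner kernel: specialized on Vector{Float64}, so its loop is type-stable.
_kernel(v::Vector{Float64}) = sum(v)

function barrier_sum(data)
    v = Float64[Float64(x) for x in data]   # concretize once
    return _kernel(v)                       # the barrier call
end

barrier_sum(unstable_container())  # == 6.0
```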
cc @matbesancon, @blegat
Codecov Report
Attention: Patch coverage is 94.81481% with 14 lines in your changes missing coverage. Please review.
Project coverage is 95.31%. Comparing base (4ec565a) to head (1493fac). Report is 10 commits behind head on master.
| Files with missing lines | Patch % | Lines |
|---|---|---|
| src/diff.jl | 94.77% | 14 Missing :warning: |
Additional details and impacted files
```
@@            Coverage Diff             @@
##           master     #143      +/-   ##
==========================================
+ Coverage   94.96%   95.31%   +0.35%
==========================================
  Files           5        6       +1
  Lines        1032     1302     +270
==========================================
+ Hits          980     1241     +261
- Misses         52       61       +9
```
I am implementing a test that I believe should work:
```julia
using JuMP, LinearAlgebra
using DiffOpt, SCS
import ParametricOptInterface as POI

function test_diff_projection()
    num_A = 2
    ##### SecondOrderCone #####
    _x_hat = rand(num_A)
    μ = rand(num_A) * 10
    Σ_12 = rand(num_A, num_A)
    Σ = Σ_12 * Σ_12' + 0.1 * I
    γ = 1.0
    model = direct_model(POI.Optimizer(DiffOpt.diff_optimizer(SCS.Optimizer)))
    set_silent(model)
    @variable(model, x[1:num_A])
    @variable(model, x_hat[1:num_A] in MOI.Parameter.(_x_hat))
    @variable(model, norm_2)
    # (x - μ)^T Σ^-1 (x - μ) <= γ
    @constraint(model, (x - μ)' * inv(Σ) * (x - μ) <= γ)
    # norm_2 >= ||x - x_hat||_2
    @constraint(model, [norm_2; x - x_hat] in SecondOrderCone())
    @objective(model, Min, norm_2)
    optimize!(model)
    MOI.set.(model, POI.ForwardParameter(), x_hat, ones(num_A))
    DiffOpt.forward_differentiate!(model)  # ERROR
    # @test TBD
    return
end
```
But I am getting an error at the `forward_differentiate!` step:
```
ERROR: MethodError: no method matching throw_set_error_fallback(::MathOptInterface.Bridges.LazyBridgeOptimizer{DiffOpt.ConicProgram.Model}, ::DiffOpt.ObjectiveFunctionAttribute{DiffOpt.ObjectiveDualStart, MathOptInterface.VariableIndex}, ::MathOptInterface.Bridges.Objective.FunctionConversionBridge{Float64, MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.VariableIndex}, ::Float64)
Stacktrace:
 [1] set(::MathOptInterface.Bridges.LazyBridgeOptimizer{DiffOpt.ConicProgram.Model}, ::DiffOpt.ObjectiveFunctionAttribute{DiffOpt.ObjectiveDualStart, MathOptInterface.VariableIndex}, ::MathOptInterface.Bridges.Objective.FunctionConversionBridge{Float64, MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.VariableIndex}, ::Float64)
   @ MathOptInterface ~/.julia/packages/MathOptInterface/IiXiU/src/attributes.jl:550
 [2] set(b::MathOptInterface.Bridges.LazyBridgeOptimizer{DiffOpt.ConicProgram.Model}, attr::DiffOpt.ObjectiveFunctionAttribute{DiffOpt.ObjectiveDualStart, MathOptInterface.VariableIndex}, value::Float64)
   @ DiffOpt ~/.julia/packages/DiffOpt/6Xx9R/src/copy_dual.jl:90
 [3] set(b::MathOptInterface.Bridges.LazyBridgeOptimizer{DiffOpt.ConicProgram.Model}, attr::DiffOpt.ObjectiveDualStart, value::Float64)
   @ DiffOpt ~/.julia/packages/DiffOpt/6Xx9R/src/copy_dual.jl:114
 [4] _copy_dual(dest::MathOptInterface.Bridges.LazyBridgeOptimizer{DiffOpt.ConicProgram.Model}, src::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.Bridges.LazyBridgeOptimizer{MathOptInterface.Utilities.CachingOptimizer{SCS.Optimizer, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}}, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}, index_map::MathOptInterface.Utilities.IndexMap)
   @ DiffOpt ~/.julia/packages/DiffOpt/6Xx9R/src/copy_dual.jl:176
 [5] _diff(model::DiffOpt.Optimizer{MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.Bridges.LazyBridgeOptimizer{MathOptInterface.Utilities.CachingOptimizer{SCS.Optimizer, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}}, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}})
   @ DiffOpt ~/.julia/packages/DiffOpt/6Xx9R/src/moi_wrapper.jl:600
 [6] forward_differentiate!(model::DiffOpt.Optimizer{MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.Bridges.LazyBridgeOptimizer{MathOptInterface.Utilities.CachingOptimizer{SCS.Optimizer, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}}, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}})
   @ DiffOpt ~/.julia/packages/DiffOpt/6Xx9R/src/moi_wrapper.jl:525
 [7] forward_differentiate!(model::ParametricOptInterface.Optimizer{Float64, DiffOpt.Optimizer{MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.Bridges.LazyBridgeOptimizer{MathOptInterface.Utilities.CachingOptimizer{SCS.Optimizer, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}}, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}}})
   @ ParametricOptInterface ~/Workspace/ParametricOptInterface.jl/src/diff.jl:222
 [8] forward_differentiate!(model::Model)
   @ DiffOpt ~/.julia/packages/DiffOpt/6Xx9R/src/jump_moi_overloads.jl:307
 [9] top-level scope
   @ REPL[48]:1
```
Edit: I imagine this is a missing bridge, right?
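Since the failing call involves `FunctionConversionBridge{Float64, ScalarAffineFunction{Float64}, VariableIndex}`, one untested guess at a workaround is to avoid the bare `VariableIndex` objective in the model above by writing it as an affine expression (a sketch, not a verified fix):

```julia
# Untested workaround sketch: force a ScalarAffineFunction objective so
# no objective-function conversion bridge is needed when DiffOpt copies
# the objective dual start.
@objective(model, Min, 1.0 * norm_2)
```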
I encountered an error while working with DiffOpt and POI. To demonstrate the problem, I created a minimal example. It builds a simple problem using an explicitly indexed constraint (`con[i=1:2]`), then applies `reverse_differentiate!`, and it works:
```julia
using JuMP, DiffOpt, HiGHS
import ParametricOptInterface as POI

b = [1.0, 2.0]
m = Model(() -> POI.Optimizer(DiffOpt.diff_optimizer(HiGHS.Optimizer)))
@variable(m, x[1:2] >= 0)
@variable(m, c[1:2] in MOI.Parameter.(b))
@constraint(m, con[i=1:2], x[i] <= c[i])
@objective(m, Max, sum(x))
optimize!(m)
MOI.set(m, DiffOpt.ReverseVariablePrimal(), m[:x][1], 1.0)
DiffOpt.reverse_differentiate!(m)
MOI.get(m, POI.ReverseParameter(), m[:c][1])
# output: 1.0
```
but when I declare the constraint in a non-indexed (vectorized) fashion, like this:

```julia
@constraint(m, con, x <= c)
```

I get an error when calling `DiffOpt.reverse_differentiate!(m)`:

```
ERROR: ArgumentError: Bridge of type `ScalarizeBridge` does not support accessing the attribute `DiffOpt.ReverseConstraintFunction()`.
```
The error still happens if the constraint is declared without a name (`con` in this case).
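A possible workaround sketch for the vectorized case (my untested guess): broadcasting the comparison creates one scalar constraint per element, so `ScalarizeBridge` never enters the picture:

```julia
# Untested sketch: `.<=` builds two scalar affine constraints instead of a
# single vector constraint in MOI.Nonpositives, avoiding ScalarizeBridge.
@constraint(m, con_bc, x .<= c)   # con_bc is a placeholder name
```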