DiffOpt.jl
Differentiating convex optimization programs w.r.t. program parameters
DiffOpt is a package for differentiating convex optimization programs with respect to the program parameters. It currently supports linear, quadratic, and conic programs. Powered by JuMP.jl, DiffOpt allows creating a differentiable optimization model from many existing optimizers. Refer to the documentation for examples.
Installation
DiffOpt can be installed via the Julia package manager:
julia> ]
(v1.7) pkg> add DiffOpt
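Equivalently, from a script or without entering the interactive package mode:
import Pkg
Pkg.add("DiffOpt")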
Example
- Create a model using the wrapper.
using JuMP
import DiffOpt
import HiGHS
model = JuMP.Model(() -> DiffOpt.diff_optimizer(HiGHS.Optimizer))
- Define your model and solve it.
@variable(model, x)
@constraint(model, cons, x >= 3)
@objective(model, Min, 2x)
optimize!(model) # solve
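The solved model can be queried like any other JuMP model; for this program the optimum sits on the constraint boundary:
value(x)                # 3.0
objective_value(model)  # 6.0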
- Choose the problem parameters to differentiate with respect to, and set their perturbations.
# MOI is exported by JuMP as an alias for MathOptInterface.
MOI.set.(model, DiffOpt.ReverseVariablePrimal(), x, 1.0) # set perturbations / gradient inputs
- Differentiate the model (the primal and dual solutions, specifically) and fetch the gradients.
DiffOpt.reverse_differentiate!(model) # differentiate
grad_exp = MOI.get(model, DiffOpt.ReverseConstraintFunction(), cons) # -3x - 1
JuMP.constant(grad_exp) # -1
JuMP.coefficient(grad_exp, x) # -3
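To see why this is the right answer, write the active constraint as a*x + b >= 0 (here a = 1, b = -3). The optimum is x* = -b/a, so dx*/da = b/a^2 = -3 and dx*/db = -1/a = -1, which are exactly the coefficient and constant of grad_exp.
DiffOpt also supports forward-mode differentiation. The following is a minimal sketch using DiffOpt's forward-mode attributes (ForwardConstraintFunction, forward_differentiate!, ForwardVariablePrimal) on the same model; the expected output follows from the calculation above, and it assumes the constant perturbation 0.0 * x + 1.0 is accepted as a JuMP expression here:
MOI.set(model, DiffOpt.ForwardConstraintFunction(), cons, 0.0 * x + 1.0) # perturb the constant of cons
DiffOpt.forward_differentiate!(model) # differentiate
MOI.get(model, DiffOpt.ForwardVariablePrimal(), x) # -1.0: perturbing to x - 3 + δ >= 0 moves the optimum to x* = 3 - δ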
Note
- DiffOpt began as a NumFOCUS-sponsored Google Summer of Code (2020) project.