InferOpt.jl
Combinatorial optimization layers for machine learning pipelines
We should be able to provide keyword arguments as inputs to the `SPOPlusLoss`, in the same way as for the `FenchelYoungLoss`. This also needs to be implemented for `SSVMLoss`.
It would be nice to be compatible with Julia's Long Term Support (LTS) version 1.6. I think the main obstacle at the moment is the use of the destructuring syntax...
Currently, `InferOpt` fully supports predictors of the form $\arg\max_y \theta^\top y$ in combinatorial layers. It would be interesting to allow the more general form $$\arg\max_y \theta^\top g(y) + h(y)$$ For...
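One way to support the more general form is a wrapper around the existing linear maximizer. The sketch below is purely hypothetical: the struct name, fields, and `objective_value` helper are illustrative and not part of the current API.

```julia
using LinearAlgebra: dot

# Hypothetical sketch: wrap a linear maximizer so the layer optimizes
# θ' * g(y) + h(y) instead of the purely linear objective θ' * y.
struct GeneralizedMaximizer{F,G,H}
    maximizer::F  # solves the underlying combinatorial problem
    g::G          # feature map g(y) applied to a solution
    h::H          # scalar offset term h(y)
end

# Objective value of a candidate solution y for a given cost vector θ
objective_value(gm::GeneralizedMaximizer, θ, y) = dot(θ, gm.g(y)) + gm.h(y)
```

With `g = identity` and `h = y -> 0`, this reduces to the linear case already supported.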
At the moment, InferOpt uses ChainRulesCore to define `rrule`s, but it would be nice to also be compatible with ForwardDiff dual numbers.
Show an example in the docs that uses e.g. LBFGS instead of SGD.
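Such an example could rely on Optim.jl (an assumed extra dependency, not required by InferOpt itself). The loss below is a trivial stand-in for a real pipeline loss over flattened parameters; it only shows the LBFGS call pattern.

```julia
using Optim  # assumed extra dependency for this sketch

# Stand-in for a pipeline loss over a flat parameter vector w (illustrative only)
loss(w) = sum(abs2, w .- 3)

# LBFGS with forward-mode autodiff instead of an SGD-style training loop
result = optimize(loss, zeros(2), LBFGS(); autodiff = :forward)
w_opt = Optim.minimizer(result)  # converges to [3.0, 3.0]
```

In a real docs example, `loss` would close over the training data and the combinatorial layer, with parameters flattened to a vector.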
Add an option to create parallelized versions of `PerturbedAdditive` and `PerturbedMultiplicative`, as a keyword argument `is_parallel` in their respective constructors. See #29.
Julia 1.8 is no longer a pre-release
See #25. - [x] `PerturbedAdditive` - [x] `PerturbedMultiplicative` - [ ] `SPOPlusLoss` - [ ] `StructuredSVM` - [ ] `RegularizedGeneric` - [ ] More tests
It would be interesting to be able to run perturbed maximizers in parallel, especially when `nb_samples` is high and/or the combinatorial algorithm has a long runtime. One option would be...
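A minimal multithreaded sketch of the idea, assuming an additive perturbation: the function name and keyword arguments are illustrative, not the package's API. On Julia ≥ 1.7 the default RNG is task-local, so calling `randn` inside `Threads.@threads` is safe.

```julia
# Sketch: average the maximizer's outputs over perturbed costs, one thread per sample.
function parallel_perturbed_mean(maximizer, θ; nb_samples = 10, ε = 1.0)
    outputs = Vector{Any}(undef, nb_samples)
    Threads.@threads for s in 1:nb_samples
        Z = randn(length(θ))                 # each task draws its own Gaussian noise
        outputs[s] = maximizer(θ .+ ε .* Z)  # solve the perturbed problem
    end
    return sum(outputs) / nb_samples
end
```

The speedup depends on the maximizer releasing the CPU (or being pure Julia), and on starting Julia with several threads (`julia -t auto`).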