OptimizationProblems.jl
Benchopt suite for machine learning
It would be nice to support the Benchopt problem suite, which is also available in Python:
- [ ] Ordinary Least Squares
- [ ] Non-Negative Least Squares
- [ ] LASSO: L1-Regularized Least Squares
- [ ] LASSO Path
- [ ] Elastic Net
- [ ] MCP (Minimax Concave Penalty)
- [ ] L2-Regularized Logistic Regression
- [ ] L1-Regularized Logistic Regression
- [ ] L2-Regularized Huber Regression
- [ ] L1-Regularized Quantile Regression
- [ ] Linear SVM for Binary Classification
- [ ] Linear ICA
- [ ] Approximate Joint Diagonalization
- [ ] 1D Total Variation Denoising
- [ ] 2D Total Variation Denoising
- [ ] ResNet Classification
- [ ] Bilevel Optimization
We don’t have any infrastructure for bilevel problems.
Problems with (smooth or nonsmooth) regularizers should go in RegularizedProblems.jl. That includes TV problems, LASSO, etc.
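For concreteness, LASSO has exactly this smooth-plus-nonsmooth split: a smooth least-squares term plus a nonsmooth $\ell_1$ regularizer, which is the kind of structure I'd expect RegularizedProblems.jl to handle:

$$
\min_{x \in \mathbb{R}^n} \ \tfrac{1}{2} \|Ax - b\|_2^2 + \lambda \|x\|_1, \qquad \lambda > 0.
$$

The TV denoising problems are similar, with the $\ell_1$ norm applied to finite differences of $x$ instead of to $x$ itself.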
Least-squares problems should go in NLSProblems.jl.
For NLS, we don't have real JuMP NLS models here (those live in NLSProblems.jl), but we do have ADNLSModel-based versions, e.g. https://github.com/JuliaSmoothOptimizers/OptimizationProblems.jl/blob/main/src/ADNLPProblems/arglina.jl.
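As a rough illustration (not following the repository's exact conventions, and with synthetic placeholder data `A`, `b` and the name `"ols"`), Ordinary Least Squares as an `ADNLSModel` in the spirit of arglina.jl could look like:

```julia
# Minimal sketch: Ordinary Least Squares as an ADNLSModel, in the spirit of
# src/ADNLPProblems/arglina.jl. The data A, b and the problem name are
# placeholders, not conventions from this repository.
using ADNLPModels, NLPModels, LinearAlgebra, Random

Random.seed!(0)
m, n = 20, 5
A = randn(m, n)
b = randn(m)

F(x) = A * x - b                          # residuals; objective is ½‖F(x)‖²
x0 = zeros(n)
ols = ADNLSModel(F, x0, m, name = "ols")

obj(ols, x0)        # ½‖b‖² at the origin
residual(ols, x0)   # equals -b
```

The other unregularized Benchopt least-squares problems (e.g. Non-Negative Least Squares, with bound constraints on `x0`) could presumably follow the same pattern.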