skglm
ENH add support for L1 + L2 regularization in SparseLogisticRegression
Currently we only support L1 in logreg: https://contrib.scikit-learn.org/skglm/generated/skglm.SparseLogisticRegression.html
We could introduce a second regularization parameter corresponding to a squared L2 penalty, so that the new estimator relates to the current L1-only one the way ElasticNet relates to Lasso.
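To make the proposal concrete, here is a minimal sketch of the elastic-net penalty value and its proximal operator in plain Python, using the usual `alpha` / `l1_ratio` parametrization (as in scikit-learn's ElasticNet); the function names are hypothetical, not part of skglm's API:

```python
def enet_penalty(w, alpha, l1_ratio):
    """Elastic-net penalty: alpha * (l1_ratio * ||w||_1 + (1 - l1_ratio)/2 * ||w||_2^2)."""
    return alpha * sum(
        l1_ratio * abs(wj) + 0.5 * (1.0 - l1_ratio) * wj ** 2 for wj in w
    )


def prox_enet(w, alpha, l1_ratio, step):
    """Prox of the elastic-net penalty with step size `step`:
    soft-thresholding (L1 part) followed by a multiplicative
    shrinkage (squared-L2 part)."""
    thresh = step * alpha * l1_ratio
    scale = 1.0 / (1.0 + step * alpha * (1.0 - l1_ratio))
    return [
        scale * max(abs(wj) - thresh, 0.0) * (1.0 if wj >= 0 else -1.0)
        for wj in w
    ]


# Example: with alpha=1, l1_ratio=0.5, step=1, entries below the
# threshold 0.5 are zeroed and the rest are shrunk by 2/3.
print(prox_enet([2.0, -0.5, 0.1], alpha=1.0, l1_ratio=0.5, step=1.0))
```

Since the prox is separable and cheap, plugging such a penalty into the existing coordinate-descent solver should require little beyond what the L1 penalty already does.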
@PascalCarrivain would you give it a try?
Ok, why not.