skglm
Fast and modular sklearn replacement for generalized linear models
This way one could chain calls, à la sklearn's `model().fit(X, y)`: `compiled_clone(Quadratic()).initialize(X, y)`. Clearly not a priority, but it may be nice syntactic sugar (see the sketch below). Thoughts @Badr-MOUFAD?
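A rough sketch of what that sugar could look like, assuming `initialize` simply returns `self`; the `Quadratic` class below is a toy stand-in for illustration, not skglm's actual datafit:

```python
import numpy as np

# Toy stand-in for a datafit: the only change needed for chaining is that
# initialize() returns self instead of None (this is the proposal, not
# current skglm behaviour).
class Quadratic:
    def initialize(self, X, y):
        self.Xty = X.T @ y    # pre-computed quantity used later by the solver
        return self           # returning self enables chaining

X, y = np.random.randn(10, 3), np.random.randn(10)
datafit = Quadratic().initialize(X, y)   # chained call, à la sklearn's model().fit(X, y)
```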
Given a penalty $f: \mathbb{R}^n \rightarrow \mathbb{R}$ that is already implemented in the package, it is possible to endow it with L2 regularization to get $\Omega = f + \frac{\mu}{2} \lVert \cdot \rVert_2^2$...
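A minimal sketch of such a wrapper; the class name and the `value` / `prox_1d(value, stepsize, j)` methods are assumptions loosely mirroring a penalty interface, not skglm's actual API. The prox of the sum follows from the identity $\mathrm{prox}_{\gamma(f + \frac{\mu}{2}\lVert\cdot\rVert_2^2)}(x) = \mathrm{prox}_{\frac{\gamma}{1+\gamma\mu} f}\big(\frac{x}{1+\gamma\mu}\big)$:

```python
import numpy as np

class L2Regularized:
    """Wrap an existing penalty f with an extra (mu / 2) * ||w||_2^2 term."""

    def __init__(self, base_penalty, mu):
        self.base_penalty = base_penalty
        self.mu = mu

    def value(self, w):
        # Omega(w) = f(w) + mu / 2 * ||w||_2^2
        return self.base_penalty.value(w) + 0.5 * self.mu * np.sum(w ** 2)

    def prox_1d(self, value, stepsize, j):
        # prox of the sum, reduced to the prox of f with rescaled input and stepsize
        scale = 1.0 + stepsize * self.mu
        return self.base_penalty.prox_1d(value / scale, stepsize / scale, j)
```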
So far nothing prevents users from changing datafit/penalty/solver attributes and refitting a model. This might be problematic, especially for:
- datafits that require initialization prior to usage
- solvers supporting...
Follow-up of #149. This implements the AndersonCD solver using JAX on GPU. It proceeds as follows (a rough sketch of the building block is given after the list):
- [x] CD solver using JAX
- [x] Working sets
- [x] Anderson acceleration
- [...
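A rough JAX sketch of the basic building block, one coordinate-descent epoch for the Lasso; this is an illustration of the approach under assumed names, not the PR's code, and working sets and Anderson acceleration are left out:

```python
import jax
import jax.numpy as jnp

def st(x, tau):                      # soft-thresholding operator
    return jnp.sign(x) * jnp.maximum(jnp.abs(x) - tau, 0.0)

@jax.jit
def cd_epoch(w, X, y, alpha, lipschitz):
    n_samples = X.shape[0]

    def update_j(j, state):
        w, Xw = state
        grad_j = X[:, j] @ (Xw - y) / n_samples
        w_j_new = st(w[j] - grad_j / lipschitz[j], alpha / lipschitz[j])
        Xw = Xw + (w_j_new - w[j]) * X[:, j]      # keep the residual up to date
        return w.at[j].set(w_j_new), Xw

    Xw = X @ w
    w, _ = jax.lax.fori_loop(0, w.shape[0], update_j, (w, Xw))
    return w

X = jax.random.normal(jax.random.PRNGKey(0), (50, 20))
y = jax.random.normal(jax.random.PRNGKey(1), (50,))
lipschitz = jnp.sum(X ** 2, axis=0) / X.shape[0]   # per-coordinate Lipschitz constants
w = jnp.zeros(20)
for _ in range(100):
    w = cd_epoch(w, X, y, alpha=0.1, lipschitz=lipschitz)
```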
Most non-expert users might not be familiar with the rescaling of `alpha` by `n_samples`. Issuing a warning when `alpha > alpha_max` and giving hints to the user (for instance by...
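A possible shape for such a warning, as a sketch: the helper name and message wording are assumptions, and `alpha_max` below is the Lasso formula with the `n_samples` scaling:

```python
import warnings
import numpy as np

def check_alpha(X, y, alpha):
    # smallest alpha for which the Lasso solution is identically 0,
    # with the datafit scaled by n_samples as in skglm
    alpha_max = np.max(np.abs(X.T @ y)) / X.shape[0]
    if alpha > alpha_max:
        warnings.warn(
            f"alpha={alpha:.3e} is larger than alpha_max={alpha_max:.3e}: "
            "the solution is all zeros. Remember that alpha is rescaled by "
            "n_samples; you may want a smaller value."
        )
    return alpha_max
```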
I discovered that we can pre-compile numba functions, without having to run them, by specifying the function's signature in `@njit`: ```python import time import numpy as np from...
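For reference, a self-contained sketch of the idea (the functions below are illustrative, not the ones from the issue): with an explicit signature, `@njit` compiles eagerly at decoration time, so the first call no longer pays the JIT cost.

```python
import time
import numpy as np
from numba import njit

# eager: the signature triggers compilation when the decorator is applied
@njit("float64(float64[:])")
def sum_squares_eager(x):
    s = 0.0
    for i in range(x.shape[0]):
        s += x[i] * x[i]
    return s

# lazy: compilation only happens on the first call
@njit
def sum_squares_lazy(x):
    s = 0.0
    for i in range(x.shape[0]):
        s += x[i] * x[i]
    return s

x = np.random.randn(1000)

t0 = time.perf_counter()
sum_squares_eager(x)      # already compiled
t1 = time.perf_counter()
sum_squares_lazy(x)       # pays the compilation cost here
t2 = time.perf_counter()

print(f"first call, eager: {t1 - t0:.4f} s | lazy: {t2 - t1:.4f} s")
```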
Validation of parameters at fit time seems to be an issue: parameters which should be set by the parent class of, e.g., MCPRegressor are not returned by `get_params()` and so...
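A small illustration of the mechanism with toy estimators (not the actual skglm classes): sklearn's `get_params()` introspects the child's `__init__` signature, so parameters set only by the parent are invisible to it.

```python
from sklearn.base import BaseEstimator

class ParentEstimator(BaseEstimator):
    def __init__(self, alpha=1.0, max_iter=100):
        self.alpha = alpha
        self.max_iter = max_iter

class ChildEstimator(ParentEstimator):
    def __init__(self, gamma=3.0):
        super().__init__()     # alpha and max_iter are set, but not exposed
        self.gamma = gamma

# prints only {'gamma': 3.0}: alpha and max_iter are missing, so they escape
# any validation that relies on get_params()
print(ChildEstimator().get_params())
```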
Article by Damek Davis and Wotao Yin to solve $\min f + g + h$, all three convex (generalizes Forward-Backward and Douglas-Rachford); it also apparently works for non-convex...
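For reference, a rough sketch of the Davis-Yin three-operator splitting iteration on a toy problem; the choices of $f$, $g$, $h$ below are illustrative, not from the article:

```python
import numpy as np

# Davis-Yin splitting for min_x f(x) + g(x) + h(x), with h smooth and f, g proximable.
# Toy instance: h = 0.5 * ||Ax - b||^2, g = lam * ||x||_1, f = indicator of {x >= 0}.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)
lam = 0.1

def grad_h(x):
    return A.T @ (A @ x - b)

def prox_g(x, gamma):                     # soft-thresholding
    return np.sign(x) * np.maximum(np.abs(x) - gamma * lam, 0.0)

def prox_f(x, gamma):                     # projection onto the nonnegative orthant
    return np.maximum(x, 0.0)

gamma = 1.0 / np.linalg.norm(A, 2) ** 2   # step size <= 1 / Lipschitz constant of grad h
z = np.zeros(A.shape[1])
for _ in range(500):
    x_g = prox_g(z, gamma)
    x_f = prox_f(2 * x_g - z - gamma * grad_h(x_g), gamma)
    z = z + (x_f - x_g)                   # relaxation parameter set to 1

print("nonnegative sparse solution, first 5 coords:", np.round(x_f[:5], 3))
```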