glum
High performance Python GLMs with all the features!
The daily unit tests failed. See https://github.com/Quantco/glum/actions/runs/8578861406 for details.
The class does not have a `max_alpha` argument, yet the `min_alpha` description states: "Minimum alpha to estimate the model with. The grid will then be created over `[max_alpha, min_alpha]`." If...
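For context on what the documented grid would look like, here is a minimal sketch of building a geometric penalty grid between two endpoints. Note that `make_alpha_grid`, `max_alpha`, and `n_alphas` are hypothetical names used only for illustration; they are not claimed to be glum's actual API.

```python
import numpy as np

# Hypothetical illustration of a geometric alpha grid running from the
# strongest penalty down to `min_alpha`, as the docstring describes.
# `make_alpha_grid`, `max_alpha`, and `n_alphas` are assumed names,
# not actual glum parameters.
def make_alpha_grid(max_alpha, min_alpha, n_alphas=10):
    """Geometric grid of penalties from strongest to weakest."""
    return np.geomspace(max_alpha, min_alpha, num=n_alphas)

grid = make_alpha_grid(1.0, 1e-4, n_alphas=5)
# grid decreases geometrically: 1.0, 1e-1, 1e-2, 1e-3, 1e-4
```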
Using glum and joblib with Ray, I ran multiple models and found that each thread could only use one core, and if I set n_jobs=1, I could only use 50% of all...
I.e. a link to http://statmills.com/2023-11-20-Penalized_Splines_Using_glum/.
When modeling with `glum` using a dataset containing both categorical and numeric features, I want to manually set base levels for the categorical fields. This can be done in `statsmodels`...
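One common workaround, sketched below, is to control the base level on the pandas side rather than in the estimator: many GLM libraries that consume pandas Categoricals drop the first listed category as the reference level, so reordering the categories changes the base level before the data ever reaches `glum`. Whether glum follows this first-category convention is an assumption here, not a confirmed detail.

```python
import pandas as pd

# Sketch, assuming the first category of a pandas Categorical is
# treated as the base (dropped) level, as is common for GLM libraries
# that consume categorical dtypes directly.
df = pd.DataFrame({"region": ["north", "south", "east", "south"]})

# Make "south" the base level by listing it first.
df["region"] = pd.Categorical(
    df["region"], categories=["south", "north", "east"]
)

base_level = df["region"].cat.categories[0]
# base_level is now "south"
```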
This is set [here](https://github.com/Quantco/glum/blob/2649d5b4abc8f9ae56a840975086f65689c8cad0/src/glum/_solvers.py#L424). Reducing this to a more reasonable number such as 1000 results in much faster convergence and fewer warnings:

```
/cluster/customapps/biomed/grlab/users/lmalte/mambaforge/envs/icufm/lib/python3.10/site-packages/glum/_solvers.py:58: ConvergenceWarning: Coordinate descent did not converge....
```
xref https://github.com/Quantco/glum/issues/843
We're having quite a few problems with optimization in `float32`. For small batches, these go away if we `.astype(np.float64)` our data before calling `glum.GeneralizedLinearRegressor.fit`. This also makes the algorithm much...
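The cast described above is a one-liner; the sketch below shows it along with the precision gap that motivates it. `float32` carries roughly 7 significant decimal digits versus roughly 16 for `float64`, which matters when an optimizer checks tight gradient-norm tolerances. The random design matrix is illustrative only.

```python
import numpy as np

# Minimal sketch of the workaround from the report above: cast the
# data to float64 before calling .fit(). The design matrix here is
# illustrative, not from the original issue.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5)).astype(np.float32)

X64 = X.astype(np.float64)  # the one-line cast described above

# float32 has a machine epsilon about nine orders of magnitude larger
# than float64's, so residual/gradient computations lose precision.
eps32 = np.finfo(np.float32).eps
eps64 = np.finfo(np.float64).eps
```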
`glum.GeneralizedLinearRegressor(verbose=1, alpha_search=True).fit(X, y)` prints

```
Iteration 0:      | | 0/? [s/it, gradient norm=7.508788257837296e-09]
Iteration 1:  75%|████████████████████████████████████████████▎ | 1.5/2.0 [0.13s/it, gradient norm=0.0003172657161485404]
Iteration 1:  51%|█████████████████████████████▌ | 1.02/2.0 [0.13s/it, gradient norm=0.0009597403695806861]
Iteration...
```
I am using `glum` on an `arm64` machine. Both after installing from `conda` and `pip` (NB: why are there no arm64 linux wheels? Outputs below are from `conda` installation) and...