StackExchangeCodes
ridge/lasso regression with non-negative coefficients when the data is sparse one-hot data
Great code, but do you have code for ridge/lasso regression with non-negative coefficients when the data is sparse one-hot data and the data matrix is very big? For example, the data matrix is
1 0 0 0 1 0 1 0 1 0 0 0 0 1 1 etc
For example, using a FISTA solver?
Please open a question on Signal Processing Stack Exchange, link it here and I will answer it.
Please note that the code isn't for commercial use.
Sure, done. The link is https://dsp.stackexchange.com/questions/77048/plain-python-numpy-code-for-ridge-lasso-regression-with-non-negative-coefficient
The question in the link isn't what you asked above.
If you're asking how to solve $ \frac{1}{2} {\left\| A x - b \right\|}_{2}^{2} + \lambda {\left\| x \right\|}_{1} $ subject to $ {x}_{i} \geq 0 $, then I can help; it is a valid question.
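For reference, here is a minimal sketch of a FISTA iteration for this non-negative LASSO problem (the function name `fista_nonneg_lasso`, the iteration count, and the example data are illustrative assumptions, not code from this repository). It relies on the fact that, under the non-negativity constraint, the proximal step for $ \lambda {\left\| x \right\|}_{1} $ reduces to a one-sided soft threshold, and it keeps the one-hot data matrix in SciPy sparse format so the matrix-vector products stay cheap:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla


def fista_nonneg_lasso(A, b, lam, num_iter=500):
    """Minimize 0.5 * ||A x - b||_2^2 + lam * ||x||_1 subject to x >= 0 (FISTA sketch)."""
    # Step size is 1 / L, where L is the Lipschitz constant of the smooth term,
    # i.e. the largest singular value of A, squared.
    L = spla.svds(A, k=1, return_singular_vectors=False)[0] ** 2
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(num_iter):
        grad = A.T @ (A @ y - b)  # sparse mat-vec products, cheap for one-hot A
        # Prox of lam * ||.||_1 plus the non-negativity indicator:
        # a one-sided soft threshold.
        x_new = np.maximum(y - (grad + lam) / L, 0.0)
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x


# Illustrative use with a random sparse 0/1 design matrix (made-up data).
rng = np.random.default_rng(0)
A = sp.random(1000, 200, density=0.02, format="csr", random_state=0)
A.data[:] = 1.0  # force 0/1 entries, mimicking one-hot data
x_true = np.maximum(rng.standard_normal(200), 0.0)
b = A @ x_true + 0.01 * rng.standard_normal(1000)
x_hat = fista_nonneg_lasso(A, b, lam=0.1)
```

The only problem-specific pieces are the Lipschitz constant estimate and the one-sided soft threshold; the rest is the standard FISTA momentum update.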
Yes, this optimization is part of the question. Is there anything specific to it when the data is sparse and the values are 0s and 1s (one-hot data)?
Do you want me to create a new question exactly about $ \frac{1}{2} {\left\| A x - b \right\|}_{2}^{2} + \lambda {\left\| x \right\|}_{1} $ subject to $ {x}_{i} \geq 0 $?
In any case, I created a new question: https://dsp.stackexchange.com/questions/78077/plain-python-code-for-ridge-lasso-multivariate-regression-for-non-negative-coeff
Thanks