
Parameter mapping

Thomasillo opened this issue 6 years ago · 0 comments

The following conventions for the objective function (of) in the ElasticNet are used:

sklearn: of = 1/(2N) C + \alpha \mu P_1 + 1/2 \alpha (1-\mu) P_2

McConaghy: of = C + \lambda \rho P_1 + \lambda (1-\rho) P_2

Here, C is the squared two-norm of the residuals, and P_1 and P_2 are the L1 and L2 regularization terms, respectively. McConaghy probably also intends a factor of 1/N in front of C; otherwise the relative weight of the regularization would depend on the number of samples, which doesn't make any sense.

Assuming this factor, the following formulae should be applied when mapping the regularization parameters from sparsereg's interface to sklearn's:

\alpha = \lambda (1 - \rho/2)
\mu = \rho / (2 - \rho)
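
For reference, these follow from matching the two objectives up to an overall scale: multiplying the sklearn objective by 2 gives 1/N C + 2 \alpha \mu P_1 + \alpha (1-\mu) P_2, so comparing term by term with the second form requires 2 \alpha \mu = \lambda \rho and \alpha (1-\mu) = \lambda (1-\rho); solving these two equations for \alpha and \mu yields the mapping above.

Below is a minimal Python sketch of the mapping as a sanity check; the helper name `sparsereg_to_sklearn` and the toy data are illustrative only, not part of either library's API. It converts (\lambda, \rho) to sklearn's (alpha, l1_ratio) and checks numerically that, for the same coefficient vector, the two objective values agree up to the factor of 2:

```python
import numpy as np
from sklearn.linear_model import ElasticNet

def sparsereg_to_sklearn(lam, rho):
    # Hypothetical helper (not part of either library's API): map the
    # (lambda, rho) parameters of the second formulation above to
    # sklearn's (alpha, l1_ratio), assuming the 1/N factor in front of C.
    alpha = lam * (1.0 - rho / 2.0)
    l1_ratio = rho / (2.0 - rho)
    return alpha, l1_ratio

# Toy data, purely for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))
y = X @ np.array([1.0, 0.0, -2.0, 0.5]) + 0.1 * rng.normal(size=50)

lam, rho = 0.3, 0.7
alpha, l1_ratio = sparsereg_to_sklearn(lam, rho)
w = ElasticNet(alpha=alpha, l1_ratio=l1_ratio, fit_intercept=False).fit(X, y).coef_

# Evaluate both objectives at the same coefficient vector w.
N = len(y)
C = np.sum((y - X @ w) ** 2)            # squared two-norm of the residuals
P1, P2 = np.sum(np.abs(w)), np.sum(w ** 2)
sklearn_of = C / (2 * N) + alpha * l1_ratio * P1 + 0.5 * alpha * (1 - l1_ratio) * P2
other_of = C / N + lam * rho * P1 + lam * (1 - rho) * P2
print(np.isclose(2 * sklearn_of, other_of))  # True: identical up to a factor of 2
```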

Thomasillo · Aug 22 '18, 18:08