Add xgboost as engine for linear_reg()
{xgboost} can fit boosted, penalized linear models by setting `booster = "gblinear"`. This would be a great addition to `linear_reg()`, or perhaps there could be a `boost_linear_reg()` function that could also cover `gbm()`.
Making this addition can probably take advantage of the existing `xgb.train()` implementation, since it only requires changing `booster` and then using the `lambda`, `lambda_bias`, and `alpha` parameters consistently with the `penalty` and `mixture` arguments.
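As a rough sketch of the mapping described above (this is not the parsnip implementation; the elastic-net-style split of `penalty` into `lambda` and `alpha` is an assumption about how the arguments might translate):

```r
library(xgboost)

# parsnip-style arguments (hypothetical mapping)
penalty <- 0.1   # total regularization, as in linear_reg(penalty = ...)
mixture <- 0.5   # proportion of L1, as in linear_reg(mixture = ...)

x <- model.matrix(mpg ~ . - 1, data = mtcars)
y <- mtcars$mpg

fit <- xgb.train(
  params = list(
    booster   = "gblinear",
    objective = "reg:squarederror",
    lambda    = penalty * (1 - mixture),  # L2 component
    alpha     = penalty * mixture         # L1 component
  ),
  data    = xgb.DMatrix(x, label = y),
  nrounds = 50
)
```

With `booster = "gblinear"` each boosting round updates a single set of linear coefficients, so the fitted model stays linear even though it is trained by boosting.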
This also raises the question of fitting pseudo random forests with {xgboost}. Should this be added to `rand_forest()`, or should the `num_parallel_tree` argument just be used when calling `boost_tree()`?
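For context, a pseudo random forest in raw {xgboost} is a single boosting round containing many parallel trees, each grown on row and column subsamples. A minimal sketch, assuming the usual xgboost random-forest settings (the specific subsampling rates are illustrative):

```r
library(xgboost)

x <- model.matrix(mpg ~ . - 1, data = mtcars)
y <- mtcars$mpg

rf_fit <- xgb.train(
  params = list(
    booster           = "gbtree",
    objective         = "reg:squarederror",
    num_parallel_tree = 100,  # size of the forest
    subsample         = 0.8,  # row sampling per tree
    colsample_bynode  = 0.8,  # column sampling per split
    eta               = 1     # no shrinkage, since there is no boosting
  ),
  data    = xgb.DMatrix(x, label = y),
  nrounds = 1  # one round of many parallel trees = bagging, not boosting
)
```

With `nrounds = 1` the ensemble is averaged rather than boosted, which is why this behaves like a random forest; increasing `nrounds` would boost many forests instead.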
Hi @juliasilge, as I wrote via e-mail, it could be useful to implement an xgboost regression model with a linear booster. The targets I predict derive from counts, or are observed as percentages, so I think it is better to use an ensemble model with a linear booster to test different distributions. I need to change the loss function because my target distribution is skewed; it would be amazing to manage that through the linear booster parameters. Thanks, have a nice day. MC
I would love to see a linear booster option added as well, enabling tuning of the three parameters listed for the linear booster in the xgboost package reference manual.