sktin
Code to replicate:

```python
import xgboost as xgb
print(f'{xgb.__version__=}')
from sklearn.datasets import make_classification
from xgboost import XGBClassifier
from sklearn.metrics import roc_auc_score

X, y = make_classification(50000, random_state=0)
model = XGBClassifier(
    booster='gblinear',
    ...
```
My understanding is that in version 3.0.0, the default `base_score` for those GLM-inspired loss functions has been changed to use the closed-form solution, but `reg:squaredlogerror` shouldn't be one of them....
I tried switching `colsample_bytree` randomly between two values on each iteration using the `reset_parameter` callback. It worked perfectly with the native API, but it seemed to have no effect with the scikit-learn...