Siwon Seo
This is the hyperparameter set. I did this tuning with Optuna; after trial 125 it stops improving. ``` {'booster': 'gbtree', 'objective': 'binary:logistic', 'eval_metric': 'logloss', 'learning_rate': 0.2017833004505858, 'n_estimators': 825, 'max_depth': 10,...
Yes, I did. ``` dtrain = xgboost.DMatrix(data=X_train, label=y_train, enable_categorical=True) ``` I don't use sample weights or feature weights. I just set `'scale_pos_weight': 13.479705048987812` for the data imbalance. (Before trial 124,...
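For context, `scale_pos_weight` is commonly set to the ratio of negative to positive samples. A minimal sketch of that calculation (the label array below is made up for illustration, not the asker's data):

```python
import numpy as np

# Hypothetical, imbalanced binary labels: 27 negatives, 2 positives.
y = np.array([0] * 27 + [1] * 2)

# Common heuristic: scale_pos_weight = (negative count) / (positive count).
scale_pos_weight = (y == 0).sum() / (y == 1).sum()
print(scale_pos_weight)  # 13.5
```

A value around 13.48, as in the trial above, suggests roughly one positive per 13–14 negatives.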
I tried both `n_jobs = -1` and `n_jobs = 1`. I'm using `xgboost-2.0.0.dev0%2B15ca12a77ebbaf76515291064c24d8c2268400fd-py3-none-manylinux2014_x86_64.whl` now.
Thank you so much for your help! ```python import numpy as np import pandas as pd from sklearn.model_selection import train_test_split from sklearn.metrics import f1_score import xgboost import optuna ''' Load...
And one more thing: when I run ```python import xgboost print(xgboost.__version__) ``` I get `2.0.0-dev`, but the `device` hyperparameter doesn't work.
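For reference, in XGBoost 2.0 device selection moved to the `device` parameter, with `tree_method` staying as `"hist"` (the old `"gpu_hist"` value is deprecated). A minimal parameter sketch, assuming a CUDA-capable build:

```python
# XGBoost >= 2.0 style: pick the device via 'device' rather than
# 'gpu_id' or tree_method='gpu_hist'.
params = {
    "tree_method": "hist",
    "device": "cuda",              # or "cpu" to stay on the host
    "objective": "binary:logistic",
    "eval_metric": "logloss",
}
print(params["device"])
```

If a dev build predates the change, this key would be silently ignored, which could explain "doesn't work".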
Hello again. I installed the latest version, `2.0.0rc1`, but I'm still encountering the same errors. ``` _xgboost.py", line 120, in study.optimize(Objective, n_trials=550, n_jobs=1) File "/home/siwon/.local/lib/python3.10/site-packages/optuna/study/study.py", line 442, in optimize _optimize(...
I got this warning too. ``` /home/siwon/.local/lib/python3.10/site-packages/xgboost/core.py:160: UserWarning: [19:23:31] WARNING: /workspace/src/gbm/gbtree.cc:94: Falling back to prediction using DMatrix due to mismatched devices. This might lead to higher memory usage and slower...
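For what it's worth, that warning usually means the booster's `device` and the data handed to `predict` live on different devices, so XGBoost falls back to a slower DMatrix-based path. A toy sketch of the mismatch it describes (device names only; no GPU is touched here):

```python
# Booster trained on the GPU...
train_params = {"tree_method": "hist", "device": "cuda"}

# ...but prediction data prepared on the CPU. Keeping both sides on
# the same device (both "cuda" or both "cpu") should silence the
# fallback warning.
predict_device = "cpu"
print(train_params["device"] != predict_device)  # True -> mismatch
```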
So, wasn't the fix (#9529) applied? I'll reinstall the latest xgboost version.
As far as I know, it's being considered but hasn't been developed yet. You need to build it yourself. The easiest way is probably using the `ray` library, I think, or...
This issue was solved by pull request #3257.