RecBole
[🐛BUG] AttributeError: module 'recbole.model.exlib_recommender.xgboost' has no attribute 'xgboost'
I am trying to train a context-aware model with xgboost, but I get the following error:
AttributeError: module 'recbole.model.exlib_recommender.xgboost' has no attribute 'xgboost'
recbole.__version__ == '1.1.1'
!pip install recbole
from tqdm.notebook import tqdm
from recbole.quick_start import run_recbole
# models = ['LR', 'FM', 'NFM', 'DeepFM', 'xDeepFM', 'AFM', 'FFM', 'FwFM', 'FNN', 'PNN',
# 'DSSM', 'WideDeep', 'DCN', 'DCN V2', 'AutoInt', 'xgboost', 'lightgbm']
# 'DIN', 'DIEN'
models = ['xgboost', 'lightgbm']
all_results = []
for model in tqdm(models):
    results = run_recbole(model=model, dataset='ml-100k', config_file_list=['test.yaml'])
    results['model'] = model
    print(results)
    all_results.append(results)
I figured out the problem: in your docs, the XGBoost model is initialized like this:
run_recbole(model='xgboost', dataset='ml-100k')
But that is wrong; it should be:
run_recbole(model='XGBoost', dataset='ml-100k')
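Applied to my script above, the loop then becomes the following (assuming the LightGBM model name is capitalized the same way, as 'LightGBM' — I have not verified that name):

models = ['XGBoost', 'LightGBM']  # 'LightGBM' capitalization assumed by analogy with 'XGBoost'
all_results = []
for model in tqdm(models):
    results = run_recbole(model=model, dataset='ml-100k', config_file_list=['test.yaml'])
    results['model'] = model
    print(results)
    all_results.append(results)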
After that, I get the following error:
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
Cell In[11], line 10
8 all_results = []
9 for model in tqdm(models):
---> 10 results = run_recbole(model=model, dataset='ml-100k', config_file_list=['test.yaml'])
11 results['model'] = model
12 print(results)
File /opt/conda/lib/python3.10/site-packages/recbole/quick_start/quick_start.py:88, in run_recbole(model, dataset, config_file_list, config_dict, saved)
85 trainer = get_trainer(config["MODEL_TYPE"], config["model"])(config, model)
87 # model training
---> 88 best_valid_score, best_valid_result = trainer.fit(
89 train_data, valid_data, saved=saved, show_progress=config["show_progress"]
90 )
92 # model evaluation
93 test_result = trainer.evaluate(
94 test_data, load_best_model=saved, show_progress=config["show_progress"]
95 )
File /opt/conda/lib/python3.10/site-packages/recbole/trainer/trainer.py:1005, in DecisionTreeTrainer.fit(self, train_data, valid_data, verbose, saved, show_progress)
1001 def fit(
1002 self, train_data, valid_data=None, verbose=True, saved=True, show_progress=False
1003 ):
1004 for epoch_idx in range(self.epochs):
-> 1005 self._train_at_once(train_data, valid_data)
1007 if (epoch_idx + 1) % self.eval_step == 0:
1008 # evaluate
1009 valid_start_time = time()
File /opt/conda/lib/python3.10/site-packages/recbole/trainer/trainer.py:1114, in XGBoostTrainer._train_at_once(self, train_data, valid_data)
1112 self.dvalid = self._interaction_to_lib_datatype(valid_data)
1113 self.evals = [(self.dtrain, "train"), (self.dvalid, "valid")]
-> 1114 self.model = self.xgb.train(
1115 self.params,
1116 self.dtrain,
1117 self.num_boost_round,
1118 self.evals,
1119 early_stopping_rounds=self.early_stopping_rounds,
1120 evals_result=self.evals_result,
1121 verbose_eval=self.verbose_eval,
1122 xgb_model=self.boost_model,
1123 callbacks=self.callbacks,
1124 )
1126 self.model.save_model(self.temp_file)
1127 self.boost_model = self.temp_file
File /opt/conda/lib/python3.10/site-packages/xgboost/core.py:620, in require_keyword_args.<locals>.throw_if.<locals>.inner_f(*args, **kwargs)
618 for k, arg in zip(sig.parameters, args):
619 kwargs[k] = arg
--> 620 return func(**kwargs)
File /opt/conda/lib/python3.10/site-packages/xgboost/training.py:182, in train(params, dtrain, num_boost_round, evals, obj, feval, maximize, early_stopping_rounds, evals_result, verbose_eval, xgb_model, callbacks, custom_metric)
171 cb_container = CallbackContainer(
172 callbacks,
173 metric=metric_fn,
(...)
177 output_margin=callable(obj) or metric_fn is feval,
178 )
180 bst = cb_container.before_training(bst)
--> 182 for i in range(start_iteration, num_boost_round):
183 if cb_container.before_iteration(bst, i, dtrain, evals):
184 break
TypeError: 'NoneType' object cannot be interpreted as an integer
I solved this problem too: the traceback shows that num_boost_round ends up as None when it is not set in the config, so I just added this to my config:
xgb_num_boost_round: 100
Please add a default value for this parameter.
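As a workaround that avoids editing test.yaml, the same value can also be passed through run_recbole's config_dict argument (visible in its signature in the traceback above); a minimal sketch:

from recbole.quick_start import run_recbole

# Sketch: supply the missing boosting-round count programmatically
# instead of adding xgb_num_boost_round to test.yaml.
results = run_recbole(
    model='XGBoost',
    dataset='ml-100k',
    config_file_list=['test.yaml'],
    config_dict={'xgb_num_boost_round': 100},
)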
The same applies to LightGBM; please add default values:
lgb_params: {}
lgb_num_boost_round: 300
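Putting both workarounds together, a minimal test.yaml sketch (added on top of whatever settings the file already contains) would look like this:

# test.yaml (sketch): explicit values so the xgboost/lightgbm trainers do not receive None
xgb_num_boost_round: 100
lgb_params: {}
lgb_num_boost_round: 300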
@dexforint Thank you for your suggestions! We have just added this config information to our docs and config file. Thanks again for your advice.