[BUG] Cannot Hyperparameter tune RecursiveTabular with TimeSeriesPredictor
Bug Report Checklist
- [x] I provided code that demonstrates a minimal reproducible example.
- [ ] I confirmed bug exists on the latest mainline of AutoGluon via source install.
- [x] I confirmed bug exists on the latest stable version of AutoGluon.
Describe the bug
The documentation shows how to tune DeepAR for time series, but doesn't show how to do the same for RecursiveTabular. As such, I'm getting unexpected messages from setups that seem reasonable. Running this:
```python
predictor = TimeSeriesPredictor(
    prediction_length=21,
    freq="h",
)
predictor.fit(
    train_data=train_data,
    hyperparameters={
        "DeepAR": {},
        "RecursiveTabular": [
            {"tabular_hyperparameters": {"GBM": {"num_leaves": space.Int(5, 50)}}}
        ],
    },
    hyperparameter_tune_kwargs={
        "scheduler": "local",
        "searcher": "auto",
        "num_trials": 5,
    },
    enable_ensemble=False,
    time_limit=600,
)
```
fails with this error:
```
ValueError: Hyperparameter tuning specified, but no model contains a hyperparameter search space. Please disable hyperparameter tuning with `hyperparameter_tune_kwargs=None` or provide a search space for at least one model.
```
Expected behavior
I'd expect this to work, given how hyperparameters are defined in https://github.com/autogluon/autogluon/issues/2969, though the documentation isn't clear on this point.
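For comparison, here is roughly the tuning pattern the documentation shows for DeepAR (my sketch, not a verified copy of the tutorial; the specific DeepAR parameter names are illustrative). The search spaces sit directly at the top level of the model's hyperparameter dict, which is what I can't seem to replicate for RecursiveTabular's nested `tabular_hyperparameters`:

```python
from autogluon.common import space
from autogluon.timeseries import TimeSeriesPredictor

# Sketch of the documented DeepAR-style tuning setup (parameter names are
# illustrative): search spaces are placed at the top level of the model's
# hyperparameter dict, and HPO picks them up without complaint.
predictor = TimeSeriesPredictor(prediction_length=21, freq="h")
predictor.fit(
    train_data=train_data,
    hyperparameters={
        "DeepAR": {
            "hidden_size": space.Int(20, 100),
            "dropout_rate": space.Categorical(0.1, 0.3),
        },
    },
    hyperparameter_tune_kwargs={
        "scheduler": "local",
        "searcher": "auto",
        "num_trials": 5,
    },
    enable_ensemble=False,
    time_limit=600,
)
```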
To Reproduce
I've tried maybe 15 different ways of setting up the configs. I've pored over https://github.com/autogluon/autogluon/blob/579ede12d9157778d90c617c4bfddd0b4865a582/timeseries/src/autogluon/timeseries/models/presets.py#L314-L331 to understand what parameter combinations are expected.
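My (possibly wrong) reading of those lines is that the check for "does any model contain a search space" only looks at the top-level values of each model's hyperparameter dict, so a space nested inside `tabular_hyperparameters` would never be seen. A minimal sketch of that reading, with a hypothetical helper name:

```python
from autogluon.common import space

def contains_searchspace_shallow(model_hyperparameters: dict) -> bool:
    # Hypothetical helper illustrating my reading of the linked presets.py:
    # only top-level values are inspected for search spaces.
    return any(isinstance(value, space.Space) for value in model_hyperparameters.values())

# A top-level space is detected...
print(contains_searchspace_shallow({"num_leaves": space.Int(5, 50)}))  # True
# ...but one nested under "tabular_hyperparameters" is not.
print(contains_searchspace_shallow(
    {"tabular_hyperparameters": {"GBM": {"num_leaves": space.Int(5, 50)}}}
))  # False
```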
I've also tried this:
"RecursiveTabular": [
{"GBM": { "num_leaves": space.Int(5, 50)}}
],
and I get the same error. The following does run the HPO process, but the run is masked by the error from https://github.com/autogluon/autogluon/issues/2969:
"RecursiveTabular": [
{ "num_leaves": space.Int(5, 50)}
],
Installed Versions
```
accelerate              : 0.21.0
autogluon               : 1.1.1
autogluon.common        : 1.1.1
autogluon.core          : 1.1.1
autogluon.features      : 1.1.1
autogluon.multimodal    : 1.1.1
autogluon.tabular       : 1.1.1
autogluon.timeseries    : 1.1.1
boto3                   : 1.34.101
catboost                : 1.2.5
defusedxml              : 0.7.1
evaluate                : 0.4.2
fastai                  : 2.7.15
gluonts                 : 0.15.1
hyperopt                : 0.2.7
imodels                 : None
jinja2                  : 3.1.4
joblib                  : 1.4.2
jsonschema              : 4.21.1
lightgbm                : 4.3.0
lightning               : 2.3.2
matplotlib              : 3.9.1
mlforecast              : 0.10.0
networkx                : 3.3
nlpaug                  : 1.1.11
nltk                    : 3.8.1
nptyping                : 2.4.1
numpy                   : 1.26.4
nvidia-ml-py3           : 7.352.0
omegaconf               : 2.2.3
onnxruntime-gpu         : None
openmim                 : 0.3.9
optimum                 : 1.18.1
optimum-intel           : 1.16.1
orjson                  : 3.10.6
pandas                  : 2.2.2
pdf2image               : 1.17.0
Pillow                  : 10.4.0
psutil                  : 5.9.8
pytesseract             : 0.3.10
pytorch-lightning       : 2.3.2
pytorch-metric-learning : 2.3.0
ray                     : 2.10.0
requests                : 2.32.3
scikit-image            : 0.20.0
scikit-learn            : 1.4.0
scikit-learn-intelex    : None
scipy                   : 1.12.0
seqeval                 : 1.2.2
setuptools              : 69.5.1
skl2onnx                : None
statsforecast           : 1.4.0
tabpfn                  : None
tensorboard             : 2.17.0
text-unidecode          : 1.3
timm                    : 0.9.16
torch                   : 2.3.1
torchmetrics            : 1.2.1
torchvision             : 0.18.1
tqdm                    : 4.66.4
transformers            : 4.39.3
utilsforecast           : 0.0.10
vowpalwabbit            : None
xgboost                 : 2.0.3
```