neuralforecast
Scalable and user-friendly neural :brain: forecasting algorithms.
Add missing models to the README file:
- NLinear
- TSMixer
- TSMixerx
### What happened + What you expected to happen
When tuning AutoNHITS with the Optuna backend (without using Ray Tune) I occasionally get this error:
```
neuralforecast/lib64/python3.8/site-packages/torch/distributions/distribution.py", line 68, in __init__...
```
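For context, a minimal sketch of tuning AutoNHITS with the Optuna backend; the dataset, horizon, and search space below are assumptions for illustration, not taken from the report.

```python
# Sketch: AutoNHITS with backend="optuna". The search space and data are made up.
from neuralforecast import NeuralForecast
from neuralforecast.auto import AutoNHITS
from neuralforecast.utils import AirPassengersDF


def config(trial):
    # Hypothetical minimal search space; replace with one suited to your data.
    return {
        "input_size": trial.suggest_categorical("input_size", (24, 48)),
        "max_steps": 100,
    }


model = AutoNHITS(h=12, config=config, backend="optuna", num_samples=5)
nf = NeuralForecast(models=[model], freq="M")
nf.fit(df=AirPassengersDF, val_size=12)
preds = nf.predict()
```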
### What happened + What you expected to happen
Rich traceback (most recent call last) pointing at /Users/leo/web3/LLM/langchain/mlts/nf_iTransformer.py:47 (surrounding lines 44-45: `# model_index=None,`, `#...`)
### What happened + What you expected to happen
(_train_tune pid=59932) /Users/leo/web3/LLM/langchain/venv/lib/python3.10/site-packages/ray/tune/integration/pytorch_lightning.py:198: `ray.tune.integration.pytorch_lightning.TuneReportCallback` is deprecated. Use `ray.tune.integration.pytorch_lightning.TuneReportCheckpointCallback` instead.
(_train_tune pid=59932) Seed set to 1
2024-05-01 01:27:11,649 ERROR tune_controller.py:1331 -- Trial...
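If you are writing your own Ray Tune trainable around PyTorch Lightning, the replacement the warning points to looks roughly like the sketch below; the metric mapping and names are assumptions for illustration, not the callback neuralforecast wires up internally.

```python
# Sketch of swapping the deprecated TuneReportCallback for the checkpointing variant.
from ray.tune.integration.pytorch_lightning import TuneReportCheckpointCallback

callback = TuneReportCheckpointCallback(
    metrics={"loss": "val_loss"},  # Tune metric name -> metric logged by Lightning
    on="validation_end",
)
# Pass `callback` to lightning.Trainer(callbacks=[callback]) inside the Tune trainable.
```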
### Description The original implementation of DLinear (and NLinear) does not support exogenous variables. In one publication (I can't remember which one) I read that DLinear does not support them because...
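For contrast, a hedged sketch of how exogenous inputs are passed to a neuralforecast model that does support them (NHITS here); the column names are invented, and DLinear/NLinear currently do not expose these arguments.

```python
# Sketch: exogenous variables via futr_exog_list / hist_exog_list on NHITS.
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS

model = NHITS(
    h=12,
    input_size=24,
    futr_exog_list=["price"],   # future-known exogenous column in the dataframe
    hist_exog_list=["demand"],  # historical-only exogenous column in the dataframe
)
nf = NeuralForecast(models=[model], freq="D")
# nf.fit(df=train_df)  # train_df is a placeholder: unique_id, ds, y plus the exogenous columns
```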
### Description Kolmogorov-Arnold Networks (KANs) scale faster than MLPs and, per the original paper, reach better accuracy with fewer parameters. ### Use case _No response_
### Description Is it possible to change the learning rate when a model is loaded? I was not able to find any related documentation. For example, if I train a...
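One possible approach (a sketch, not documented API): load the saved NeuralForecast object and overwrite the learning-rate hyperparameter on the underlying model before training further. The path and the `learning_rate` attribute access are assumptions to verify for your model class.

```python
# Sketch: adjust the learning rate on a loaded NeuralForecast object.
from neuralforecast import NeuralForecast

nf = NeuralForecast.load(path="./checkpoints/")  # placeholder path
for model in nf.models:
    model.learning_rate = 1e-4  # assumed attribute; check your model class
# nf.fit(df=train_df)  # then continue training on your own data (train_df is a placeholder)
```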
### What happened + What you expected to happen Hi, I'm new to nixtla. When I was trying to run the example code from the official tutorial on my local machine (Linux,...
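For reference, a minimal end-to-end run in the spirit of the getting-started tutorial (a sketch; the actual tutorial may use a different model, horizon, or dataset):

```python
# Sketch: smallest working neuralforecast example on the AirPassengers dataset.
from neuralforecast import NeuralForecast
from neuralforecast.models import NBEATS
from neuralforecast.utils import AirPassengersDF

nf = NeuralForecast(models=[NBEATS(input_size=24, h=12, max_steps=100)], freq="M")
nf.fit(df=AirPassengersDF)
forecasts = nf.predict()
print(forecasts.head())
```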
**Preamble** I'm trying to fit models for biological systems where we have a large amount of experimental data, ergo I have multiple multivariate trajectories from the same biological system that...
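A hedged sketch of how multiple trajectories from the same system can be encoded: neuralforecast expects long-format panel data where each trajectory (experiment or replicate) gets its own `unique_id`. The values below are invented.

```python
# Sketch: long-format dataframe with one unique_id per trajectory.
import pandas as pd

df = pd.DataFrame({
    "unique_id": ["run_1"] * 3 + ["run_2"] * 3,  # one id per experimental trajectory
    "ds": pd.to_datetime(["2024-01-01", "2024-01-02", "2024-01-03"] * 2),
    "y": [0.1, 0.4, 0.9, 0.2, 0.5, 1.1],         # the measured target variable
})
# Additional measured variables can be added as extra columns and passed as
# exogenous features to models that support them.
```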
- Added results from iTransformer (TSMixer still wins on ETTm2)
- Added reference for iTransformer
- Renamed the notebook to a more general title