Cristian Challu
Awesome, thanks for sharing the code!
Yes, we always want to keep the last cutoff. This is `predict_insample`, so users do not specify the `n_windows` parameter. We should always return the last cutoff, and consider...
Hi @iamyihwa! Have you tried using the optuna backend for Auto models?
Hi @WenjuanOlympus! Sorry for the late reply. The original implementation has future exogenous covariates; they are part of its `DataEmbedding` (https://github.com/thuml/Time-Series-Library/blob/main/layers/Embed.py#L109). They parse the exogenous covariates with a linear layer (`TimeFeatureEmbedding`)....
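To make the idea concrete, here is a minimal numpy sketch (not the library's code) of a `TimeFeatureEmbedding`-style layer: calendar features are projected to the model width with a single linear map, so future covariates enter through the same embedding as historical inputs. All names and sizes below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

d_time, d_model = 4, 16  # number of calendar features, embedding width
# Weights of the linear projection (mirroring a bias-free nn.Linear)
W = rng.normal(size=(d_time, d_model))

def time_feature_embedding(x_mark):
    """Project time features (batch, seq_len, d_time) -> (batch, seq_len, d_model)."""
    return x_mark @ W

# Batch of 2 series, 8 time steps, 4 calendar features each
x_mark = rng.normal(size=(2, 8, d_time))
emb = time_feature_embedding(x_mark)
print(emb.shape)  # (2, 8, 16)
```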
Hi @tg2k! The input dataframe is expected to be balanced: it has a complete set of observations (rows) between the first and last dates for each time series for the...
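A quick way to see what "balanced" means in practice: every series must have one row per period between its own first and last date. Here is a small pandas sketch with a hypothetical `fill_gaps` helper (illustrative, not the library's API) that reindexes each series onto a complete date range, leaving `NaN` targets where rows were missing.

```python
import pandas as pd

# Series "a" is missing 2024-01-03, so this dataframe is NOT balanced.
df = pd.DataFrame({
    "unique_id": ["a", "a", "a", "b", "b"],
    "ds": pd.to_datetime(["2024-01-01", "2024-01-02", "2024-01-04",
                          "2024-01-01", "2024-01-02"]),
    "y": [1.0, 2.0, 4.0, 10.0, 20.0],
})

def fill_gaps(df, freq="D"):
    """Reindex each series on a complete date range (hypothetical helper)."""
    pieces = []
    for uid, g in df.groupby("unique_id"):
        full = pd.date_range(g["ds"].min(), g["ds"].max(), freq=freq)
        g = g.set_index("ds").reindex(full)
        g["unique_id"] = uid
        pieces.append(g.rename_axis("ds").reset_index())
    return pd.concat(pieces, ignore_index=True)[["unique_id", "ds", "y"]]

balanced = fill_gaps(df)
print(len(balanced))  # 6: series "a" gains the missing 2024-01-03 row
```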
Hi @almostintuitive. Indeed, our `cross_validation` does not retrain the model for every window. In our experience, this is the most common approach. Retraining for every new data point will be...
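For intuition, a rolling-origin evaluation without retraining just fits once and moves the forecast origin forward between windows. The sketch below (illustrative names, not the library's API) computes the cutoff index of each window from the horizon, number of windows, and step size.

```python
# Hedged sketch: the model would be fit once on data up to the earliest
# cutoff; subsequent windows only shift the forecast origin forward.
def make_cutoffs(n_obs, h, n_windows, step_size):
    """Return the index of the last training observation for each window."""
    last = n_obs - h                       # latest cutoff that leaves room for h steps
    first = last - (n_windows - 1) * step_size
    return [first + i * step_size for i in range(n_windows)]

print(make_cutoffs(n_obs=100, h=12, n_windows=3, step_size=12))
# [64, 76, 88]
```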
It is a little more complex than that because you have to modify the trainer kwargs. Here is the code we usually use for this:

```
def _set_trainer_kwargs(nf, max_steps, early_stop_patience_steps):...
```
Thanks for this! We should wait a few days; we are adding the iTransformer and MLPMultivariate. Can you also add them once they are in the main branch?
@marcopeix we also need to add KAN to the evaluation pipeline in `https://github.com/Nixtla/neuralforecast/tree/main/action_files`. You can check with Olivier how to add it.
Hi @patel-zeel. Sorry for the delayed reply. Yes, we would like to understand if it also improves on GPU. Can you try using Colab? We can add the...