neuralforecast
There are missing combinations of ids and times in `futr_df`.
What happened + What you expected to happen
When I use PatchTST for stock price prediction, the following error occurred in nf.predict(futr_df=y_test): There are missing combinations of ids and times in futr_df.
My data is ten years of stock prices, and since stocks don't trade on holidays, there are no prices for those dates, so the ds column in my data is not completely continuous.
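The mismatch can be illustrated with plain Python: with freq='D' the library expects one row per calendar day for every series id, so holiday gaps show up as missing (unique_id, ds) combinations. A minimal stdlib sketch of that idea (not the library's actual validation code; the ticker and dates are made up):

```python
from datetime import date, timedelta

# Trading dates with a holiday gap: Jan 1 (New Year) has no row.
trading_dates = [date(2024, 1, 2), date(2024, 1, 3), date(2024, 1, 4)]
futr_rows = {("AAPL", d) for d in trading_dates}

# With freq='D' the expected index is the full daily calendar
# from the first to the last date, for every series id.
start, end = date(2024, 1, 1), date(2024, 1, 4)
expected = {("AAPL", start + timedelta(days=i))
            for i in range((end - start).days + 1)}

# The holiday rows are exactly the missing combinations
# that trigger the error.
missing = expected - futr_rows
print(sorted(missing))
```

Any calendar date without a price row ends up in `missing`, which is why a daily frequency plus exchange holidays produces this error.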
Versions / Dependencies
pytorch, neuralforecast
Reproduction script
model = PatchTST(
    h=17,
    input_size=100,
    patch_len=24,
    stride=24,
    revin=False,
    hidden_size=16,
    n_heads=4,
    scaler_type='robust',
    loss=MAE(),
    learning_rate=1e-3,
    max_steps=500,
    val_check_steps=50,
    early_stop_patience_steps=2,
)
nf = NeuralForecast(models=[model], freq='D')
nf.fit(df=y_train, val_size=17)
forecasts = nf.predict(futr_df=y_test)
Issue Severity
High: It blocks me from completing my task.
Does it work if you set freq='B'? Otherwise you can just use the nf.make_future_dataframe() method to build the dates and replace your values in there.
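That suggestion amounts to: build the complete (unique_id, ds) grid first, then fill in the values you actually have and patch the holiday gaps. A minimal stdlib sketch of that fill step, independent of neuralforecast (the dict-of-rows layout here stands in for a futr_df DataFrame, and the ticker, dates, and prices are made up):

```python
from datetime import date, timedelta

# Values observed only on trading days (holiday Jan 3 has no row).
observed = {
    ("AAPL", date(2024, 1, 2)): 185.0,
    ("AAPL", date(2024, 1, 4)): 184.2,
    ("AAPL", date(2024, 1, 5)): 181.9,
}

# Build the full daily grid the library expects, then left-join
# the observed values onto it; missing days start as None.
start, end = date(2024, 1, 2), date(2024, 1, 5)
grid = [("AAPL", start + timedelta(days=i))
        for i in range((end - start).days + 1)]
filled = {key: observed.get(key) for key in grid}

# Forward-fill the gaps so every expected row has a value.
last = None
for key in grid:
    if filled[key] is None:
        filled[key] = last
    else:
        last = filled[key]
```

After this, every (id, date) combination in the grid has a row, which is the shape predict's futr_df validation is checking for.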
Yes, I had tried setting freq='B', but it didn't work; I got the same error again. I also used nf.make_future_dataframe() on my y_test: it just produced two columns, unique_id and ds, and the time series in ds is still discontinuous. When I combined these two new columns with the original y data and used nf.predict(), the same problem still occurred.
This is very unlikely, since that's the structure used by predict. If you can provide a reproducible example of that behavior we can help further.
Sorry, it's my fault that I didn't use nf.make_future_dataframe() correctly. I have corrected my mistake and now get continuous ds data, but when I used nf.predict(futr_df=y_test) the same error occurred again. The y_test data is shown in the picture.
Are you providing the target as a future exogenous feature? If you're not using exogenous features you don't need to provide futr_df.
Yes, I think I should pass the target as a future exogenous feature. In fact, I learned the usage of PatchTST from the official documentation, and futr_df is provided in the official documentation's example.
So you want to predict y using y as a feature?
Yes, I think so, because that's how it seems to be used in the official documentation. And I do want to predict y and then visualize the prediction results, just like in the example.
Which documentation are you referring to? I'm pretty sure that just nf.predict() would work for your case.
I ran into it too; complete disaster. I was using TFT, and I did not predict y with y, and the error still occurred. By the way, what should I put in futr_df?
It's in Nixtla's official documentation, in the PatchTST usage example; the URL is https://nixtlaverse.nixtla.io/neuralforecast/models.patchtst.html#patchtst.
That example is wrong, it's not using any exogenous features. Can you try just running nf.predict()?
This issue has been automatically closed because it has been awaiting a response for too long. When you have time to work with the maintainers to resolve this issue, please post a new comment and it will be re-opened. If the issue has been locked for editing by the time you return to it, please open a new issue and reference this one.