                        How to plot predictions using DeepAR
- PyTorch-Forecasting version:
- PyTorch version: 0.10.1
- Python version: 3.8.10
- Operating System: Windows 10
Expected behavior
I ran the training for DeepAR, and I expected the following code to produce plots of predictions vs. actuals for the wells:
raw_predictions, x = deepar.predict(val_dataloader, mode="raw", return_x=True)
for idx in range(10):  # plot 10 examples
    deepar.plot_prediction(x, raw_predictions, idx=idx, add_loss_to_title=True)
Now, I am doing this manually and I'm not sure if this is the correct way to do it:
import torch
import matplotlib.pyplot as plt

best_deepar.to(torch.device('cpu'))
# collect the actual target values from the validation dataloader
actuals = torch.cat([y[0] for x, y in iter(val_dataloader)])
# despite the variable name, mode='prediction' returns point forecasts
raw_predictions = best_deepar.predict(val_dataloader, mode='prediction')
for i in range(0, 100, 5):
    plt.plot(actuals[i].numpy(), label='actual')
    plt.plot(raw_predictions[i].numpy(), label='prediction')
    plt.legend()
    plt.show()
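As a quick sanity check of this manual approach, the point forecasts can also be compared to the actuals numerically (a rough sketch, assuming both tensors have the same n_series x prediction_length shape):
mae = (actuals - raw_predictions).abs().mean()  # mean absolute error over the validation set
print(f"validation MAE: {mae:.4f}")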
Actual behavior
The resulting error is:
ValueError: Expected parameter scale (Tensor of shape (220, 24)) of distribution Normal(loc: torch.Size([220, 24]), scale: torch.Size([220, 24])) to satisfy the constraint GreaterThan(lower_bound=0.0), but found invalid values:
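This constraint error typically means the network produced NaN or non-positive values for the scale of the Normal distribution. A common trigger is NaNs or infs in the input columns (or an unstable training run, e.g. a too-high learning rate); the following is only a rough diagnostic sketch, not a confirmed cause, and assumes the listed columns are numeric:
import numpy as np

# rough diagnostic: NaNs or infs in the columns fed to the model can
# propagate into the predicted scale and trigger the GreaterThan(0.0) error
cols = ["pd", "sales", "avg_area_sales", "avg_shop_sales"]  # a few of the columns used below
print(training_df[cols].isna().sum())
print(np.isinf(training_df[cols].to_numpy()).sum())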
Code to reproduce the problem
raw_predictions, x = deepar.predict(val_dataloader, mode="raw", return_x=True)
for idx in range(10):  # plot 10 examples
    deepar.plot_prediction(x, raw_predictions, idx=idx, add_loss_to_title=True)
TimeSeriesDataSet code:
# Prepare the training data
training = TimeSeriesDataSet(
    training_df,
    time_idx="time_idx",
    target="pd",
    group_ids=["shop_id"],
    max_encoder_length=data_prep_params["time_series_dataset"]["max_encoder_length"],
    min_encoder_length=data_prep_params["time_series_dataset"]["min_encoder_length"],
    min_prediction_length=data_prep_params["time_series_dataset"][
        "min_prediction_length"
    ],
    max_prediction_length=data_prep_params["time_series_dataset"][
        "max_prediction_length"
    ],
    static_categoricals=["area_id"],
    static_reals=["avg_area_sales",
        "avg_shop_sales",
        "avg_area_sales_per_month"
        if data_prep_params["preprocess_train"]["add_per_month"]
        else None,
        "avg_shop_sales_per_month"
        if data_prep_params["preprocess_train"]["add_per_month"]
        else None,
        "avg_area_sales_per_year"
        if data_prep_params["preprocess_train"]["add_per_year"]
        else None,
        "avg_shop_sales_per_year"
        if data_prep_params["preprocess_train"]["add_per_year"]
        else None,],
    time_varying_known_reals=["time_idx"],
    time_varying_unknown_reals=[
        "sales",
    ],
    lags={
        "sales": [1, 2, 3, 4, 5, 6],
    },
    allow_missing_timesteps=True,
    add_target_scales=True,
    add_encoder_length=False,
    categorical_encoders={
        "__group_id__shop_id": NaNLabelEncoder(add_nan=True),
        "area_id": NaNLabelEncoder(add_nan=True),
        "shop_id": NaNLabelEncoder(add_nan=True),
    },
    target_normalizer=GroupNormalizer(groups=["shop_id"]),
    add_relative_time_idx=False,
)
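The validation dataset and the dataloaders are not shown in this issue; a minimal sketch of how they are typically built from the training dataset (the dataframe cutoff and batch size here are assumptions):
# rough sketch: predict the last max_prediction_length points of each series,
# reusing the training configuration; in practice the dataframe passed here
# should cover the validation horizon
validation = TimeSeriesDataSet.from_dataset(
    training, training_df, predict=True, stop_randomization=True
)
batch_size = 64  # assumption, not stated in the issue
train_dataloader = training.to_dataloader(train=True, batch_size=batch_size, num_workers=0)
val_dataloader = validation.to_dataloader(train=False, batch_size=batch_size, num_workers=0)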
Code to initialize DeepAR model:
deepar = DeepAR.from_dataset(
    training,
    learning_rate=0.1,
    hidden_size=32,
    dropout=0.1,
    loss=NormalDistributionLoss(),
    log_interval=10,
    log_val_interval=3,
    # reduce_on_plateau_patience=3,
)
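The best_deepar used in the manual plotting above is presumably the best checkpoint from training; a minimal sketch of how it can be obtained (the Trainer settings are illustrative assumptions, not the ones used for this issue):
import pytorch_lightning as pl
from pytorch_lightning.callbacks import ModelCheckpoint

# assumption: a standard Lightning training loop with illustrative hyperparameters
checkpoint_callback = ModelCheckpoint(monitor="val_loss")
trainer = pl.Trainer(max_epochs=30, gradient_clip_val=0.1, callbacks=[checkpoint_callback])
trainer.fit(deepar, train_dataloaders=train_dataloader, val_dataloaders=val_dataloader)

# reload the best checkpoint as the `best_deepar` used above
best_deepar = DeepAR.load_from_checkpoint(checkpoint_callback.best_model_path)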
Thank you
Have you tried it like this? Somebody posted this in another issue:
for idx in range(180,200): 
    best_model.plot_prediction(x, raw_predictions, idx=idx, add_loss_to_title=True, quantiles_kwargs={'use_metric': False}, prediction_kwargs={'use_metric': False})
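For completeness, here is how that suggestion fits together with the raw predictions; a sketch assuming best_model is the trained DeepAR model, with only the use_metric kwargs added relative to the code in "Expected behavior" above:
# raw mode is required for plot_prediction; the use_metric=False kwargs follow
# the suggestion above
raw_predictions, x = best_model.predict(val_dataloader, mode="raw", return_x=True)
for idx in range(10):  # plot 10 examples
    best_model.plot_prediction(
        x,
        raw_predictions,
        idx=idx,
        add_loss_to_title=True,
        quantiles_kwargs={"use_metric": False},
        prediction_kwargs={"use_metric": False},
    )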