
Density estimation

Open ZhuangweiKang opened this issue 2 years ago • 14 comments

Hi, this is amazing work. Is it possible to get the log-likelihood of predicted samples using this framework?

ZhuangweiKang avatar Jan 26 '22 21:01 ZhuangweiKang

Thanks! Well, typically the loss is the log-likelihood of the samples... but you are right, I do not log it during inference. Let me see if I can add that...

kashif avatar Jan 27 '22 07:01 kashif
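To illustrate what "the loss is the log-likelihood" means here, this is a minimal sketch using `torch.distributions` (a plain `Normal` is assumed for illustration; pytorch-ts uses whatever output distribution the model is configured with). The per-observation log-likelihood is just `log_prob` of the fitted distribution evaluated at the observed values:

```python
import torch

# Hypothetical sketch: log-likelihood of observations under a fitted
# output distribution (standard Normal assumed here for illustration).
dist = torch.distributions.Normal(loc=torch.zeros(3), scale=torch.ones(3))
x = torch.tensor([0.0, 1.0, -1.0])

ll = dist.log_prob(x)       # per-element log-likelihood
print(ll.sum())             # ≈ tensor(-3.7568)
```

The training objective is typically the negative of this sum (averaged over the batch), which is why the log-likelihood is available during training but not logged at inference time by default.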

Thank you for answering the question. For inference, the paper shows that the model can start from some initial warm-up time series and then iteratively call the RNN and flow until the end of the inference horizon. Could you please point out which function implements this?

ZhuangweiKang avatar Jan 28 '22 02:01 ZhuangweiKang

@ZhuangweiKang this is a property of all autoregressive models for example have a look at:

https://github.com/zalandoresearch/pytorch-ts/blob/master/pts/model/deepar/deepar_network.py#L361

for the loop over the prediction length, where the model samples the next time step, concatenates it, and then samples again...

kashif avatar Jan 28 '22 06:01 kashif
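The autoregressive loop linked above can be sketched in a few lines. This is a hypothetical toy model, not the actual `deepar_network.py` code: a warm-up series initializes the RNN state, then each step feeds the previously sampled value back in, emits distribution parameters, and samples the next value.

```python
import torch
import torch.nn as nn

class TinyAR(nn.Module):
    """Toy autoregressive sampler (illustration only, not pytorch-ts code)."""

    def __init__(self, hidden=16):
        super().__init__()
        self.rnn = nn.GRU(input_size=1, hidden_size=hidden, batch_first=True)
        self.mu = nn.Linear(hidden, 1)
        self.sigma = nn.Linear(hidden, 1)

    @torch.no_grad()
    def sample(self, warmup, prediction_length):
        # warmup: [B, T_ctx, 1] conditioning ("warm-up") series
        _, state = self.rnn(warmup)          # unroll over the warm-up window
        last = warmup[:, -1:, :]
        samples = []
        for _ in range(prediction_length):
            out, state = self.rnn(last, state)
            mu = self.mu(out)
            sigma = torch.nn.functional.softplus(self.sigma(out))
            # sample the next step and feed it back in as the next input
            last = torch.distributions.Normal(mu, sigma).sample()
            samples.append(last)
        return torch.cat(samples, dim=1)     # [B, prediction_length, 1]

model = TinyAR()
future = model.sample(torch.randn(4, 10, 1), prediction_length=7)
print(future.shape)  # torch.Size([4, 7, 1])
```

The linked DeepAR code does the same thing, with lagged inputs and extra features added at each step.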

@kashif Thank you very much. This helps a lot. So when I run inference, I only need to use a warm-up time series as input to the predict() function. It will produce the same number of samples as the prediction_length specified during training. Is that correct?

ZhuangweiKang avatar Jan 28 '22 06:01 ZhuangweiKang

Yes, I believe so... although it will produce a tensor of shape [B, S, T, 1], where B is the batch size, S is the number of samples from the distribution (e.g. 100), and T is the prediction length; for univariate data you get a single output, and for multivariate data the last dimension will be the multivariate dim.

kashif avatar Jan 28 '22 07:01 kashif
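Concretely, the [B, S, T, 1] shape described above looks like this (the dimension sizes here are assumed values for illustration). The sample dimension S is what you would reduce over to get a point forecast or quantiles:

```python
import torch

B, S, T = 2, 100, 24                     # batch, samples, prediction length
univariate = torch.randn(B, S, T, 1)     # shape kashif describes: [B, S, T, 1]
multivariate = torch.randn(B, S, T, 5)   # for a multivariate model, target_dim = 5

# point forecast per series: average over the sample dimension S
mean_forecast = univariate.mean(dim=1)
print(mean_forecast.shape)  # torch.Size([2, 24, 1])
```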

I didn't see the batch size dimension.

ZhuangweiKang avatar Jan 28 '22 07:01 ZhuangweiKang

It's implied by the -1, so that it works for any batch size: https://github.com/zalandoresearch/pytorch-ts/blob/master/pts/model/deepar/deepar_network.py#L407

kashif avatar Jan 28 '22 07:01 kashif
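The `-1` trick is standard PyTorch reshaping: `reshape` infers one dimension from the others, so the same line works for any batch size. A small sketch with assumed sizes:

```python
import torch

num_parallel_samples, T, target_dim = 100, 24, 1

# during sampling, batch and sample dims are folded together in dim 0
flat = torch.randn(3 * num_parallel_samples, T, target_dim)

# -1 lets reshape infer the batch size, so the code is batch-size agnostic
unflat = flat.reshape(-1, num_parallel_samples, T, target_dim)
print(unflat.shape)  # torch.Size([3, 100, 24, 1])
```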

Does this mean I have to retrain the model if I want to forecast a different size?

ZhuangweiKang avatar Jan 28 '22 07:01 ZhuangweiKang

no I do not think so...

kashif avatar Jan 28 '22 07:01 kashif

So how do I change the prediction length when I call the predict function? Thanks.

def predict(
    self, dataset: Dataset, num_samples: Optional[int] = None
) -> Iterator[Forecast]:
    inference_data_loader = InferenceDataLoader(
        dataset,
        transform=self.input_transform,
        batch_size=self.batch_size,
        stack_fn=lambda data: batchify(data, self.device),
    )

ZhuangweiKang avatar Jan 28 '22 07:01 ZhuangweiKang

So I do not know what you are trying to do... but typically you set the prediction length to be as large as the test data you have... so that you can then compare the resulting metrics...

If you require a smaller prediction length than what you trained with, then just truncate the prediction... if you want a bigger one, then just train with a bigger prediction length...

kashif avatar Jan 28 '22 07:01 kashif
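The "smaller horizon" case above is just slicing the forecast tensor along the time dimension; no retraining is needed. A sketch with assumed sizes:

```python
import torch

B, S, T = 2, 100, 24
forecast = torch.randn(B, S, T, 1)   # model trained with prediction_length = 24

# keep only the first 12 steps for a shorter evaluation horizon
shorter = forecast[:, :, :12, :]
print(shorter.shape)  # torch.Size([2, 100, 12, 1])
```

Going the other way (a longer horizon than trained) is what requires retraining, since the model was only ever optimized to produce `prediction_length` steps.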

Got it. Thanks for your help.

ZhuangweiKang avatar Jan 28 '22 07:01 ZhuangweiKang

> no I do not think so...

From the source code, I think that if you change the prediction length, you must retrain the model, because the input to the RNN must be a tensor of shape (batch_size, sub_seq_len, input_dim), where sub_seq_len = context_length + prediction_length. So, I mean, if you want to change the prediction length, you must change the metadata via dataset.metadata.prediction_length and re-split dataset.train and dataset.test... Am I right to understand it this way?

hanlaoshi avatar Nov 07 '22 13:11 hanlaoshi
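One clarifying sketch for the shape question above (dimension sizes assumed): an RNN's weights do not depend on the sequence length, so the same trained RNN can be unrolled over any `sub_seq_len`. The fixed `prediction_length` lives in the surrounding sampling and evaluation code, not in the network weights themselves:

```python
import torch

batch_size, context_length, prediction_length, input_dim = 8, 24, 12, 5
sub_seq_len = context_length + prediction_length  # 36

rnn = torch.nn.GRU(input_size=input_dim, hidden_size=16, batch_first=True)

# training-time input shape described above: (batch, sub_seq_len, input_dim)
rnn_input = torch.randn(batch_size, sub_seq_len, input_dim)
out, _ = rnn(rnn_input)
print(out.shape)  # torch.Size([8, 36, 16])

# the same weights also accept a different sequence length
out2, _ = rnn(torch.randn(batch_size, 50, input_dim))
print(out2.shape)  # torch.Size([8, 50, 16])
```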

> thanks! well typically the loss is the log-likelihood of the samples... but you are right I do not log it during inference. Let me see if I can add that...

Hello, could this option be implemented? I would like to access the log-likelihood of each series.

sergio825 avatar Feb 04 '24 02:02 sergio825