
RAM consumption of TimeSeriesDataSet


I have a dataframe that consumes approximately 10 GB of memory. When I try to build the TimeSeriesDataSet from it, memory consumption grows to more than 30 GB, which exhausts my RAM. I understand this makes sense, since the dataset is a larger structure than the dataframe.

How much can memory consumption grow when building the TimeSeriesDataSet? Something like 4x? I would like an estimate so I know how much to shrink the original dataframe. Is there any way to make TimeSeriesDataSet consume less RAM?

Thanks @jdb78

nicocheh · Jun 16 '22 22:06
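One general way to cut the footprint before construction is to downcast the source DataFrame's dtypes. A minimal sketch, assuming the usual float64/int64 numeric columns and string-typed group ids; the `downcast` helper and the tiny demo frame are made up for illustration, not part of pytorch-forecasting:

```python
# Sketch: shrink the source DataFrame before building the TimeSeriesDataSet.
# The small frame below only stands in for the real 10 GB one; dtypes are assumptions.
import numpy as np
import pandas as pd

def downcast(df: pd.DataFrame) -> pd.DataFrame:
    """Reduce per-column storage without changing the data's meaning."""
    for col in df.select_dtypes(include=["float64"]).columns:
        df[col] = df[col].astype(np.float32)               # halves float storage
    for col in df.select_dtypes(include=["int64"]).columns:
        df[col] = pd.to_numeric(df[col], downcast="integer")
    for col in df.select_dtypes(include=["object"]).columns:
        df[col] = df[col].astype("category")               # e.g. group ids
    return df

df = pd.DataFrame({
    "series_id": ["a"] * 1000,
    "time_idx": np.arange(1000, dtype=np.int64),
    "value": np.random.rand(1000),
})
before = df.memory_usage(deep=True).sum()
df = downcast(df)
after = df.memory_usage(deep=True).sum()
print(f"{before / 1e6:.2f} MB -> {after / 1e6:.2f} MB")
```

Whether the savings carry through depends on how TimeSeriesDataSet converts the columns internally, so it is worth measuring memory before and after constructing it.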

@jdb78 Using smaller datasets, I noticed that RAM consumption spikes very high during construction and then drops again afterwards. Do you have any idea which operation in TimeSeriesDataSet causes this? Do you think it could be optimized to use less RAM?

nicocheh · Jun 16 '22 22:06
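To see where that spike happens, one option is to sample the process RSS while the dataset is being constructed. A rough sketch using memory_profiler; the frame, column names, and encoder/prediction lengths are stand-ins for the real setup:

```python
# Sketch: sample process RSS (in MiB) while a TimeSeriesDataSet is built,
# so the transient peak during construction becomes visible.
import pandas as pd
from memory_profiler import memory_usage   # pip install memory-profiler
from pytorch_forecasting import TimeSeriesDataSet

# tiny stand-in frame; replace with the real data (column names are placeholders)
df = pd.DataFrame({
    "series_id": ["a"] * 200,
    "time_idx": list(range(200)),
    "value": [float(i) for i in range(200)],
})

def build_dataset():
    return TimeSeriesDataSet(
        df,
        time_idx="time_idx",
        target="value",
        group_ids=["series_id"],
        max_encoder_length=48,
        max_prediction_length=12,
    )

# memory_usage runs the callable and samples RSS every `interval` seconds
samples = memory_usage((build_dataset, (), {}), interval=0.1)
print(f"baseline ~{samples[0]:.0f} MiB, peak ~{max(samples):.0f} MiB")
```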

I have the same problem

Fryman420 · Jul 03 '22 00:07

I am facing the same problem and am not sure why, even after decreasing max_encoder_length. I noticed that once the first epoch finished, memory usage increased dramatically.

binhna · Sep 07 '23 15:09
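If memory only climbs once training starts, logging the process RSS at each epoch boundary can help localize the growth. A rough sketch with a PyTorch Lightning callback, assuming psutil is installed; the MemoryMonitor class is made up for illustration:

```python
# Sketch: log process RSS at the end of every training epoch to localize the growth.
# Assumes pytorch_lightning (used under the hood by pytorch-forecasting) and psutil.
import os

import psutil
import pytorch_lightning as pl


class MemoryMonitor(pl.Callback):
    def on_train_epoch_end(self, trainer, pl_module):
        rss_gb = psutil.Process(os.getpid()).memory_info().rss / 1e9
        print(f"epoch {trainer.current_epoch}: RSS ~{rss_gb:.2f} GB")


# pass it to the trainer alongside the usual arguments:
# trainer = pl.Trainer(max_epochs=5, callbacks=[MemoryMonitor()])
```

If RSS keeps climbing epoch over epoch, one common suspect with large in-memory datasets is the DataLoader worker processes, which can each end up holding their own copy of the data when num_workers > 0.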