
16G RAM is not enough?

Open Avocado5818 opened this issue 4 years ago • 6 comments

As the title says, I'm using 16 GB of RAM and the monitor shows memory usage at 100%. Does anyone have the same problem? How much RAM is enough? Thanks~

Avocado5818 avatar May 07 '20 09:05 Avocado5818

The dataset is pretty large, so you may want to subsample the data as an additional preprocessing step and update the input size in the config file to match. Try 256 x 512 or 128 x 256 if 512 x 1024 is too large.

Justin-Tan avatar Jun 09 '20 06:06 Justin-Tan
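As a hedged illustration of the subsampling suggested above: halving each spatial dimension (512 x 1024 down to 256 x 512) can be done with naive nearest-neighbour striding. In practice you would use an image library's resize function; this pure-Python sketch (the `subsample` helper is hypothetical, not part of the repo) just shows the idea:

```python
def subsample(image, factor=2):
    """Keep every `factor`-th row and column of a 2-D pixel grid.

    `image` is a list of rows, each a list of pixel values, so a
    512 x 1024 input with factor=2 becomes 256 x 512.
    """
    return [row[::factor] for row in image[::factor]]


# A tiny 4 x 4 example grid: pixel value encodes (row, col).
img = [[r * 10 + c for c in range(4)] for r in range(4)]
small = subsample(img)  # 2 x 2: rows 0 and 2, columns 0 and 2
```

Real preprocessing would average or interpolate rather than drop pixels, but the memory saving is the same: a factor-2 subsample quarters the per-image footprint.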

Hello, thanks for your reply. I ran your code, but the machine's memory gradually fills up until the program is terminated. Could the problem be TensorFlow caching the dataset? When I ported the code to PyTorch the problem went away, but the performance is worse than the original.

Avocado5818 avatar Jun 09 '20 07:06 Avocado5818

I ran into the same problem: the machine's memory gradually fills up completely.

wsxtyrdd avatar Aug 22 '20 02:08 wsxtyrdd

You can delete the line `dataset = dataset.cache()` in data.py to avoid this problem.

wsxtyrdd avatar Aug 22 '20 09:08 wsxtyrdd
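For context: `tf.data.Dataset.cache()` called with no filename argument keeps every element it has produced in host memory, so RAM usage grows over the first epoch until the entire dataset is resident. The effect can be shown with a hedged pure-Python analogy (the function names below are illustrative, not from data.py):

```python
import itertools


def stream_samples(n):
    """Simulate lazily decoding n samples from disk, one at a time."""
    for i in range(n):
        yield [i] * 1024  # stand-in for one decoded image


def cached_pipeline(n):
    # Analogue of dataset.cache(): materialise every sample in RAM.
    # Memory use grows linearly with the dataset size.
    return list(stream_samples(n))


def streaming_pipeline(n, batch=4):
    # Without the cache, only one batch is resident at a time.
    it = stream_samples(n)
    while True:
        chunk = list(itertools.islice(it, batch))
        if not chunk:
            return
        yield chunk
```

Removing the cache trades repeated disk reads and decoding per epoch for bounded memory, which is usually the right trade-off when the dataset does not fit in RAM.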

Thanks for your answer. It really helped me.

RandomCoins avatar Sep 17 '20 07:09 RandomCoins

> Hello, thanks for your reply. I ran your code, but the machine's memory gradually fills up until the program is terminated. Could the problem be TensorFlow caching the dataset? When I ported the code to PyTorch the problem went away, but the performance is worse than the original.

Hello, I've been trying to use the code recently, but the current TensorFlow 2.x releases are not compatible with this 1.x codebase. Could I borrow your PyTorch version of the code? Thanks a lot! My email is [email protected]

WenBingo avatar Jul 25 '23 03:07 WenBingo