
Loading a Big Dataset Needs Too Much RAM

Open · fahadh4ilyas opened this issue 1 year ago • 1 comment

I have a server with 8 GPUs and 1 TB of RAM. My parquet dataset is 25 GB (5 million rows). When I use deepspeed to load the data, every process rank loads the dataset, and each needs more than 150 GB of RAM just to load the parquet table and convert it to a dictionary of numpy arrays. So the total memory usage just for loading the dataset is 150 × 8 = 1200 GB of RAM.

Is there a way to load the dataset only on the first rank and then share it with the other ranks?

fahadh4ilyas · Nov 21 '24 17:11
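
For context, a minimal sketch of the per-rank loading pattern described above, assuming a pyarrow-based loader (the actual openchat loading code may differ). Because deepspeed launches one process per GPU, each of the 8 processes runs this independently and holds its own fully materialized copy:

```python
import pyarrow.parquet as pq

def load_dataset(path):
    # Each deepspeed rank executes this on its own, so the decoded table
    # is held once per process (8x on an 8-GPU node).
    table = pq.read_table(path)  # ~25 GB on disk, much larger once decoded
    # Converting every column into in-memory numpy structures is what
    # inflates the footprint to >150 GB per process.
    return {name: table.column(name).to_numpy() for name in table.column_names}
```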

Thanks for reporting. This is a known issue: a shared-memory dataloader is not implemented yet. We're working on sharing the dataset across ranks via MPI or multiprocessing shared memory, which will take only 1x the RAM.

imoneoi · Dec 18 '24 21:12
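
A minimal single-node sketch of the multiprocessing shared-memory idea mentioned in the reply: only local rank 0 materializes the arrays, and the other ranks attach to the same physical memory. It assumes torch.distributed has already been initialized by deepspeed and that LOCAL_RANK is set; `load_parquet_as_numpy` and the segment name are hypothetical placeholders, not openchat APIs.

```python
import os
import numpy as np
import torch.distributed as dist
from multiprocessing import shared_memory

def load_shared(parquet_path, shm_name="openchat_tokens"):
    local_rank = int(os.environ.get("LOCAL_RANK", "0"))

    if local_rank == 0:
        # Only one process per node pays the full loading cost.
        arr = load_parquet_as_numpy(parquet_path)  # hypothetical loader
        shm = shared_memory.SharedMemory(create=True, size=arr.nbytes, name=shm_name)
        shared = np.ndarray(arr.shape, dtype=arr.dtype, buffer=shm.buf)
        shared[:] = arr  # single copy into the shared segment
        meta = [(arr.shape, arr.dtype.str)]
        del arr
    else:
        shm, shared, meta = None, None, [None]

    # Every rank learns the shape/dtype, then waits until the segment exists.
    dist.broadcast_object_list(meta, src=0)
    dist.barrier()

    if local_rank != 0:
        shape, dtype = meta[0]
        shm = shared_memory.SharedMemory(name=shm_name)  # attach, no extra copy
        shared = np.ndarray(shape, dtype=np.dtype(dtype), buffer=shm.buf)

    # Keep `shm` referenced for the lifetime of training; rank 0 should
    # call shm.unlink() during teardown.
    return shared, shm
```

A multi-node job would additionally need one local rank 0 per node to create its own segment, and a real tokenized dataset with variable-length sequences would typically be stored as a flat token buffer plus an offsets array, each in its own shared segment.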