scGPT
Multiple GPU support? (Fine-tuning)
Hi, does anybody know how to make use of multiple GPUs for fine-tuning? I have tried running the pipeline on a compute node with 2 and 3 GPUs available, but Weights & Biases shows that only one is being used.
Each epoch takes 8 hours on my dataset with a 0.75/0.25 train/test split, so I'm very interested in speeding up training without downsampling the data.
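For context, this is roughly what I would expect multi-GPU data parallelism to look like in plain PyTorch (just a minimal sketch, not scGPT-specific; `DummyModel` here is a placeholder for whatever `nn.Module` the fine-tuning script actually builds, and I don't know where in the scGPT pipeline this would need to go):

```python
import torch
import torch.nn as nn

# Stand-in module; in practice this would be the model constructed by
# the scGPT fine-tuning script.
class DummyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(512, 512)

    def forward(self, x):
        return self.linear(x)

model = DummyModel().to("cuda")
if torch.cuda.device_count() > 1:
    # Replicates the module on every visible GPU; each batch is split
    # along dim 0, run in parallel, and the outputs are gathered back
    # on the default device. The training loop itself stays unchanged.
    model = nn.DataParallel(model)

out = model(torch.randn(8, 512, device="cuda"))
```

Is something like this (or `DistributedDataParallel`) supposed to work with the fine-tuning code as-is, or does the pipeline assume a single device somewhere?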
Using flash-attention was also a no-go. I spent more than two days trying to set up an environment, but eventually something always breaks (CUDA, PyTorch, scGPT, or flash-attention version conflicts).