
Any solution for MultiGPU

Open gotzmann opened this issue 1 year ago • 9 comments

I'm new to the land of LLM fine-tuning, and after trying LLaMA-Factory and Axolotl I've started adopting Unsloth for better performance on memory-limited cards like the RTX A6000 with 48 GB.

But now I've reached the point where a single GPU is too limiting for me, and I need to find a way to use Unsloth with multiple GPUs or move elsewhere.

Could you share some information on whether it's possible to use Unsloth with Accelerate or DeepSpeed in a multi-GPU configuration? (See the sketch below for the kind of setup I mean.)

I'm happy to wait for the native multi-GPU support release; I just want to start using any workable solution right now.
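
To make the question concrete, here is a minimal sketch of the kind of multi-GPU run I have in mind, assuming a plain Transformers/TRL training script launched through Accelerate (the model name, dataset, and hyperparameters are just placeholders, and I don't know whether Unsloth's patched models can be dropped into this):

```python
# train.py - a sketch of a standard HF/TRL fine-tuning script, not Unsloth-specific.
# Launched across GPUs with something like:
#   accelerate launch --multi_gpu --num_processes 2 train.py
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from trl import SFTTrainer

model_name = "meta-llama/Llama-2-7b-hf"        # placeholder model
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype="auto")
tokenizer = AutoTokenizer.from_pretrained(model_name)

dataset = load_dataset("imdb", split="train")  # placeholder dataset with a "text" column

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=2048,
    args=TrainingArguments(
        output_dir="outputs",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,
        bf16=True,
        # deepspeed="ds_zero2.json",  # optionally point Accelerate at a DeepSpeed ZeRO config
    ),
)
trainer.train()
```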

gotzmann · Jan 19 '24

@gotzmann Thanks for using Unsloth again!! :) Sadly multi-GPU is not supported yet - we're working on it for a future release of the OSS version

danielhanchen · Jan 20 '24

@danielhanchen

Do you think it's possible with LLaMA-Factory? It seems like there's some hope there:

https://github.com/hiyouga/LLaMA-Factory/wiki/Performance-Comparison#nvidia-a100--2

gotzmann · Jan 20 '24

@gotzmann We're actively working with the Llama Factory team - for now it's in experimental mode: there will be intermittent seg faults at random, and I haven't confirmed myself whether the training losses match exactly, i.e. whether the results are even accurate / correct - so please wait patiently for news coming very soon :))

danielhanchen · Jan 20 '24

Is multi-GPU available or not?

MUZAMMILPERVAIZ · Feb 04 '24

@MUZAMMILPERVAIZ It's under active development. So currently no.

danielhanchen · Feb 04 '24

@danielhanchen Any news yet?

walmartbaggg · Apr 04 '24

Do you have any projected timelines for this?

kdcyberdude · May 01 '24

No, sorry, not currently - there are so many new model releases and bugs that I can't keep up, so I have to prioritize. Sadly Unsloth's team is just 2 people (me and my bro), and I primarily focus on algos - so please be patient! Apologies again!

danielhanchen · May 04 '24

@danielhanchen -- As one of Unsloth's primary strengths lies in fine-tuning, and multi-GPU is one of its most important features, I think we should prioritize this.

I can contribute to / assist with the multi-GPU backlog, thanks.

chintan-ushur · May 22 '24