Accelerate + DeepSpeed
System Info
All packages are at their latest versions.
Information
- [ ] The official example scripts
- [X] My own modified scripts
Tasks
- [ ] One of the scripts in the `examples/` folder of Accelerate or an officially supported `no_trainer` script in the `examples` folder of the `transformers` repo (such as `run_no_trainer_glue.py`)
- [ ] My own task or dataset (give details below)
Reproduction
I created a DeepSpeed config with `accelerate config`.
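For reference, a config produced by `accelerate config` for ZeRO stage 2 looks roughly like the fragment below. This is an illustrative sketch with assumed values (GPU count, precision, offload settings), not the reporter's actual file:

```yaml
# Illustrative accelerate config for DeepSpeed ZeRO stage 2 (assumed values)
compute_environment: LOCAL_MACHINE
distributed_type: DEEPSPEED
deepspeed_config:
  zero_stage: 2
  gradient_accumulation_steps: 1
  offload_optimizer_device: none
  offload_param_device: none
  zero3_init_flag: false
mixed_precision: fp16
num_machines: 1
num_processes: 2
machine_rank: 0
```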
I have tried to train a 4-bit quantized model (bitsandbytes) with DeepSpeed ZeRO stage 2 or 3 (I made many attempts with each stage).
However, it always fails with: "ValueError: `.to` is not supported for 4-bit or 8-bit models. Please use the model as it is, since the model has already been set to the correct devices and casted to the correct dtype"
Are Accelerate's DeepSpeed config and bitsandbytes incompatible?
How can this be solved?
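To illustrate where the error comes from: once a model is loaded in 4-bit or 8-bit via bitsandbytes, `transformers` marks it as quantized and refuses any later `.to(device)` call, such as the device placement Accelerate/DeepSpeed performs while wrapping the model. Below is a minimal self-contained sketch of that guard; the class name and flag are hypothetical stand-ins, not the real `transformers` implementation:

```python
# Minimal sketch of the guard behind the reported error.
# QuantizedModelSketch and is_quantized are illustrative stand-ins for
# the flag transformers sets after bitsandbytes 4-bit/8-bit loading.

class QuantizedModelSketch:
    def __init__(self, is_quantized: bool):
        self.is_quantized = is_quantized

    def to(self, device):
        if self.is_quantized:
            # Any later .to() -- including the one issued during
            # DeepSpeed/Accelerate model preparation -- raises here.
            raise ValueError(
                "`.to` is not supported for 4-bit or 8-bit models. Please use "
                "the model as it is, since the model has already been set to "
                "the correct devices and casted to the correct dtype"
            )
        return self  # non-quantized models move between devices as usual


model = QuantizedModelSketch(is_quantized=True)
try:
    model.to("cuda:0")
except ValueError as e:
    print(f"raised: {e}")
```

This suggests the failure is not in the config file itself but in the wrapping step, where DeepSpeed tries to move an already-placed quantized model.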
Expected behavior
I want to train a 4-bit quantized model (bitsandbytes) with DeepSpeed ZeRO stage 2 or 3.