bitsandbytes
Merge LoRA into 405B
I am blocked from merging LoRA into LLaMA-3.1-405B quantized to INT8 with bitsandbytes; the specific details are below. Is there anything I can try? https://github.com/huggingface/peft/issues/2065#issue-2523808306
Error location: peft/utils/integrations.py, at the line: im, imt, SCim, SCimt, coo_tensorim = bnb.functional.double_quant(im)
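For context, a minimal sketch of the kind of merge that hits this code path (the model name and adapter path below are placeholders, not taken from the original report):

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import PeftModel

# Load the base model with bitsandbytes INT8 quantization.
base = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-3.1-405B",  # placeholder model id
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),
    device_map="auto",
    torch_dtype=torch.float16,
)

# Attach the trained LoRA adapter and merge it into the base weights.
# merge_and_unload() dequantizes each INT8 weight via
# peft/utils/integrations.py, which is where double_quant() is called.
model = PeftModel.from_pretrained(base, "path/to/lora-adapter")  # placeholder adapter path
merged = model.merge_and_unload()
```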
Hi @junzhang-zj, can you let us know what versions of bitsandbytes, transformers, and PEFT you are using?
@matthewdouglas I have tried these versions: bitsandbytes (0.43.3), transformers (4.44.2/4.43.3), and peft (0.12.0).
Any updates on this issue?
This should be resolved by PEFT#2245 and bitsandbytes 0.45.0. Feel free to raise this again if necessary.
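As a quick sanity check before retrying the merge, the installed versions can be inspected like this (the exact PEFT release that ships #2245 is not stated above, so check that PR against your installed version):

```python
import bitsandbytes
import peft
import transformers

# Verify the installed versions include the fix
# (bitsandbytes >= 0.45.0, plus a PEFT release containing PR #2245).
print("bitsandbytes:", bitsandbytes.__version__)
print("peft:", peft.__version__)
print("transformers:", transformers.__version__)
```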