
Training model results in 'MatmulLtState' object has no attribute 'memory_efficient_backward'

Open · r4stl1n opened this issue 2 years ago · 1 comment

Describe the bug

When attempting to train a model, a stack trace occurs referencing MatmulLtState. I believe this originates in the bitsandbytes library, but after trying multiple versions I was unable to resolve the issue.

Is there an existing issue for this?

  • [X] I have searched the existing issues

Reproduction

Pull down the latest repo and install all dependencies. Load any model and attempt to train it using any formatted dataset.

I tried with both the standard alpaca_data_cleaned and alpaca_format datasets, as well as a raw text file, and still get the same issue.

Screenshot

No response

Logs

Traceback (most recent call last):
  File "/home/machine/ai/text-generation-webui/modules/training.py", line 326, in do_train
    lora_model = get_peft_model(shared.model, config)
  File "/home/machine/miniconda3/envs/textgen/lib/python3.10/site-packages/peft/mapping.py", line 120, in get_peft_model
    return MODEL_TYPE_TO_PEFT_MODEL_MAPPING[peft_config.task_type](model, peft_config)
  File "/home/machine/miniconda3/envs/textgen/lib/python3.10/site-packages/peft/peft_model.py", line 662, in __init__
    super().__init__(model, peft_config, adapter_name)
  File "/home/machine/miniconda3/envs/textgen/lib/python3.10/site-packages/peft/peft_model.py", line 99, in __init__
    self.base_model = PEFT_TYPE_TO_MODEL_MAPPING[peft_config.peft_type](
  File "/home/machine/miniconda3/envs/textgen/lib/python3.10/site-packages/peft/tuners/lora.py", line 154, in __init__
    self.add_adapter(adapter_name, self.peft_config[adapter_name])
  File "/home/machine/miniconda3/envs/textgen/lib/python3.10/site-packages/peft/tuners/lora.py", line 161, in add_adapter
    self._find_and_replace(adapter_name)
  File "/home/machine/miniconda3/envs/textgen/lib/python3.10/site-packages/peft/tuners/lora.py", line 213, in _find_and_replace
    "memory_efficient_backward": target.state.memory_efficient_backward,
AttributeError: 'MatmulLtState' object has no attribute 'memory_efficient_backward'
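The traceback shows PEFT unconditionally reading `memory_efficient_backward` from bitsandbytes' `MatmulLtState`, an attribute that only exists in some bitsandbytes releases, so a version mismatch between the two libraries produces exactly this AttributeError. A minimal stdlib-only sketch of the failure mode and of a version-tolerant `getattr` read (`MatmulLtStateStub` is a hypothetical stand-in, not the real bitsandbytes class):

```python
class MatmulLtStateStub:
    """Hypothetical stand-in for bitsandbytes' MatmulLtState, modelling a
    release in which memory_efficient_backward does not exist."""
    pass

state = MatmulLtStateStub()

# Direct attribute access (what the failing PEFT line effectively does)
# raises AttributeError on such a release:
try:
    _ = state.memory_efficient_backward
except AttributeError as exc:
    print(f"reproduced: {exc}")

# A tolerant read with a default avoids the crash regardless of version:
flag = getattr(state, "memory_efficient_backward", False)
print(flag)  # False
```

This is only an illustration of why matched bitsandbytes/PEFT versions matter; the actual fix reported below was reinstalling the environment, not patching the library.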

System Info

OS: Ubuntu 22.04.2 LTS (x86_64)
RAM: 64 GB DDR5
GPU: MSI RTX 4090 (24 GB VRAM)

r4stl1n avatar May 11 '23 07:05 r4stl1n

Which bitsandbytes version do you have installed, and which PEFT? I don't think the problem is in this repo.

Ph0rk0z avatar May 11 '23 15:05 Ph0rk0z
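The versions Ph0rk0z asks about can be checked with a short standard-library sketch that returns None instead of crashing when a package is missing:

```python
from importlib import metadata

def installed_version(package: str):
    """Return the installed version string, or None if the package is absent."""
    try:
        return metadata.version(package)
    except metadata.PackageNotFoundError:
        return None

for pkg in ("bitsandbytes", "peft"):
    print(pkg, installed_version(pkg))
```

Comparing the printed versions against each library's changelog is usually enough to spot an incompatible pairing like the one in the traceback.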

Thanks for the response @Ph0rk0z, but it looks like after wiping the conda environment and pulling the latest code, the issue fixed itself. I did get another error regarding libbitsandbytes_cpu.so, but I resolved that by following the instructions found here:

https://github.com/oobabooga/text-generation-webui/issues/400

I am now able to train LoRAs without an error.

r4stl1n avatar May 12 '23 04:05 r4stl1n