text-generation-webui
Error when loading custom finetuned lora: AttributeError: 'NoneType' object has no attribute 'device'
Describe the bug
This is the same bug as https://github.com/tloen/alpaca-lora/issues/14.
Here's the fix I used to workaround this issue: https://github.com/tloen/alpaca-lora/issues/14#issuecomment-1471263165
See suggested fix below.
Is there an existing issue for this?
- [X] I have searched the existing issues
Reproduction
Use a custom-trained LoRA (such as llama-13b fine-tuned on the Alpaca dataset), load it with the webui, and the error appears.
Screenshot
No response
Logs
File "C:\Users\jason\miniconda3\envs\textgen\lib\site-packages\bitsandbytes\functional.py", line 1698, in transform
prev_device = pre_call(A.device)
AttributeError: 'NoneType' object has no attribute 'device'
System Info
Windows 11 Pro. RTX 4090.
Suggested fix for people encountering this bug:
Add device_map={'': 0} to line 18 of modules/LoRA.py, inside the PeftModel.from_pretrained call. This forces PEFT to load the model onto the GPU, and should work for most people with a single GPU. For users with multiple GPUs, feel free to edit device_map to your liking (see the sketch below).
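For readers who want to see the whole loading path in one place, here is a minimal standalone sketch of the kind of call this fix patches. The model name and LoRA path are placeholders for illustration, not the webui's actual values:
# Minimal sketch: load a base model plus a LoRA adapter entirely on GPU 0.
# "decapoda-research/llama-13b-hf" and "loras/my-alpaca-lora" are placeholders.
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM

base_model = AutoModelForCausalLM.from_pretrained(
    "decapoda-research/llama-13b-hf",  # placeholder base model
    load_in_8bit=True,
    torch_dtype=torch.float16,
    device_map={'': 0},                # keep every layer on GPU 0
)

model = PeftModel.from_pretrained(
    base_model,
    "loras/my-alpaca-lora",            # placeholder LoRA directory
    torch_dtype=torch.float16,
    device_map={'': 0},                # the argument this fix adds
)
# Multi-GPU users can pass a different map here, e.g. device_map='auto'
# or an explicit per-module mapping, instead of pinning everything to GPU 0.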
So to clarify, the change I had to apply was in generate.py:
model = PeftModel.from_pretrained(
model, "tloen/alpaca-lora-7b",
torch_dtype=torch.float16
)
change this to:
model = PeftModel.from_pretrained(
model, "tloen/alpaca-lora-7b",
torch_dtype=torch.float16,
device_map={'': 0}
)
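If you want to confirm that the adapter weights actually ended up on the GPU after this change, a quick sanity check like the following can help (just an illustrative snippet; model is the PeftModel returned above):
# Sanity check: every LoRA parameter should now report a CUDA device.
for name, param in model.named_parameters():
    if "lora_" in name:
        assert param.device.type == "cuda", f"{name} is still on {param.device}"
print("All LoRA parameters are on the GPU.")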
Weird, I don't have these lines in my generate.py in the venv.
I think that this has been fixed with all the recent updates to LoRA.py
@oobabooga sorry for the bad question, but how do we get these updates? Not sure which library to pip install.
Just git pull inside the text-generation-webui folder.
Sorry, I don't actually have this repo installed. From my research, it looks like the latest iteration of PEFT needs to be pulled down. Thanks for your help! I am going to get this repo once I am familiar enough with the ins and outs of this stuff.