lcqlalala

Results: 3 issues by lcqlalala

We newly defined a _**prompt tensor**_ in the `LlamaModel` class:

    self.prompt = torch.nn.parameter.Parameter(torch.randn(self.embed_dim), requires_grad=True)

When loading the LLaMA weights with

    model = LlamaForCausalLM.from_pretrained(
        base_model,
        load_in_8bit=True,
        torch_dtype=torch.float16,
        device_map=device_map,
    )

we get `ValueError: prompt is on...`
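The error message above is cut off, but a parameter that is added to the model class and has no matching entry in the checkpoint is a common source of trouble when `from_pretrained` is combined with `device_map`/8-bit loading. A minimal sketch of one possible workaround, attaching the prompt parameter after the weights are loaded instead of inside `LlamaModel.__init__`; `base_model` and `device_map` are placeholders for the values used above:

```python
import torch
from transformers import LlamaForCausalLM

base_model = "path/to/llama"  # placeholder for the checkpoint used in the issue
device_map = "auto"           # placeholder

# Load the pretrained weights first, so the loader only has to place
# parameters that actually exist in the checkpoint.
model = LlamaForCausalLM.from_pretrained(
    base_model,
    load_in_8bit=True,
    torch_dtype=torch.float16,
    device_map=device_map,
)

# Attach the trainable prompt tensor afterwards, on a real device, so the
# extra parameter never has to be resolved by the checkpoint loader.
embed_dim = model.config.hidden_size
model.model.prompt = torch.nn.Parameter(
    torch.randn(embed_dim, device=model.device, dtype=torch.float32),
    requires_grad=True,
)
```

How the prompt is then injected into the forward pass is a separate question; the sketch only shows that the parameter can be registered after loading.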

    Traceback (most recent call last):
      File "/mnt/ssd/lcq/qlora-main/qlora.py", line 791, in <module>
        train()
      File "/mnt/ssd/lcq/qlora-main/qlora.py", line 773, in train
        predictions = tokenizer.batch_decode(
      File "/mnt/ssd/lcq/conda_env/qlora/lib/python3.9/site-packages/transformers/tokenization_utils_base.py", line 3464, in batch_decode
        return [
      File "/mnt/ssd/lcq/conda_env/qlora/lib/python3.9/site-packages/transformers/tokenization_utils_base.py", ...
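The traceback is truncated before the exception itself, so the actual error is not visible here. Since the failure happens inside `tokenizer.batch_decode` on the evaluation predictions, one frequent culprit at this point in qlora.py is decoding arrays that still contain the `-100` ignore index. A hedged sketch of the usual guard, not taken from qlora.py (the helper name and the NumPy masking are illustrative, and it assumes `tokenizer.pad_token_id` is set):

```python
import numpy as np

def decode_predictions(tokenizer, predictions):
    # Replace the -100 ignore index with the pad token id before decoding;
    # batch_decode cannot map -100 to a vocabulary token.
    predictions = np.asarray(predictions)
    predictions = np.where(predictions != -100, predictions, tokenizer.pad_token_id)
    return tokenizer.batch_decode(predictions, skip_special_tokens=True)
```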