
AttributeError: 'DataParallel' object has no attribute 'device' when trying the Lora-for-sequence-classification-example

Open mrxiaohe opened this issue 2 months ago • 3 comments

System Info

  • transformers version: 4.39.3
  • Platform: Windows-10-10.0.19045-SP0
  • Python version: 3.8.12
  • Huggingface_hub version: 0.20.1
  • Safetensors version: 0.4.1
  • Accelerate version: 0.29.2
  • Accelerate config: not found
  • PyTorch version (GPU?): 2.2.2+cu121 (True)

I was trying the LoRA for sequence classification example (linked below under Reproduction), and when I called train(), I got the following error:

roberta_trainer.train()
  0%|          | 0/1905 [00:00<?, ?it/s]
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Users\User1\Anaconda3\envs\huggingface\lib\site-packages\transformers\trainer.py", line 1780, in train
    return inner_training_loop(
  File "C:\Users\User1\Anaconda3\envs\huggingface\lib\site-packages\transformers\trainer.py", line 2118, in _inner_training_loop
    tr_loss_step = self.training_step(model, inputs)
  File "C:\Users\User1\Anaconda3\envs\huggingface\lib\site-packages\transformers\trainer.py", line 3036, in training_step
    loss = self.compute_loss(model, inputs)
  File "<stdin>", line 8, in compute_loss
  File "C:\Users\User1\Anaconda3\envs\huggingface\lib\site-packages\torch\nn\modules\module.py", line 1688, in __getattr__
    raise AttributeError(f"'{type(self).__name__}' object has no attribute '{name}'")
AttributeError: 'DataParallel' object has no attribute 'device'
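
The compute_loss at <stdin> line 8 is the custom weighted-loss trainer I pasted in from the example. Reconstructed as best I can (I may be misremembering the exact class name, and the class weights below are placeholders for the values the example computes from the dataset), it reads the device off the model:

import torch
from transformers import Trainer

neg_weights, pos_weights = 1.0, 1.0  # placeholders for the class weights computed in the example

class WeightedCELossTrainer(Trainer):
    def compute_loss(self, model, inputs, return_outputs=False):
        labels = inputs.pop("labels")
        outputs = model(**inputs)
        logits = outputs.get("logits")
        # Class-weighted cross-entropy; this seems to be the line the traceback
        # points at, since model here can be a DataParallel wrapper
        loss_fct = torch.nn.CrossEntropyLoss(
            weight=torch.tensor([neg_weights, pos_weights],
                                device=model.device,
                                dtype=logits.dtype)
        )
        loss = loss_fct(logits.view(-1, self.model.config.num_labels),
                        labels.view(-1))
        return (loss, outputs) if return_outputs else loss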

I wonder how this problem can be fixed.
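
My guess is that with more than one GPU visible, the Trainer wraps the model in torch.nn.DataParallel, and the wrapper itself does not expose a .device attribute (only the underlying model.module does), hence the AttributeError. A sketch of a possible workaround, assuming the failing line is the weight tensor construction above, is to take the device and dtype from the logits, which exist either way:

# Sketch of a possible workaround: derive the device from the logits rather than
# the model, so it works with or without the DataParallel wrapper
loss_fct = torch.nn.CrossEntropyLoss(
    weight=torch.tensor([neg_weights, pos_weights],
                        device=logits.device,
                        dtype=logits.dtype)
)
# Alternative: unwrap the model first, e.g.
# base_model = model.module if hasattr(model, "module") else model
# device = base_model.device

Is that the right fix, or is there a supported way to run this example on multiple GPUs? Thanks!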

Who can help?

No response

Information

  • [ ] The official example scripts
  • [ ] My own modified scripts

Tasks

  • [ ] An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • [ ] My own task or dataset (give details below)

Reproduction

I ran the code in the example exactly as written, with no modifications:

Lora-for-sequence-classification-with-Roberta-Llama-Mistral

Expected behavior

I expected fine-tuning to begin.

mrxiaohe · Apr 24 '24 02:04