
AttributeError: 'BartConfig' object has no attribute '_to_dict_new'

LRY1994 opened this issue 3 years ago

Environment info

  • adapter-transformers version: 3.0.0
  • Platform: Ubuntu
  • Python version: 3.8
  • PyTorch version (GPU?): 1.11.0
  • Using GPU in script?: yes
  • Using distributed or parallel set-up in script?: no

Information

Model I am using: BartForConditionalGeneration

Language I am using the model on: English

Adapter setup I am using (if any): none

The problem arises when saving and re-loading the full model:

  • torch.save(self.model, self.args.best_model_dir + "model.bin")
  • model = torch.load(self.args.best_model_dir + "model.bin")

Loading then fails with:

  Traceback (most recent call last):
    File "/home/simon/桌面/closed-book-prompt-qa/src/test.py", line 3, in <module>
      model = torch.load("src/temp/model_1653153641/model.bin")
    File "/home/simon/anaconda3/envs/qa/lib/python3.8/site-packages/torch/serialization.py", line 712, in load
      return _load(opened_zipfile, map_location, pickle_module, **pickle_load_args)
    File "/home/simon/anaconda3/envs/qa/lib/python3.8/site-packages/torch/serialization.py", line 1046, in _load
      result = unpickler.load()
    File "/home/simon/anaconda3/envs/qa/lib/python3.8/site-packages/transformers/configuration_utils.py", line 253, in __getattribute__
      return super().__getattribute__(key)
  AttributeError: 'BartConfig' object has no attribute '_to_dict_new'
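For context on why this can break across library versions (an illustration, not the library's actual code): torch.save on a whole model pickles the instance state together with a reference to each class, and at load time the objects are rebuilt against whatever library version is currently installed. If the class code changed between save and load, new methods can run against old instance state and hit missing attributes. A minimal pure-Python sketch of that failure mode — the Config class and the _to_dict_new name here are made up to mirror the error above:

```python
import pickle

# Version "at save time": a plain config class.
class Config:
    def to_dict(self):
        return dict(self.__dict__)

cfg = Config()
cfg.vocab_size = 50265
blob = pickle.dumps(cfg)  # stores the instance state + a *name reference* to Config

# Simulate upgrading the library: Config is redefined with different internals.
class Config:  # noqa: F811  (redefinition on purpose)
    def to_dict(self):
        # The new code path relies on a helper the old pickled state never had.
        return self._to_dict_new()

restored = pickle.loads(blob)  # rebuilt against the NEW class, with OLD state
try:
    restored.to_dict()
except AttributeError as e:
    print(e)  # 'Config' object has no attribute '_to_dict_new'
```

This mirrors the traceback above: the pickled checkpoint carries old state, while attribute lookups go through the upgraded library code.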

Expected behavior

The model should load successfully for my downstream task.

LRY1994 avatar May 22 '22 02:05 LRY1994

When I downgrade to version 2.3.0, it works.

LRY1994 avatar May 22 '22 02:05 LRY1994

Hey @LRY1994, this behavior is indeed introduced by a change in v3.x of the library. The recommended way of saving and loading model checkpoints with (adapter-)transformers is via the save_pretrained() / from_pretrained() methods:

model.save_pretrained(self.args.best_model_dir)
...
model = BartForConditionalGeneration.from_pretrained(self.args.best_model_dir)

We'll look into fixing the issue with torch.save()/torch.load() that you described.
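In the meantime, a generic PyTorch workaround (standard practice, not specific to adapter-transformers) is to save only the state_dict rather than the whole model object: the checkpoint then contains just tensors, no pickled library classes, so it stays loadable across library upgrades as long as you can re-instantiate the architecture first. A small sketch with a toy module standing in for the BART model:

```python
import io
import torch

# Saving the state_dict stores only tensors, not pickled class objects.
net = torch.nn.Linear(4, 2)

buf = io.BytesIO()  # stands in for a checkpoint file on disk
torch.save(net.state_dict(), buf)

buf.seek(0)
fresh = torch.nn.Linear(4, 2)        # re-instantiate the architecture first
fresh.load_state_dict(torch.load(buf))

assert torch.equal(net.weight, fresh.weight)
```

For the case in this issue, the same pattern would be torch.save(model.state_dict(), path), then from_pretrained() (or re-building the model) followed by load_state_dict().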

calpt avatar May 23 '22 12:05 calpt

This issue should be fixed with #406 and the next adapter-transformers release (v3.1.0).

calpt avatar Aug 24 '22 13:08 calpt