AttributeError: 'BartConfig' object has no attribute '_to_dict_new'
Environment info
- adapter-transformers version: 3.0.0
- Platform: Ubuntu
- Python version: 3.8
- PyTorch version (GPU?): 1.11.0
- Using GPU in script?: yes
- Using distributed or parallel set-up in script?: no
Information
Model I am using : BartForConditionalGeneration
Language I am using the model on: English
Adapter setup I am using (if any): none
The problem arises when using:

```python
torch.save(self.model, self.args.best_model_dir + "model.bin")
model = torch.load(self.args.best_model_dir + "model.bin")
```
```
Traceback (most recent call last):
  File "/home/simon/桌面/closed-book-prompt-qa/src/test.py", line 3, in <module>
    model = torch.load("src/temp/model_1653153641/model.bin")
  File "/home/simon/anaconda3/envs/qa/lib/python3.8/site-packages/torch/serialization.py", line 712, in load
    return _load(opened_zipfile, map_location, pickle_module, **pickle_load_args)
  File "/home/simon/anaconda3/envs/qa/lib/python3.8/site-packages/torch/serialization.py", line 1046, in _load
    result = unpickler.load()
  File "/home/simon/anaconda3/envs/qa/lib/python3.8/site-packages/transformers/configuration_utils.py", line 253, in __getattribute__
    return super().__getattribute__(key)
AttributeError: 'BartConfig' object has no attribute '_to_dict_new'
```
Expected behavior
The model should load successfully for my downstream task. When I downgrade to version 2.3.0, it works.
Hey @LRY1994, this behavior is indeed introduced by a change in v3.x of the library. The recommended way of saving and loading model checkpoints with (adapter-)transformers is via the `save_pretrained()` / `from_pretrained()` methods:
```python
model.save_pretrained(self.args.best_model_dir)
...
model = BartForConditionalGeneration.from_pretrained(self.args.best_model_dir)
```
We'll look into fixing the issue with `torch.save()`/`torch.load()` that you described.
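For context, the underlying failure mode can be reproduced with plain `pickle` and no torch at all: pickle stores only a *reference* to each class, so a saved object is rebuilt against whatever class definition exists at load time. Below is a minimal sketch with hypothetical stand-in classes (not the actual transformers API):

```python
import pickle

# Stand-in for the config class as defined in the "old" library version.
class Config:
    def __init__(self):
        self.value = 1

    def to_dict(self):
        return {"value": self.value}

# "torch.save(model, ...)" pickles the whole object under the old version.
blob = pickle.dumps(Config())

# Simulate upgrading the library: the same class name now has new
# internals (as BartConfig gained between v2.x and v3.x).
class Config:
    def __init__(self):
        self.value = 1
        self._extra = 2          # new internal attribute added in "v3"

    def to_dict(self):
        return {"value": self.value, "extra": self._extra}

# "torch.load(...)" under the new version: the object is re-created from
# the old pickled state, but bound to the new class. __init__ never runs,
# so the new attribute is missing.
restored = pickle.loads(blob)

try:
    restored.to_dict()
    failed = False
except AttributeError as exc:
    failed = True
    print("AttributeError:", exc)
```

This is the same failure mode as the `_to_dict_new` error above, and why `save_pretrained()`/`from_pretrained()` (which serialize weights and a config file, not pickled Python objects) are more robust across library versions.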
This issue should be fixed with #406 and the next adapter-transformers release (v3.1.0).