`.to_dict` does not correctly serialize `torch.dtype` in some cases (e.g., vision models)
System Info
- transformers version: 4.29.1
- Platform: Windows-10
- Python version: 3.8.3
- Huggingface_hub version: 0.14.1
- Safetensors version: 0.3.1
- PyTorch version (GPU?): 2.0.0+cu117 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using GPU in script?: no
- Using distributed or parallel set-up in script?: no
Who can help?
@sgugger @ArthurZucker @amyeroberts
Information
- [ ] The official example scripts
- [X] My own modified scripts
Tasks
- [ ] An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
Reproduction
Running:
```python
import json

from transformers import AutoConfig

json_data = AutoConfig.from_pretrained('openai/clip-vit-base-patch16').to_dict()
json.dumps(json_data, indent=4)
```
Results in:
```
TypeError: Object of type dtype is not JSON serializable
```
I have identified this problem with the following models:
- clip
- sam
- vision-encoder-decoder
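As a stopgap until this is fixed, a caller can stringify the dtypes themselves. This is my own sketch (the `stringify_dtypes` helper is hypothetical, not a transformers API); it mirrors what `PretrainedConfig.dict_torch_dtype_to_str` does:

```python
import json

import torch
from transformers import AutoConfig

def stringify_dtypes(d):
    # Hypothetical helper (not a transformers API): recursively replace
    # torch.dtype values with their short string form, e.g. torch.float32 -> "float32".
    for key, value in d.items():
        if isinstance(value, torch.dtype):
            d[key] = str(value).split(".")[1]
        elif isinstance(value, dict):
            stringify_dtypes(value)
    return d

json_data = AutoConfig.from_pretrained('openai/clip-vit-base-patch16').to_dict()
print(json.dumps(stringify_dtypes(json_data), indent=4))
```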
Expected behavior
torch dtypes should be converted to a string. I believe this is due to these configs redefining their `to_dict` method without calling `dict_torch_dtype_to_str` on the top-level object:
https://github.com/huggingface/transformers/blob/de9255de27abfcae4a1f816b904915f0b1e23cd9/src/transformers/models/clip/configuration_clip.py#L397-L408
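For reference, here is a minimal sketch of what the fix could look like, based on my reading of the linked override (an assumption on my part, not the merged patch):

```python
import copy

# Body of the suspected fix, to live on CLIPConfig (and the other composite configs):
def to_dict(self):
    output = copy.deepcopy(self.__dict__)
    output["text_config"] = self.text_config.to_dict()
    output["vision_config"] = self.vision_config.to_dict()
    output["model_type"] = self.__class__.model_type
    # Missing step in the current override: convert any top-level torch.dtype
    # to its string form, as PretrainedConfig.to_dict does.
    self.dict_torch_dtype_to_str(output)
    return output
```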
Hey! Indeed, the `PretrainedConfig` class calls `dict_torch_dtype_to_str`, and since `text_config` and `vision_config` inherit from it, they serialize fine. The problem is the top level: the parent's `torch_dtype` attribute can still be set, and the overridden method builds its dict directly from `__dict__` rather than going through `PretrainedConfig.to_dict()`, so the conversion never runs on it. Thanks for reporting!
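A quick way to see the asymmetry (a sketch, assuming transformers 4.29): the sub-config serializes fine, while the composite config fails once `torch_dtype` is set on it.

```python
import json

import torch
from transformers import CLIPConfig

config = CLIPConfig()
config.torch_dtype = torch.float16  # set on the top-level composite config

json.dumps(config.text_config.to_dict())  # works: PretrainedConfig.to_dict stringifies dtypes
json.dumps(config.to_dict())              # TypeError: Object of type dtype is not JSON serializable
```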
The configs should be automatically tested IMO; this is currently not the case. It seems that for blip, only the text config is tested, which is why this does not fail there. 10 or more models are affected (mostly those with `is_composition=True`); see the test sketch below.
I'll open a PR to fix this
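A minimal sketch of the kind of automated check meant here (a hypothetical test, not necessarily what ultimately landed in the repo): every composite config's `to_dict()` output should stay JSON-serializable once `torch_dtype` is set at the top level.

```python
import json

import pytest
import torch
from transformers import CLIPConfig, SamConfig

@pytest.mark.parametrize("config_cls", [CLIPConfig, SamConfig])
def test_to_dict_is_json_serializable(config_cls):
    config = config_cls()  # default sub-configs
    config.torch_dtype = torch.float16  # the top-level attribute the overrides forget to convert
    json.dumps(config.to_dict())  # raises TypeError if a torch.dtype leaks through
```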
Commenting to prevent it being closed as stale.
Yep, sorry! I'll try to get to the original fix, taking the comment into account.
This was fixed by #25237