AutoModel(s) do not respect the `revision` flag while loading custom models
System Info
- `transformers` version: 4.21.1
- Platform: macOS-12.4-arm64-arm-64bit
- Python version: 3.10.5
- Huggingface_hub version: 0.8.1
- PyTorch version (GPU?): 1.12.1 (False)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using GPU in script?: no
- Using distributed or parallel set-up in script?: no
Who can help?
No response
Information
- [ ] The official example scripts
- [ ] My own modified scripts
Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
Reproduction
from transformers import AutoModelForImageClassification

m = AutoModelForImageClassification.from_pretrained(
    "sgugger/custom-resnet50d",
    trust_remote_code=True,
    revision="ed94a7c6247d8aedce4647f00f20de6875b5b292",
)
# It will print:
# Explicitly passing a `revision` is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.
I stepped through the code and observed that `AutoConfig.from_pretrained` here swallows the `revision` from kwargs, so by line 433 it is no longer present when the model itself is loaded. I believe the same issue applies to `use_auth_token`.
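To make the failure mode concrete, here is a minimal, self-contained sketch (these are illustrative stand-in functions, not the actual transformers internals): when the config-loading step pops `revision` out of the shared kwargs, the later model-loading step never sees the pinned revision and silently falls back to the default.

```python
def load_config(**kwargs):
    # The config loader consumes `revision`, mutating its local kwargs copy.
    revision = kwargs.pop("revision", None)
    return {"revision": revision}, kwargs  # leftover kwargs passed onward

def load_model(**kwargs):
    # By now `revision` is gone, so the model loader falls back to "main".
    return kwargs.get("revision", "main")

kwargs = {"revision": "ed94a7c6", "trust_remote_code": True}
config, remaining = load_config(**kwargs)
print(config["revision"])      # the config was pinned: "ed94a7c6"
print(load_model(**remaining)) # the model was not: "main"
```

The fix would be to forward the popped `revision` (and `use_auth_token`) explicitly to the model-loading call rather than relying on them surviving in kwargs.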
Expected behavior
I think the `revision` should propagate to both the configuration and the model.
cc @sgugger
Thanks for flagging! The PR linked above should solve this.
Appreciate the quick turnaround :)