
AutoModel(s) do not respect the `revision` flag while loading custom models


System Info

  • transformers version: 4.21.1
  • Platform: macOS-12.4-arm64-arm-64bit
  • Python version: 3.10.5
  • Huggingface_hub version: 0.8.1
  • PyTorch version (GPU?): 1.12.1 (False)
  • Tensorflow version (GPU?): not installed (NA)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Using GPU in script?: no
  • Using distributed or parallel set-up in script?: no

Who can help?

No response

Information

  • [ ] The official example scripts
  • [ ] My own modified scripts

Tasks

  • [ ] An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • [ ] My own task or dataset (give details below)

Reproduction


from transformers import AutoModelForImageClassification
m = AutoModelForImageClassification.from_pretrained(
        "sgugger/custom-resnet50d",
        trust_remote_code=True,
        revision="ed94a7c6247d8aedce4647f00f20de6875b5b292"
)
# It will print:
# Explicitly passing a `revision` is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.

I stepped through the code and observed that the call to AutoConfig.from_pretrained here swallows the `revision` from kwargs, so by the time line 433 runs it is no longer there. I believe the same issue applies to `use_auth_token`.
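
For reference, here is a rough workaround sketch I have been trying (same repo and commit hash as the reproduction above; since the kwarg gets dropped inside the auto factory, this may not fully silence the warning, so treat it as an illustration rather than a fix):

from transformers import AutoConfig, AutoModelForImageClassification

repo = "sgugger/custom-resnet50d"
commit = "ed94a7c6247d8aedce4647f00f20de6875b5b292"

# Load the config with the pinned commit first...
config = AutoConfig.from_pretrained(repo, trust_remote_code=True, revision=commit)

# ...then pass both the config and the revision when loading the model, so the
# pin is supplied at every step instead of relying on kwargs being forwarded.
model = AutoModelForImageClassification.from_pretrained(
    repo,
    config=config,
    trust_remote_code=True,
    revision=commit,
)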

Expected behavior

I think the `revision` should propagate to both the configuration and the model.
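
To illustrate what I mean (a hypothetical helper, not the actual transformers internals or the linked fix): the hub-related kwargs could be copied before AutoConfig consumes them and re-attached afterwards, so the pinned revision is still available when the remote modeling code is fetched.

from transformers import AutoConfig


def load_config_keeping_hub_kwargs(model_id, **kwargs):
    # Hypothetical sketch: remember the hub-related kwargs before AutoConfig
    # consumes them.
    hub_kwargs = {k: kwargs[k] for k in ("revision", "use_auth_token") if k in kwargs}
    config, unused = AutoConfig.from_pretrained(
        model_id, return_unused_kwargs=True, **kwargs
    )
    # Put them back so the later model-loading step still sees the pinned commit.
    unused.update(hub_kwargs)
    return config, unused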

ankrgyl avatar Aug 09 '22 00:08 ankrgyl

cc @sgugger

LysandreJik avatar Aug 09 '22 08:08 LysandreJik

Thanks for flagging! The PR linked above should solve this.

sgugger avatar Aug 09 '22 15:08 sgugger

Appreciate the quick turnaround :)

ankrgyl avatar Aug 09 '22 16:08 ankrgyl