transformers
How to load local code for model with `trust_remote_code=True`?
Feature request
When I use a model with `trust_remote_code=True`, I cannot directly change the remote code, because every time I load the model it fetches the code from the remote hub again. How can I avoid that? Can I customize this code locally?
example:
model = AutoModelForSeq2SeqLM.from_pretrained('THUDM/glm-large-chinese', trust_remote_code=True)
model.forward(...) # I want to change this method's code
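For context, one general way to change a loaded model's `forward` without touching the downloaded source is to subclass (or wrap) the model class and override the method. A minimal sketch of the pattern, using a stand-in class rather than a real transformers model:

```python
# Sketch: override forward() by subclassing instead of editing downloaded code.
# BaseModel stands in for whatever class from_pretrained() returns; with a real
# transformers model you would subclass that class instead.

class BaseModel:
    def forward(self, x):
        return x * 2  # original behavior

class PatchedModel(BaseModel):
    def forward(self, x):
        out = super().forward(x)  # delegate to the original forward
        return out + 1            # add custom post-processing

model = PatchedModel()
print(model.forward(3))  # -> 7
```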
Motivation
The remote code does not always fit the user's needs, so users should have a way to change it.
Your contribution
If there is no other way, I can submit a PR.
Hi @LZY-the-boys, thanks for raising this issue.
If I've understood correctly, the question is how to load a customized version of the model from the THUDM/glm-large-chinese repo.
When running:
model = AutoModelForSeq2SeqLM.from_pretrained('THUDM/glm-large-chinese', trust_remote_code=True)
The model downloaded will be the one defined in THUDM/glm-large-chinese. trust_remote_code=True
simply says that it's OK for this model's code to be downloaded from the hub and run.
If you wish to load a customized model, save it out (either to the hub or locally) and pass the path to its location to from_pretrained,
e.g.:
model.save_pretrained('path/to/my/model') # Model with adapted methods
model = ModelClass.from_pretrained('path/to/my/model', trust_remote_code=True)
There's more information about using models with custom code here.
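For reference, the on-disk layout that lets `from_pretrained(path, trust_remote_code=True)` pick up your own modeling file looks roughly like the sketch below. The file and class names (`modeling_glm.py`, `GLMForSeq2Seq`) are illustrative placeholders, not the actual names used by the THUDM repo:

```python
# Sketch: a local custom-code model folder. The "auto_map" entry in
# config.json is what tells the Auto* classes which local file and class
# to load instead of fetching code from the Hub.
import json
import os
import tempfile

local_dir = tempfile.mkdtemp()

# 1) Your edited modeling code lives alongside the weights/config.
with open(os.path.join(local_dir, "modeling_glm.py"), "w") as f:
    f.write("# your customized model class, e.g. with a modified forward()\n")

# 2) config.json maps the auto class to "module.ClassName" in that file.
config = {
    "model_type": "glm",
    "auto_map": {
        "AutoModelForSeq2SeqLM": "modeling_glm.GLMForSeq2Seq",
    },
}
with open(os.path.join(local_dir, "config.json"), "w") as f:
    json.dump(config, f, indent=2)

print(sorted(os.listdir(local_dir)))  # -> ['config.json', 'modeling_glm.py']
```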
OK, model.save_pretrained
is indeed a way to customize the remote code in a local folder, though transformers copies these local files into a local modules
directory and runs them from there. Earlier I was editing the code in that temporary directory, which caused the confusion above.