transformers
I want to use 'from_pretrained' to read the '.safetensors' model file. What should I do?
System Info
- transformers version: 4.29.0.dev0
- Platform: Linux-6.2.0-20-generic-x86_64-with-glibc2.17
- Python version: 3.8.16
- Huggingface_hub version: 0.14.1
- Safetensors version: not installed
- PyTorch version (GPU?): 2.0.0+cu117 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using GPU in script?:
- Using distributed or parallel set-up in script?:
Who can help?
No response
Information
- [ ] The official example scripts
- [X] My own modified scripts
Tasks
- [ ] An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
- [X] My own task or dataset (give details below)
Reproduction
My code:

```python
llama_config = AutoConfig.from_pretrained(llama_path + '/config.json')
llama = AutoModelForCausalLM.from_pretrained(model_bytes, config=llama_config)
```
llama_path includes model.safetensors, config.json, and other config files.
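One likely issue in the snippet above: `from_pretrained` expects the model directory (or a Hub model id), not the path to config.json itself, and not an object like `model_bytes`. A minimal sketch of normalizing such a path; the helper name `model_dir_from` is my own, not part of transformers:

```python
import os

def model_dir_from(path):
    # from_pretrained wants the directory that holds config.json and
    # the weight file; if given the config.json path, use its parent.
    if os.path.basename(path) == "config.json":
        return os.path.dirname(path)
    return path
```

With the directory in hand, a single `AutoModelForCausalLM.from_pretrained(model_dir)` call loads both the config and the weights; the separate `AutoConfig` step is unnecessary.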
Expected behavior
I want to use 'from_pretrained' to read the '.safetensors' model file. What should I do?
AutoModelForCausalLM.from_pretrained(llama_path)
is enough.
I used your method and got an error: OSError: Error no file named pytorch_model.bin, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory pretrain_models/llama_7b.
Then your comment above was wrong:
llama_path include: model.safetensors, config.json and other config files.
If you have the model.safetensors file, from_pretrained will succeed. Unless you don't have safetensors installed, in which case you shouldn't have been able to produce that file with the conversion script in the first place; either way it's easily fixable with pip install safetensors.
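Since `from_pretrained` only looks for model.safetensors when the safetensors package is importable (and otherwise falls back to the other weight formats), you can check availability up front. A minimal sketch; the helper name `safetensors_installed` is my own:

```python
import importlib.util

def safetensors_installed():
    # True if the safetensors package can be imported; find_spec
    # checks for the module without actually importing it.
    return importlib.util.find_spec("safetensors") is not None
```

If this returns False, `pip install safetensors` should make the .safetensors file loadable.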
I installed safetensors and used the following code:

```python
AutoModelForCausalLM.from_pretrained(llama_path)
```

Then I got a new error: AttributeError: 'NoneType' object has no attribute 'get'. Could my transformers version be the reason? I installed with pip install git+https://github.com/huggingface/transformers rather than pip install transformers directly, because with the release install I have problems with from transformers import LlamaForCausalLM, LlamaTokenizer.
I'm sure the path contains the model.safetensors file.
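To rule out a path problem, it can help to check the directory for the weight filenames that the OSError above lists. A minimal sketch, assuming a local model directory; the helper name `find_weight_file` is my own:

```python
import os

# Weight filenames from_pretrained looks for in a local directory
# (this list mirrors the OSError message quoted earlier).
WEIGHT_FILES = (
    "model.safetensors",
    "pytorch_model.bin",
    "tf_model.h5",
    "model.ckpt.index",
    "flax_model.msgpack",
)

def find_weight_file(model_dir):
    # Return the first recognized weight file present in model_dir,
    # or None if none of them exist there.
    for name in WEIGHT_FILES:
        if os.path.isfile(os.path.join(model_dir, name)):
            return name
    return None
```

If this returns None for your llama_path, from_pretrained will raise the same OSError regardless of which packages are installed.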
Same Issue Here.
I want to use the model "wojtab/llava-7b-v0-4bit-128g" with from_pretrained().
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
Got a solution!
Check out AutoGPTQ.
@TheFaheem Sorry, may I know how to solve this problem?
Check it out Here => https://github.com/PanQiWei/AutoGPTQ