
Loading local GPTQ LLM from safetensors

tmwstw7 opened this issue 2 years ago • 2 comments

I'm trying to load TheBloke/Llama-2-7b-Chat-GPTQ from the local directory with the sample code provided here:

from ctransformers import AutoModelForCausalLM
llm = AutoModelForCausalLM.from_pretrained("./my_folder/TheBloke/Llama-2-7B-GPTQ", model_type="gptq") 

but it seems like .from_pretrained looks for a .bin model rather than .safetensors, so it fails to load the model with an error like "No model file found in directory". I tried passing use_safetensors, but as I saw from the method definition, it doesn't have that parameter. Can somebody guide me on how to load GPTQ LLMs from a local path?
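As a possible workaround (a sketch, not a confirmed fix for this issue): ctransformers' from_pretrained accepts a model_file argument, so you can locate the .safetensors file yourself and pass its path explicitly. The find_model_file helper below is hypothetical, not part of ctransformers:

```python
import os

def find_model_file(model_dir, extensions=(".safetensors", ".bin")):
    """Return the path of the first file in model_dir matching one of the extensions."""
    for name in sorted(os.listdir(model_dir)):
        if name.endswith(extensions):
            return os.path.join(model_dir, name)
    raise FileNotFoundError(f"No model file found in {model_dir}")

# Usage sketch (assumes the GPTQ weights live in this folder):
# from ctransformers import AutoModelForCausalLM
# model_file = find_model_file("./my_folder/TheBloke/Llama-2-7B-GPTQ")
# llm = AutoModelForCausalLM.from_pretrained(model_file, model_type="gptq")
```

Whether passing the file path directly sidesteps the directory scan depends on the ctransformers version, so treat this as an experiment rather than a guaranteed solution.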

tmwstw7 avatar Aug 14 '23 08:08 tmwstw7

Please post the full error message with stack trace and make sure you are using the latest version:

pip install ctransformers --upgrade

marella avatar Aug 15 '23 11:08 marella

Here is the error: [screenshot of the error message]

and the contents of the model folder: [screenshot of the folder listing]

ctransformers version: 0.2.17, installed with the [gptq] extra

tmwstw7 avatar Oct 23 '23 11:10 tmwstw7