
Running server gives error when using huggingface model

Open djaffer opened this issue 2 years ago • 3 comments

```
    850         return issubclass(cls.__origin__, self.__origin__)
    851     if not isinstance(cls, _GenericAlias):
--> 852         return issubclass(cls, self.__origin__)
    853     return super().__subclasscheck__(cls)

TypeError: issubclass() arg 1 must be a class
```
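For context, Python raises this exact TypeError whenever the first argument to `issubclass()` is not a class. A minimal, standalone reproduction (not specific to llama-cpp-python, just illustrating the error message):

```python
# issubclass() requires its first argument to be a class object;
# passing an instance instead raises the same TypeError seen above.
try:
    issubclass(42, int)  # 42 is an instance, not a class
except TypeError as exc:
    print(exc)  # → issubclass() arg 1 must be a class
```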

The manually installed package works with the model, but not with this one from Hugging Face:

https://huggingface.co/eachadea/ggml-vicuna-13b-4bit/resolve/main/ggml-vicuna-13b-4bit-rev1.bin

djaffer avatar Apr 08 '23 20:04 djaffer

Are you saying that you're trying to load the model from Hugging Face? Or that you downloaded it from Hugging Face?

That file is not in the Hugging Face format; it's GGML. What specifically are you running that's giving you the TypeError? This reminds me of an error I was getting when I didn't have the correct pydantic version.

https://github.com/abetlen/llama-cpp-python/pull/29#issuecomment-1498248609

MillionthOdin16 avatar Apr 11 '23 14:04 MillionthOdin16

Downloaded from Hugging Face.

```
pip install llama-cpp-python[server]
export MODEL=./models/7B/ggml-model.bin
python3 -m llama_cpp.server
```

Is the solution to install pydantic==1.10.7? I didn't get a chance to try it. I thought that's what this comment said: https://github.com/abetlen/llama-cpp-python/pull/29#issuecomment-1498248609.
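For anyone checking whether the pydantic pin is in effect, here's a quick stdlib-only sketch that prints the installed pydantic version without importing the package itself:

```python
# Report the installed pydantic version (the linked comment suggests
# pinning pydantic==1.10.7 fixed a similar TypeError).
from importlib.metadata import PackageNotFoundError, version

try:
    print(version("pydantic"))
except PackageNotFoundError:
    print("pydantic is not installed")
```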

djaffer avatar Apr 11 '23 16:04 djaffer

I'm not sure what the solution is. Did you try it and get it to work? I had a similar issue, and I think it fixed mine.

MillionthOdin16 avatar Apr 12 '23 05:04 MillionthOdin16

Can this issue be closed?

gjmulder avatar May 12 '23 17:05 gjmulder

No, I just left it. I didn't have much time to continue playing around with it.

djaffer avatar May 13 '23 04:05 djaffer

Please reopen when you have time.

gjmulder avatar May 15 '23 11:05 gjmulder