langchain-ask-pdf-local

No file named stable-vicuna-13B.ggml.q4_2.bin in linked Huggingface repo

[Open] uwts opened this issue 2 years ago · 2 comments

The readme instructs the user to download stable-vicuna-13B.ggml.q4_2.bin from a linked repo, but that file no longer appears in the repo.

uwts avatar May 12 '23 17:05 uwts

It turns out that llama.cpp was updated and the stable-vicuna models were re-quantised. I can't check right now whether it still works, but I have pinned the versions in this repo's requirements.txt, and the previous model files are available here: https://huggingface.co/TheBloke/stable-vicuna-13B-GGML/tree/previous_llama

Alternatively, you can try downloading one of the new q4* or q5* model files, installing the latest llama-cpp-python, and changing the model file name in app.py accordingly.

Whichever option you go with, please let me know whether it works so I can update the repository.
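For anyone following the alternative route, it amounts to a one-line change. A minimal sketch of what the model-path configuration in app.py might look like after the switch; the variable names, the `./models/` directory, and the q5_0 filename are assumptions (not taken from the repo's actual app.py), so substitute whichever q4*/q5* file you actually downloaded:

```python
# Hypothetical app.py excerpt; the real variable names may differ.
# Old filename, no longer present in the Hugging Face repo:
#   stable-vicuna-13B.ggml.q4_2.bin
# New re-quantised filename (assumed; use whichever q4*/q5* file you downloaded):
MODEL_FILE = "stable-vicuna-13B.ggmlv3.q5_0.bin"
MODEL_PATH = f"./models/{MODEL_FILE}"

# With the latest llama-cpp-python installed, the model would then be loaded as:
#   from llama_cpp import Llama
#   llm = Llama(model_path=MODEL_PATH)
print(MODEL_PATH)
```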

wafflecomposite avatar May 12 '23 19:05 wafflecomposite

This is their latest: stable-vicuna-13B.ggmlv3.q8_0.bin (12.9 GB)

LebToki avatar Jun 11 '23 14:06 LebToki