Are there any plans to add support for 13B and beyond?
I tested 13B and 30B without problems. Just download the models and start the program with the correct file, e.g.:
-m ggml-alpaca-30b-q4.bin
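A minimal launch sketch of the step above, with a file-existence check first. The `./chat` binary name is an assumption (it is what alpaca.cpp builds by default); adjust the binary and model path to your setup:

```shell
#!/bin/sh
# Check that the model file exists before launching (path is an example).
MODEL="ggml-alpaca-30b-q4.bin"
if [ -f "$MODEL" ]; then
    ./chat -m "$MODEL"    # assumed binary name from alpaca.cpp
else
    echo "model file $MODEL not found"
fi
```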
The README used to offer a choice between three models, but that section disappeared a couple of weeks ago. I found the 13B model but wasn't able to find the 30B; I'm still looking.
@supportend could you please link the bin files for 13B and 30B?
how many tokens/s are you getting with the larger models?
Download of the 30B model: https://huggingface.co/Pi3141/alpaca-lora-30B-ggml/tree/main. The q4_1 file does not work for me; I use the ggml-model-q4_0.bin file and renamed it to ggml-alpaca-30b-q4.bin.
It's the same file that was downloaded inside the Dalai docker container:
sudo docker compose run dalai md5sum /root/dalai/alpaca/models/30B/ggml-model-q4_0.bin && md5sum /daten/alpaca/ggml-alpaca-30b-q4.bin
7bc55515cb128cd2ded9403d35c25244 /root/dalai/alpaca/models/30B/ggml-model-q4_0.bin
7bc55515cb128cd2ded9403d35c25244 /daten/alpaca/ggml-alpaca-30b-q4.bin
My system is a notebook and I reduced the CPU frequency a little; one token takes over 600 ms.
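To answer the tokens/s question from that figure: 600 ms per token works out to roughly 1.7 tokens/s. A quick awk one-liner for the conversion (the 600 ms value is just the measurement above):

```shell
# Convert per-token latency (in ms) to tokens per second.
ms_per_token=600
awk -v ms="$ms_per_token" 'BEGIN { printf "%.2f tokens/s\n", 1000 / ms }'
```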
Could you please link the bin files for 13B?
I think this is it:
https://huggingface.co/Pi3141/alpaca-lora-13B-ggml/tree/main (again the q4_0 file).
There is a native version on the site too, but I did not download the q4_1 file or the native one.
Thanks a lot, I'll download it.