
No models work except the standard one - bad magic

NeiroNext opened this issue 1 year ago • 6 comments

It does not work with any model other than the one linked in the description. I downloaded four 13B models and got a "bad magic" error every time, even though their format matches what the description says is suitable for alpaca. I downloaded a 30B model: bad magic again. Even before finding the working 7B model, I had to download five non-working files.

Is this a bug, and will it be fixed in a future version? Why are these models described as suitable for alpaca when in reality they do not work? Could someone share a link to a working model, at least 13B? I have given up; nothing works except the standard one.

NeiroNext avatar Apr 05 '23 19:04 NeiroNext

I'm having the same issue, unfortunately, and I've made no progress either. I'm trying to use ggml-vicuna-7b-4bit.

srevill avatar Apr 10 '23 16:04 srevill

Where can I download those models? Why is nothing specified in the docs?

demian85 avatar Apr 10 '23 19:04 demian85

> Where can I download those models? Why is nothing specified in the docs?

I ended up figuring it out by following this guide: https://medium.com/@martin-thissen/vicuna-on-your-cpu-gpu-best-free-chatbot-according-to-gpt-4-c24b322a193a. You have to run the Python code it mentions to download the models in the correct format.

srevill avatar Apr 10 '23 21:04 srevill

I've got the same issue. The 7B model works via chat_mac.sh, but it can't see any model other than 7B. I've even tried renaming the 13B model the same way as the 7B one, but got "bad magic". Otherwise it looks for the 7B model and says:

"llama_model_load: loading model from 'ggml-alpaca-7b-q4.bin' - please wait ... llama_model_load: failed to open 'ggml-alpaca-7b-q4.bin' main: failed to load model from 'ggml-alpaca-7b-q4.bin'"

I also tried ./chat_mac -m alpaca-13b-ggml-q4_0-lora-merged/ggml-model-q4_0.bin but got the same "bad magic" error.

ASX320 avatar Apr 13 '23 10:04 ASX320

> I've got the same issue. I have 7B working via chat_mac.sh but it can't see other models except 7B. I've even tried renaming 13B in the same way as 7B but got "Bad magic". In other cases it searches for 7B model and says
>
> I also tried ./chat_mac -m alpaca-13b-ggml-q4_0-lora-merged/ggml-model-q4_0.bin but got the same "Bad magic error"

I found the check that produces "bad magic" in the source code and removed it, but that did not fix the situation. Unfortunately, a model in the exact format the loader expects is needed.

NeiroNext avatar Apr 13 '23 14:04 NeiroNext

> Does not work with any model other than the one that is attached in the description. I downloaded 4 pieces of 13b models, each time a bad magic error, although the format is the same in the description of the model for alpaca, I downloaded 30b Model - bad magic, before the working 7b model I also had to download 5 non-working pieces.
>
> Is this a bug, will it be fixed in the future, why are these models in the description suitable for alpaca, but in reality they do not work, is this a bug in the new version? Throw off a working model, at least 13b, I already gave up, nothing works except the standard one.

I found this commit: https://github.com/antimatter15/alpaca.cpp/pull/137/commits/af9ab4a38d5e3d4ff0dc4219fd1c5b68b0d689fc. It contains links to download the 7B, 13B, and 30B models. That helped me.

ASX320 avatar Apr 14 '23 11:04 ASX320