
65B Llama model doesn't work - bad magic??

Open Troceleng opened this issue 1 year ago • 3 comments

I downloaded the torrent of the available Llama models and converted the 65B model to .bin in order to use it with Alpaca. However, when I try to load the .bin file, it simply does not work: it says "llama_model_load: loading model from 'models/ggml-model-f16.bin' (bad magic)" and fails to load the model.

What should I do in this case? I'm very curious to try a 65B model with Alpaca.
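For context, "bad magic" means the loader opened the file but its first four bytes did not match the header alpaca.cpp expects, so the usual cause is converting the weights with a newer llama.cpp conversion script that writes a newer container format. Below is a minimal diagnostic sketch (not part of alpaca.cpp) for checking what header a .bin actually carries; the magic constants are assumptions taken from early llama.cpp sources, and it assumes a little-endian machine, matching how the converters write the header, so verify against the version you built:

```cpp
// check_magic.cpp - hypothetical helper, not part of alpaca.cpp.
// Reads the first four bytes of a model file and reports which
// GGML-family header they look like.
#include <cstdint>
#include <cstdio>

int main(int argc, char **argv) {
    if (argc < 2) {
        std::fprintf(stderr, "usage: %s <model.bin>\n", argv[0]);
        return 1;
    }
    std::FILE *f = std::fopen(argv[1], "rb");
    if (!f) {
        std::fprintf(stderr, "failed to open '%s'\n", argv[1]);
        return 1;
    }
    std::uint32_t magic = 0;
    if (std::fread(&magic, sizeof(magic), 1, f) != 1) {
        std::fprintf(stderr, "file too short to contain a header\n");
        std::fclose(f);
        return 1;
    }
    std::fclose(f);
    std::printf("magic = 0x%08x\n", magic);
    // Magic values below are assumptions from early llama.cpp sources.
    switch (magic) {
        case 0x67676d6c: std::puts("'ggml' - old unversioned format (what alpaca.cpp expects)"); break;
        case 0x67676d66: std::puts("'ggmf' - versioned format from newer convert scripts"); break;
        case 0x67676a74: std::puts("'ggjt' - newer mmap-friendly llama.cpp format"); break;
        default:         std::puts("unknown - probably not a GGML-family file"); break;
    }
    return 0;
}
```

If this reports anything other than the old unversioned header, re-converting the weights with tooling that matches the alpaca.cpp version you built is the likely fix.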

Troceleng avatar Apr 01 '23 21:04 Troceleng

I have the same problem with "gpt4-x-alpaca-native-13B-ggml", but to be honest I have no idea whether it's the correct model for this, or what the difference is between alpaca-native and alpaca-lora. The information is terribly scattered across a million places; I would really appreciate a proper guide describing the differences between the models.

LipcaCZ avatar Apr 03 '23 07:04 LipcaCZ

I found this link; it might help you, but I don't think that answer applies to macOS. Link: https://github.com/antimatter15/alpaca.cpp/issues/121

As for me, I have the 7B model working via chat_mac.sh, but it can't see any model other than 7B. I've even tried renaming the 13B model the same way as the 7B one but got "bad magic". Otherwise it just looks for the 7B model and says:

"llama_model_load: loading model from 'ggml-alpaca-7b-q4.bin' - please wait ... llama_model_load: failed to open 'ggml-alpaca-7b-q4.bin' main: failed to load model from 'ggml-alpaca-7b-q4.bin'".

I also tried ./chat_mac -m alpaca-13b-ggml-q4_0-lora-merged/ggml-model-q4_0.bin but got the same "bad magic" error.
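For what it's worth, those are two different failures: "failed to open" means the path didn't resolve (without -m the binary looks for ggml-alpaca-7b-q4.bin in the working directory), while "bad magic" means the file opened fine but its header didn't match. A simplified sketch of the loader's two early failure points, approximating how early llama.cpp-style loaders are structured rather than quoting the exact alpaca.cpp source:

```cpp
// Simplified sketch of the loader's two early failure points
// (approximate; not the verbatim alpaca.cpp source).
#include <cstdint>
#include <cstdio>

bool open_and_check(const char *path) {
    std::FILE *f = std::fopen(path, "rb");
    if (!f) {
        // "failed to open": wrong path or missing file. Renaming
        // another model to the expected name gets past this check...
        std::fprintf(stderr, "llama_model_load: failed to open '%s'\n", path);
        return false;
    }
    std::uint32_t magic = 0;
    std::fread(&magic, sizeof(magic), 1, f);
    if (magic != 0x67676d6c) { // 'ggml' (assumed expected value)
        // ...but "bad magic" then fires here, because the header
        // identifies a different (usually newer) file format.
        std::fprintf(stderr, "llama_model_load: invalid model file '%s' (bad magic)\n", path);
        std::fclose(f);
        return false;
    }
    std::fclose(f);
    return true;
}
```

So renaming a file only changes which check fails; it can't make an incompatibly formatted file load. Re-converting the weights with tooling that matches your alpaca.cpp version is the fix usually suggested.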

ASX320 avatar Apr 13 '23 11:04 ASX320

I started using "alpaca-turbo" instead of this, which works, but at least on my computer it's terribly slow. I think I'll give it a few weeks or months until the teething problems are sorted out and then start playing with it.

LipcaCZ avatar Apr 13 '23 11:04 LipcaCZ