Convert Bunny-v1.0-3B to GGUF
I have tested Bunny-v1.1-Llama-3-8B-V and Bunny-v1.0-3B in the HF demo, and their results are what I am looking for. However, llama.cpp does not currently support S2-Wrapper, so I want to convert Bunny-v1.0-3B to GGUF for use on edge devices (I have already tested Bunny-v1_0-4B.gguf and its results were not ideal).
To convert Bunny-v1_0-3B to GGUF, I followed the instructions on the GitHub page. However, when I execute the final step:
python ../../convert-hf-to-gguf.py Bunny-v1_0-3B
I encounter the error:
KeyError: "could not find any of: ['rms_norm_eps']"
along with several other keys reported missing from the config.
I think that the configs for Bunny-v1_0-3B and Bunny-v1_0-4B are different, which causes the error when loading the model. Could you please provide the config.json for Bunny-v1_0-3B or a solution to this issue?
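To compare the two configs before re-running the converter, a quick sanity check like the one below can list which fields a config.json is missing. The key list here is only an assumption based on the KeyError above; the exact set the converter looks up depends on which architecture class convert-hf-to-gguf.py selects for the model.

```python
import json

# Assumed subset of keys a llama-style text config usually carries;
# "rms_norm_eps" is the one named in the KeyError above.
EXPECTED_KEYS = [
    "rms_norm_eps",
    "hidden_size",
    "num_attention_heads",
    "num_hidden_layers",
    "vocab_size",
]

def missing_keys(config: dict, keys=EXPECTED_KEYS) -> list:
    """Return the expected keys absent from a parsed config.json dict."""
    return [k for k in keys if k not in config]

def check_config(path: str) -> list:
    """Load a config.json from disk and report its missing keys."""
    with open(path, "r", encoding="utf-8") as f:
        return missing_keys(json.load(f))

# Illustrative config dict (made-up values) that reproduces the error:
cfg = {
    "hidden_size": 2560,
    "num_attention_heads": 32,
    "num_hidden_layers": 32,
    "vocab_size": 32000,
}
print(missing_keys(cfg))  # → ['rms_norm_eps']
```

Running `check_config("Bunny-v1_0-3B/config.json")` and the same call on the 4B model's config would show whether the two differ in the keys the converter needs.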