MiniCPM
[Bad Case]: ERROR occurred when converting the MiniCPM3 model to GGUF format with llama.cpp, WHY?
Description / 描述
```
py convert_hf_to_gguf.py .\models\MiniCPM3-4B\ --outfile .\models\MiniCPM3-4B\CPM-4B-F16.gguf
INFO:hf-to-gguf:Loading model: MiniCPM3-4B
ERROR:hf-to-gguf:Model MiniCPM3ForCausalLM is not supported
```
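For context, `convert_hf_to_gguf.py` dispatches on the `architectures` field of the model's `config.json`; if that name (here `MiniCPM3ForCausalLM`) is not registered in the installed copy of the script, it reports exactly this "is not supported" error. A minimal sketch of reading that field (the helper name is illustrative, not part of llama.cpp):

```python
import json

def model_architectures(config_path):
    """Return the architecture names the converter dispatches on.

    convert_hf_to_gguf.py reads the "architectures" list from the model's
    config.json; an architecture missing from its registry produces
    "Model <name> is not supported". (Helper name is hypothetical.)
    """
    with open(config_path) as f:
        return json.load(f).get("architectures", [])
```

If the reported architecture is absent from your local script, updating llama.cpp to a release that registers it is the usual fix.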
How can I resolve this problem?
Case Explanation / 案例解释
No response
The official ollama build does not yet support MiniCPM3; please stay tuned.
Check here for the GGUF versions of MiniCPM3.