
Wrong `chat_format` documented for Moondream 2

joaopalmeiro opened this issue 7 months ago · 0 comments

Hi! 👋

I was testing the Server API with Moondream 2 as the model, and the documented `chat_format`, `moondream2`, threw an error:

Error code: 500 - {'error': {'message': "Invalid chat handler: moondream2 (valid formats: ['llama-2', 'llama-3', 'alpaca', 'qwen', 'vicuna', 'oasst_llama', 'baichuan-2', 'baichuan', 'openbuddy', 'redpajama-incite', 'snoozy', 'phind', 'intel', 'open-orca', 'mistrallite', 'zephyr', 'pygmalion', 'chatml', 'mistral-instruct', 'chatglm3', 'openchat', 'saiga', 'gemma', 'functionary', 'functionary-v2', 'functionary-v1', 'chatml-function-calling'])", 'type': 'internal_server_error', 'param': None, 'code': None}}
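
For context, the request looked roughly like this. This is only a minimal sketch of my setup: the server URL, model alias, and image URL below are placeholders, and the server itself was started with the documented `--chat_format moondream2`.

```python
from openai import OpenAI

# Point the OpenAI client at the local llama-cpp-python server.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

# Chat completion with an image, which the Moondream handler is meant to support.
response = client.chat.completions.create(
    model="moondream2",  # placeholder model alias
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this image."},
                {"type": "image_url", "image_url": {"url": "https://example.com/photo.jpg"}},
            ],
        }
    ],
)
print(response.choices[0].message.content)
```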

Looking at the code, it appears that the correct `chat_format` is `moondream` and not `moondream2` (`moondream` worked as expected).
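
If I'm reading `llama_cpp/llama_chat_format.py` correctly, this matches the in-process Python API, where the Moondream handler is exposed as `MoondreamChatHandler`. Here is a minimal sketch assuming that class name and placeholder GGUF file names (text model plus mmproj), so treat the details below as illustrative rather than authoritative:

```python
from llama_cpp import Llama
from llama_cpp.llama_chat_format import MoondreamChatHandler

# The vision projector (mmproj) goes to the chat handler, the text model to Llama.
chat_handler = MoondreamChatHandler(clip_model_path="./moondream2-mmproj-f16.gguf")

llm = Llama(
    model_path="./moondream2-text-model-f16.gguf",
    chat_handler=chat_handler,
    n_ctx=2048,  # leave room for the image embeddings plus the prompt
)

response = llm.create_chat_completion(
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this image."},
                {"type": "image_url", "image_url": {"url": "https://example.com/photo.jpg"}},
            ],
        }
    ],
)
print(response["choices"][0]["message"]["content"])
```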

What do you think?

Please let me know if I should open a PR to fix the documented `chat_format`.

Thank you for this project! 😄

joaopalmeiro · Apr 09, 2025, 21:04