Empty response on ollama
Most of the time, responses are empty when using it continuously with Ollama (fp16).
I would not recommend using the Ollama version right now; it only supports a very old version of the model (from April). I need to reach out to them to figure out a path forward, because I heard llama.cpp might be dropping support for vision language models.
We have a Python client we're developing -- https://github.com/vikhyat/moondream/tree/main/clients/python -- it only supports CPU for now, but we're working on adding MPS and CUDA support as soon as possible.
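For anyone who wants to try the client in the meantime, here is a minimal sketch of intended usage. The `md.vl` loader, the `query` method, and the model filename are assumptions based on the client's README at the time and may change between versions:

```python
# Minimal sketch of the in-development Python client (CPU-only for now).
# The md.vl loader, query() method, and model filename below are assumptions
# based on the client's README and may differ in your version.
import moondream as md  # pip install moondream
from PIL import Image

# Load a local model file (path/filename here is hypothetical).
model = md.vl(model="./moondream-2b-int8.mf")

image = Image.open("./image.jpg")

# Ask a question about the image; the client returns a dict with the answer.
answer = model.query(image, "What is in this image?")["answer"]
print(answer)
```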