
LM Studio / Llamafile / ollama integration

Open · vikhyat opened this issue 1 year ago · 1 comment

Integrating the model with LM Studio and Llamafile would make it more accessible for users.

vikhyat avatar Jan 24 '24 23:01 vikhyat

Opened an issue at https://github.com/ollama/ollama/issues/2462 and am investigating.

Error output below:

```
Loading model: model
Traceback (most recent call last):
  File "/home/questsin/repo/ollama/llm/llama.cpp/convert-hf-to-gguf.py", line 1612, in <module>
    main()
  File "/home/questsin/repo/ollama/llm/llama.cpp/convert-hf-to-gguf.py", line 1593, in main
    model_instance = model_class(dir_model, ftype_map[args.outtype], fname_out, args.bigendian)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/questsin/repo/ollama/llm/llama.cpp/convert-hf-to-gguf.py", line 57, in __init__
    self.model_arch = self._get_model_architecture()
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/questsin/repo/ollama/llm/llama.cpp/convert-hf-to-gguf.py", line 262, in _get_model_architecture
    raise NotImplementedError(f'Architecture "{arch}" not supported!')
NotImplementedError: Architecture "Moondream" not supported!
```
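For context, the failure comes from the converter's architecture dispatch: it reads the `architectures` field from the model's `config.json` and bails out for names it has no mapping for. The sketch below is an illustrative simplification, not the actual `convert-hf-to-gguf.py` code; the `SUPPORTED_ARCHES` set and the `get_model_architecture` helper are hypothetical stand-ins for the script's internal class registry.

```python
import json
from pathlib import Path

# Illustrative subset only -- the real converter supports many more
# architectures via a registry of model classes.
SUPPORTED_ARCHES = {"LlamaForCausalLM", "PhiForCausalLM", "GPTNeoXForCausalLM"}


def get_model_architecture(dir_model: Path) -> str:
    """Read config.json and reject architectures the converter can't handle.

    Mirrors the check that raises the NotImplementedError seen in the
    traceback above when it encounters "Moondream".
    """
    config = json.loads((dir_model / "config.json").read_text())
    arch = config["architectures"][0]
    if arch not in SUPPORTED_ARCHES:
        raise NotImplementedError(f'Architecture "{arch}" not supported!')
    return arch
```

So supporting moondream in ollama/llama.cpp means teaching this dispatch step (and the tensor-mapping code behind it) about the "Moondream" architecture, rather than anything fixable from the model repository side alone.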

questsin avatar Feb 12 '24 13:02 questsin