memgraph-platform
Add feature to select model name for Ollama
Currently, Memgraph supports Ollama for AI chat; however, it hardcodes llama2:latest, and no other options are available. This enhancement would allow the Memgraph Lab user to select the model to use with the Ollama endpoint.
Hi @theobjectivedad, thank you for filing this feature request on GitHub 🙏 Keep track of the progress here
Posting here for folks looking for a workaround. The model can be renamed to llama2:latest on the Ollama side to get things working (on disk, the tag maps to the library/llama2/latest manifest path). The following example is verified to work in the ollama Docker image:
Get the desired model:
ollama pull llama3:70b-instruct-q2_K
From within the container, rename the pulled model's manifest to llama2/latest:
mv /root/.ollama/models/manifests/registry.ollama.ai/library/MODEL_NAME /root/.ollama/models/manifests/registry.ollama.ai/library/llama2
mv /root/.ollama/models/manifests/registry.ollama.ai/library/llama2/TAG_NAME /root/.ollama/models/manifests/registry.ollama.ai/library/llama2/latest
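The two mv commands above can be sketched end to end. This is a demonstration against a scratch copy of the manifest layout rather than the live /root/.ollama tree inside the container; the llama3 directory and 70b-instruct-q2_K file stand in for MODEL_NAME and TAG_NAME, matching the pull example above.

```shell
# Recreate the manifest layout in a throwaway directory so the rename
# can be shown safely (the real base is /root/.ollama/models/manifests
# /registry.ollama.ai/library inside the ollama container).
BASE=$(mktemp -d)/models/manifests/registry.ollama.ai/library
mkdir -p "$BASE/llama3"
touch "$BASE/llama3/70b-instruct-q2_K"   # stand-in for the real manifest file

# Step 1: rename the model directory (MODEL_NAME -> llama2).
mv "$BASE/llama3" "$BASE/llama2"

# Step 2: rename the tag file (TAG_NAME -> latest).
mv "$BASE/llama2/70b-instruct-q2_K" "$BASE/llama2/latest"

ls "$BASE/llama2"   # -> latest
```

After the same two renames inside the real container, Ollama serves the pulled model under the llama2:latest tag that Memgraph Lab expects.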
Hi @theobjectivedad -- happy news! This feature is coming in Memgraph Lab v2.16 in late August 😄
This was released in Lab v2.16.0.