
Add feature to select model name for Ollama

Open theobjectivedad opened this issue 1 year ago • 3 comments

Currently, Memgraph Lab supports Ollama for AI chat; however, it hardcodes llama2:latest and no other models can be used. This enhancement would let the Memgraph Lab user select which model to use with the Ollama endpoint.
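For context, Ollama's HTTP API takes the model name per request, so supporting other models is essentially a matter of making that value configurable. A minimal sketch of building such a request payload with a configurable model (the `OLLAMA_MODEL` variable name is an assumption for illustration, not a Memgraph Lab setting):

```shell
# Ollama's /api/generate endpoint accepts a "model" field in each request.
# Sending this payload requires a running Ollama instance, e.g.:
#   curl http://localhost:11434/api/generate -d "$PAYLOAD"

# Build the payload with a configurable model name, defaulting to the
# value Memgraph Lab currently hardcodes:
MODEL="${OLLAMA_MODEL:-llama2:latest}"
PAYLOAD=$(printf '{"model": "%s", "prompt": "Hello"}' "$MODEL")
echo "$PAYLOAD"
```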

theobjectivedad avatar Apr 30 '24 14:04 theobjectivedad

Hi @theobjectivedad, thank you for filing this feature request on GitHub 🙏 You can keep track of the progress here.

katarinasupe avatar Apr 30 '24 15:04 katarinasupe

Posting here for folks looking for a workaround. The model can be renamed on the Ollama side so that Ollama serves it as llama2:latest (stored on disk under a llama2/latest manifest path). The following steps are verified to work in the ollama Docker image:

Get the desired model:

ollama pull llama3:70b-instruct-q2_K

From within the container, rename the pulled model's manifest to llama2/latest (replace MODEL_NAME and TAG_NAME with the model and tag you pulled above):

mv /root/.ollama/models/manifests/registry.ollama.ai/library/MODEL_NAME /root/.ollama/models/manifests/registry.ollama.ai/library/llama2

mv /root/.ollama/models/manifests/registry.ollama.ai/library/llama2/TAG_NAME /root/.ollama/models/manifests/registry.ollama.ai/library/llama2/latest
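The two mv steps above can be exercised safely against a throwaway directory that mirrors Ollama's manifest layout (the paths are taken from the workaround; the llama3 model and tag names below are just the example from the pull step):

```shell
# Build a mock of Ollama's manifest store in a temp directory.
BASE=$(mktemp -d)/manifests/registry.ollama.ai/library

# Simulate a pulled model: llama3 with tag 70b-instruct-q2_K.
mkdir -p "$BASE/llama3"
touch "$BASE/llama3/70b-instruct-q2_K"

# Step 1: rename the model directory to llama2.
mv "$BASE/llama3" "$BASE/llama2"

# Step 2: rename the tag file to latest.
mv "$BASE/llama2/70b-instruct-q2_K" "$BASE/llama2/latest"

# The manifest now sits where Ollama expects llama2:latest.
ls "$BASE/llama2"
```

Note that a fresh `ollama pull llama2` would recreate the real llama2 manifest and collide with the renamed one, so this is strictly a stopgap until model selection lands in Lab.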

theobjectivedad avatar Apr 30 '24 15:04 theobjectivedad

Hi @theobjectivedad -- happy news! This feature is coming in Memgraph Lab v2.16 in late August 😄

katarinasupe avatar Aug 07 '24 07:08 katarinasupe

This was released in Lab v2.16.0.

tonilastre avatar Apr 01 '25 13:04 tonilastre