
add OllamaClient

Open gsw945 opened this issue 4 months ago • 8 comments

Summary

  • Add llm_client OllamaClient
  • Add embedder OllamaEmbedder
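
A minimal sketch of the proposed API, using the class and config names introduced in this PR; the model names and the local Ollama endpoint are illustrative defaults, not requirements:

```python
from graphiti_core.llm_client.config import LLMConfig
from graphiti_core.llm_client.ollama_client import OllamaClient
from graphiti_core.embedder.ollama import OllamaEmbedder, OllamaEmbedderConfig

# Ollama exposes an OpenAI-compatible API on port 11434; no real API key is needed.
llm_client = OllamaClient(
    config=LLMConfig(
        api_key="ollama",
        model="qwen3:4b",
        base_url="http://127.0.0.1:11434/v1",
    )
)
embedder = OllamaEmbedder(
    config=OllamaEmbedderConfig(
        api_key="ollama",
        embedding_model="bge-m3:567m",
        embedding_dim=1024,
        base_url="http://127.0.0.1:11434/v1",
    )
)
```

Both objects can then be passed to the `Graphiti` constructor in place of the OpenAI defaults.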

Type of Change

  • [ ] Bug fix
  • [x] New feature
  • [ ] Performance improvement
  • [ ] Documentation/Tests

Objective

Graphiti's examples could not be run successfully against a local Ollama server, even via the generic OpenAI-compatible client (see the discussion in #868 and #912). This PR adds a dedicated OllamaClient LLM client and an OllamaEmbedder so that Graphiti works with Ollama out of the box.

Testing

  • [x] Unit tests added/updated
  • [ ] Integration tests added/updated
  • [x] All existing tests pass

Breaking Changes

  • [ ] This PR contains breaking changes

If this is a breaking change, describe:

  • What functionality is affected
  • Migration path for existing users

Checklist

  • [x] Code follows project style guidelines (make lint passes)
  • [x] Self-review completed
  • [x] Documentation updated where necessary
  • [x] No secrets or sensitive information committed

Related Issues

Closes #868, #912

gsw945 avatar Sep 09 '25 12:09 gsw945

All contributors have signed the CLA ✍️ ✅
Posted by the CLA Assistant Lite bot.

danielchalef avatar Sep 09 '25 12:09 danielchalef

I have read the CLA Document and I hereby sign the CLA.

gsw945 avatar Sep 09 '25 12:09 gsw945

make lint: (screenshot)

python -m pytest -xvs tests/llm_client/test_ollama_client.py: (screenshot)

gsw945 avatar Sep 09 '25 12:09 gsw945

Here is a small example, taken from part of Build a ShoeBot Sales Agent using LangGraph and Graphiti.
The relevant data has been anonymised.

graphiti-agent-debug.py:

import asyncio
import logging
import os
import sys
from datetime import datetime, timezone

from dotenv import load_dotenv

from graphiti_core import Graphiti
from graphiti_core.llm_client.config import LLMConfig
# from graphiti_core.llm_client.openai_client import OpenAIClient
# from graphiti_core.llm_client.openai_generic_client import OpenAIGenericClient
from graphiti_core.llm_client.ollama_client import OllamaClient
# from graphiti_core.embedder.openai import OpenAIEmbedder, OpenAIEmbedderConfig
from graphiti_core.embedder.ollama import OllamaEmbedder, OllamaEmbedderConfig
from graphiti_core.cross_encoder.openai_reranker_client import OpenAIRerankerClient
from graphiti_core.nodes import EpisodeType
from graphiti_core.utils.maintenance.graph_data_operations import clear_data
from graphiti_core.search.search_config_recipes import NODE_HYBRID_SEARCH_EPISODE_MENTIONS


def setup_logging():
    logger = logging.getLogger()
    logger.setLevel(logging.ERROR)
    console_handler = logging.StreamHandler(sys.stdout)
    console_handler.setLevel(logging.INFO)
    formatter = logging.Formatter('%(name)s - %(levelname)s - %(message)s')
    console_handler.setFormatter(formatter)
    logger.addHandler(console_handler)
    return logger


async def main():
    load_dotenv()
    logger = setup_logging()

    # Configure Ollama LLM client
    llm_config = LLMConfig(
        api_key="ollama",  # Ollama doesn't require a real API key
        model="qwen3:4b",
        small_model="qwen3:4b",
        base_url="http://127.0.0.1:11434/v1",  # Ollama provides this port
        max_tokens=8192,
    )
    llm_client = OllamaClient(config=llm_config)
    embedder = OllamaEmbedder(
        config=OllamaEmbedderConfig(
            api_key="ollama",
            embedding_model="bge-m3:567m",
            embedding_dim=1024,
            base_url="http://127.0.0.1:11434/v1",
        )
    )
    cross_encoder = OpenAIRerankerClient(client=llm_client, config=llm_config)

    neo4j_uri = os.environ.get('NEO4J_URI', 'bolt://10.98.8.113:7687')
    neo4j_user = os.environ.get('NEO4J_USER', 'neo4j')
    neo4j_password = os.environ.get('NEO4J_PASSWORD', 'xxxxxx')

    client = Graphiti(
        neo4j_uri,
        neo4j_user,
        neo4j_password,
        llm_client=llm_client,
        embedder=embedder,
        cross_encoder=cross_encoder
    )

    # Note: This will clear the database
    await clear_data(client.driver)
    await client.build_indices_and_constraints()

    user_name = 'jess'
    await client.add_episode(
        name='User Creation',
        episode_body=(f'{user_name} is interested in buying a pair of shoes'),
        source=EpisodeType.text,
        reference_time=datetime.now(timezone.utc),
        source_description='SalesBot',
    )

    # let's get Jess's node uuid
    nl = await client._search(user_name, NODE_HYBRID_SEARCH_EPISODE_MENTIONS)
    print(nl)

    # and the ManyBirds node uuid
    nl = await client._search('ManyBirds', NODE_HYBRID_SEARCH_EPISODE_MENTIONS)
    print(nl)


if __name__ == "__main__":
    asyncio.run(main())
(screenshot)

gsw945 avatar Sep 09 '25 12:09 gsw945

recheck

gsw945 avatar Sep 13 '25 14:09 gsw945

Can't wait to see this merged!

ing-norante avatar Sep 15 '25 12:09 ing-norante

Hi there - this client is duplicative of the Graphiti GenericOpenAIClient. Can you share why this is necessary?

danielchalef avatar Nov 14 '25 16:11 danielchalef

> Hi there - this client is duplicative of the Graphiti GenericOpenAIClient. Can you share why this is necessary?

I hope you have had an opportunity to review issues #868 and #912, particularly the discussions within #868.

When creating this pull request, I attempted to utilise Ollama as the LLM. Even when employing the GenericOpenAIClient, the examples you provided failed to execute successfully. The specific reasons for this are outlined in the aforementioned issues.

However, I've noticed subsequent updates to GenericOpenAIClient and am uncertain whether Ollama now functions correctly within Graphiti. I shall verify this in due course. Should GenericOpenAIClient enable Ollama to operate properly within Graphiti, I shall close this pull request, as it has fallen considerably behind the main branch.
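
For comparison, this is roughly how the existing generic client would be pointed at Ollama's OpenAI-compatible endpoint (a sketch only, using the `OpenAIGenericClient` import shown commented out in the script above; whether this now works correctly with Ollama is exactly the open question):

```python
from graphiti_core.llm_client.config import LLMConfig
from graphiti_core.llm_client.openai_generic_client import OpenAIGenericClient

# Ollama serves an OpenAI-compatible API; the key is a placeholder.
config = LLMConfig(
    api_key="ollama",
    model="qwen3:4b",
    base_url="http://127.0.0.1:11434/v1",
)
client = OpenAIGenericClient(config=config)
```

If this configuration behaves correctly against current main, the dedicated OllamaClient in this PR would indeed be redundant.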

gsw945 avatar Nov 17 '25 03:11 gsw945