Bug: MCP server add_nodes tool call does not work; everything else works fine
This is the repo for testing: https://github.com/abab-dev/test-graphiti-mcp-server (instructions are in the repo).

To run the service:

```
git clone https://github.com/getzep/graphiti.git
cd graphiti/mcp_server
OPENAI_BASE_URL=http://localhost:5000 docker compose up
```

Then:

```
git clone https://github.com/abab-dev/test-graphiti-mcp-server.git
cd test-graphiti-mcp-server
uv run server.py
uv run mcp_sse_test.py
```
Insertion works when done directly with the Gemini embedder and client, but calls through the MCP server do not. What might be the problem? The error always pops up on the add_nodes tool call:
```
INFO:     Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
INFO:     127.0.0.1:42964 - "GET /sse HTTP/1.1" 200 OK
INFO:     127.0.0.1:42978 - "POST /messages/?session_id=9461b1a0d35d46a99702cb86ea26d585 HTTP/1.1" 202 Accepted
INFO:     127.0.0.1:42978 - "POST /messages/?session_id=9461b1a0d35d46a99702cb86ea26d585 HTTP/1.1" 202 Accepted
INFO:     127.0.0.1:42978 - "POST /messages/?session_id=9461b1a0d35d46a99702cb86ea26d585 HTTP/1.1" 202 Accepted
2025-05-21 11:10:08,061 - mcp.server.lowlevel.server - INFO - Processing request of type CallToolRequest
2025-05-21 11:10:08,062 - __main__ - INFO - Starting episode queue worker for group_id: test_graph_group
2025-05-21 11:10:08,062 - __main__ - INFO - Processing queued episode 'Customer Conversation' for group_id: test_graph_group
2025-05-21 11:10:10,045 - __main__ - ERROR - Error processing episode 'Customer Conversation' for group_id test_graph_group: node 630f7a7e-3c0b-49c3-9690-fc04e65aced1 not found
```
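For reference, the failing path can be reproduced with a minimal SSE client along these lines. This is only a sketch: the tool name `add_memory` and the argument names are assumptions based on the logs in this thread and may not match the server's exact schema.

```python
# Sketch of the failing call path: build the tool arguments and invoke the
# MCP server's episode-adding tool over SSE. Tool and field names here are
# assumptions based on the logs in this thread.
import asyncio


def build_add_memory_args(
    name: str,
    episode_body: str,
    source: str = "text",
    source_description: str = "",
    group_id: str = "test_graph_group",
) -> dict:
    """Arguments for the add_memory tool call (field names are assumptions)."""
    return {
        "name": name,
        "episode_body": episode_body,
        "source": source,
        "source_description": source_description,
        "group_id": group_id,
    }


async def call_add_memory(url: str = "http://localhost:8000/sse"):
    # Requires a running MCP server; fastmcp infers the SSE transport from the URL.
    from fastmcp import Client

    async with Client(url) as client:
        return await client.call_tool(
            "add_memory",
            build_add_memory_args(
                "CustomerConversation",
                "user: What's your return policy?\nassistant: 30 days.",
            ),
        )
```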
Can you try removing the space in the episode name?
I have cleaned up the test function in my repo and removed the spaces in the names. It still gives the same error.

However, when I add directly to the DB without the MCP server, using GeminiEmbedder and the Gemini LLM client directly, the episode gets added, and searching and clearing the graph work as well. I am using the latest MCP server (after the fixes you pushed), and I am now running it locally without Docker. I have an OpenAI-compatible Flask shim in server.py.
```
2025-05-22 12:21:33,233 - __main__ - INFO - Processing queued episode 'CustomerProfile' for group_id: test_graph_group
2025-05-22 12:21:33,266 - __main__ - ERROR - Error processing episode 'CustomerProfile' for group_id test_graph_group: node ca65bf70-7fad-4211-ad69-faa659a51840 not found
INFO:     127.0.0.1:57798 - "GET /sse HTTP/1.1" 200 OK
INFO:     127.0.0.1:57802 - "POST /messages/?session_id=a323ef84819741179629af88516edc24 HTTP/1.1" 202 Accepted
INFO:     127.0.0.1:57802 - "POST /messages/?session_id=a323ef84819741179629af88516edc24 HTTP/1.1" 202 Accepted
INFO:     127.0.0.1:57802 - "POST /messages/?session_id=a323ef84819741179629af88516edc24 HTTP/1.1" 202 Accepted
2025-05-22 12:21:33,302 - mcp.server.lowlevel.server - INFO - Processing request of type CallToolRequest
2025-05-22 12:21:33,303 - __main__ - INFO - Processing queued episode 'CustomerConversation' for group_id: test_graph_group
2025-05-22 12:21:33,331 - __main__ - ERROR - Error processing episode 'CustomerConversation' for group_id test_graph_group: node be2b309f-c617-4afe-8c29-6564415dec31 not found
```
It won't work with WSL.
@abab-dev Are you using a Gemini model here?
@danielchalef Yes, I am using Gemini.
@danielchalef I tested again like this:
```python
import asyncio
import json
import os
from datetime import datetime, timezone

from graphiti_core import Graphiti
from graphiti_core.embedder.openai import OpenAIEmbedder, OpenAIEmbedderConfig
from graphiti_core.llm_client.gemini_client import LLMConfig
from graphiti_core.llm_client.openai_client import OpenAIClient
from graphiti_core.nodes import EpisodeType


async def add_direct():
    """Initialize Graphiti against Gemini's OpenAI-compatible endpoint and add an episode."""
    api_key = os.environ.get("GOOGLE_API_KEY")
    if not api_key or not api_key.strip():
        raise ValueError("GOOGLE_API_KEY environment variable must be set and not empty")

    try:
        # Initialize Graphiti with OpenAI-compatible clients pointed at Gemini
        graphiti = Graphiti(
            "bolt://localhost:7687",
            "neo4j",
            "demodemo",
            llm_client=OpenAIClient(
                config=LLMConfig(
                    base_url="https://generativelanguage.googleapis.com/v1beta/openai/",
                    api_key=api_key,
                    model="gemini-2.0-flash",
                    small_model="gemini-2.0-flash-lite",
                )
            ),
            embedder=OpenAIEmbedder(
                config=OpenAIEmbedderConfig(
                    base_url="https://generativelanguage.googleapis.com/v1beta/openai/",
                    api_key=api_key,
                    embedding_model="embedding-001",
                ),
            ),
        )
        print("Graphiti initialized successfully")

        # Initialize the graph database with graphiti's indices. This only needs to be done once.
        print("Building indices and constraints...")
        await graphiti.build_indices_and_constraints()
        print("Indices and constraints built successfully")
    except Exception:
        print("An error occurred during Graphiti operations:")
        raise  # Re-raise the exception after logging

    episodes = [
        {
            "name": "GoogleProfile",
            "episode_body": {"company": {"name": "Google is a consumer internet company"}},
            "source": EpisodeType.json,
            "source_description": "CRM data",
        },
        {
            "name": "CustomerConversation",
            "episode_body": "user: What's your return policy?\nassistant: You can return items within 30 days.",
            "source": EpisodeType.text,
            "source_description": "chat transcript",
            "group_id": "some_arbitrary_string",
        },
    ]

    # Add episodes to the graph
    for episode in episodes[0:1]:
        await graphiti.add_episode(
            name=episode["name"],
            episode_body=(
                episode["episode_body"]
                if isinstance(episode["episode_body"], str)
                else json.dumps(episode["episode_body"])
            ),
            source=episode["source"],
            source_description=episode["source_description"],
            reference_time=datetime.now(timezone.utc),
        )
        print(f"Added episode: ({episode['name']})")


async def main():
    await add_direct()


if __name__ == "__main__":
    asyncio.run(main())
```
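As an aside, embedding JSON as a hand-written string is error-prone (escaped quotes or a trailing comma silently make it invalid); serializing a Python dict with `json.dumps` is safer:

```python
import json

# Build the JSON episode body from a dict instead of hand-escaping a string.
profile = {"company": {"name": "Google is a consumer internet company"}}
episode_body = json.dumps(profile)

# Round-trips cleanly, unlike a string with stray escapes or trailing commas.
assert json.loads(episode_body) == profile
```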
This works perfectly fine, with one problem: if I replace `"source": EpisodeType.json` with the plain string `"source": "json"` (or `"source": "text"`), insertion fails with this error:
```
  File "/.venv/lib/python3.12/site-packages/graphiti_core/graphiti.py", line 333, in add_episode
    await self.retrieve_episodes(
  File "/.venv/lib/python3.12/site-packages/graphiti_core/graphiti.py", line 262, in retrieve_episodes
    return await retrieve_episodes(self.driver, reference_time, last_n, group_ids, source)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/.venv/lib/python3.12/site-packages/graphiti_core/utils/maintenance/graph_data_operations.py", line 164, in retrieve_episodes
    source=source.name if source is not None else None,
           ^^^^^^^^^^^
AttributeError: 'str' object has no attribute 'name'
```
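The traceback shows `retrieve_episodes` calling `source.name`, which only exists on the `EpisodeType` enum, so passing the plain string `"json"` blows up. A minimal sketch of the coercion that would make both spellings work (the enum below is a stand-in with assumed member names, not graphiti's actual class):

```python
from enum import Enum


class EpisodeType(Enum):
    """Stand-in for graphiti_core.nodes.EpisodeType (member names assumed)."""
    text = "text"
    json = "json"
    message = "message"


def coerce_source(source):
    """Accept either an EpisodeType member or its string name."""
    if isinstance(source, str):
        return EpisodeType[source]  # raises KeyError on unknown names
    return source
```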
Now, in the MCP server, I do it like this:

https://github.com/getzep/graphiti/blob/d174abb2ba3ab92bb6ac20fd97e8925284f74d6e/mcp_server/graphiti_mcp_server.py#L40-L43
I replace those lines with:

```python
DEFAULT_LLM_MODEL = 'gemini-2.0-flash'
SMALL_LLM_MODEL = 'gemini-2.0-flash-lite'
DEFAULT_EMBEDDER_MODEL = 'embedding-001'
```
and in the .env file I set:

```
NEO4J_URI=bolt://localhost:7687
NEO4J_USER=neo4j
NEO4J_PASSWORD=demodemo
OPENAI_API_KEY=<gemini-api-key>
MODEL_NAME=gemini-2.0-flash
OPENAI_BASE_URL=https://generativelanguage.googleapis.com/v1beta/openai/
DEFAULT_EMBEDDER_MODEL=embedding-001
```
add_memory fails with this error:
```
INFO:     127.0.0.1:33164 - "GET /sse HTTP/1.1" 200 OK
INFO:     127.0.0.1:33174 - "POST /messages/?session_id=dc854340e00e4c65ae5926aba5d0068d HTTP/1.1" 202 Accepted
INFO:     127.0.0.1:33174 - "POST /messages/?session_id=dc854340e00e4c65ae5926aba5d0068d HTTP/1.1" 202 Accepted
INFO:     127.0.0.1:33174 - "POST /messages/?session_id=dc854340e00e4c65ae5926aba5d0068d HTTP/1.1" 202 Accepted
2025-05-23 22:13:02,179 - mcp.server.lowlevel.server - INFO - Processing request of type CallToolRequest
2025-05-23 22:13:02,180 - __main__ - INFO - Processing queued episode 'CustomerProfile1' for group_id: test_graph_group
2025-05-23 22:13:02,211 - __main__ - ERROR - Error processing episode 'CustomerProfile1' for group_id test_graph_group: node 92cddb5a-431f-4469-a0d8-da953d326c40 not found
```
But searching works:
```
INFO:     127.0.0.1:39418 - "GET /sse HTTP/1.1" 200 OK
INFO:     127.0.0.1:39420 - "POST /messages/?session_id=88d341c22d4745f39087aa1713fad968 HTTP/1.1" 202 Accepted
INFO:     127.0.0.1:39420 - "POST /messages/?session_id=88d341c22d4745f39087aa1713fad968 HTTP/1.1" 202 Accepted
INFO:     127.0.0.1:39420 - "POST /messages/?session_id=88d341c22d4745f39087aa1713fad968 HTTP/1.1" 202 Accepted
2025-05-23 22:14:45,639 - mcp.server.lowlevel.server - INFO - Processing request of type CallToolRequest
2025-05-23 22:14:47,198 - httpx - INFO - HTTP Request: POST https://generativelanguage.googleapis.com/v1beta/openai/embeddings "HTTP/1.1 200 OK"
```
@abab-dev Is this still an issue? Please confirm within 14 days or this issue will be closed.