Daniel Chalef
Duplicate of https://github.com/getzep/graphiti/issues/333
Regarding support for OpenRouter and Voyage: As you've discovered, the MCP server does not yet support `OpenAIGenericClient` and the Voyage embedder. What model are you using with OpenRouter? The inference...
I believe I addressed this yesterday in #512. Would you please pull the latest from the `main` branch and let us know how it goes?
This is described in #558, kindly contributed by @PrettyWood.
You're correct. Our recommendation is to use two different clients. I'll update the docs. Thank you.
@luisgithub269 You should be able to use Graphiti with any OpenAI-compatible inference endpoint. Just provide the correct base_url, key, and model name in a [`LLMConfig`](https://github.com/getzep/graphiti/blob/main/graphiti_core/llm_client/config.py) that you pass to the...
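A minimal sketch of what this looks like. The field names below (`api_key`, `model`, `base_url`) follow the linked `config.py`, but the class here is a local stand-in, not an import from `graphiti_core` — verify the exact signature against your installed version:

```python
from dataclasses import dataclass


@dataclass
class LLMConfig:
    """Stand-in mirroring graphiti_core.llm_client.config.LLMConfig."""
    api_key: str
    model: str
    base_url: str


# Point Graphiti at any OpenAI-compatible inference endpoint.
# The base_url and model name below are placeholders, not recommendations.
config = LLMConfig(
    api_key="sk-...",                          # your provider's API key
    model="your-model-name",                   # model served by the endpoint
    base_url="https://openrouter.ai/api/v1",   # any OpenAI-compatible URL
)
```

You would then pass this config to the LLM client when constructing Graphiti, per the linked source.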
All contributors have signed the CLA ✍️ ✅ (Posted by the **CLA Assistant Lite bot**.)
Have you tried reducing `SEMAPHORE_LIMIT` in your environment? It defaults to 20. Additionally, we've recently made improvements to Graphiti to reduce the number and size of LLM calls made....
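For example, to lower the concurrency cap before starting the server (the value 5 here is just an illustration; tune it to your provider's rate limits):

```shell
# Reduce the number of concurrent LLM calls Graphiti makes (default: 20)
export SEMAPHORE_LIMIT=5
```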
We plan to add custom edge types. cc @prasmussen15
@jackaldenryan Adding "document Edge Types" to your TODO list.