Running the example failed
2025-04-07 16:25:08 - graphiti_core.llm_client.openai_client - ERROR - Error in generating LLM response: Error code: 400 - {'object': 'error', 'message': "[{'type': 'string_type', 'loc': ('body', 'model'), 'msg': 'Input should be a valid string', 'input': ['QwQ-32B']}, {'type': 'float_type', 'loc': ('body', 'temperature'), 'msg': 'Input should be a valid number', 'input': [0.2]}]", 'type': 'BadRequestError', 'param': None, 'code': 400}
It looks like you're using a Qwen model with Graphiti, and the model is not generating JSON output in the format Graphiti requires. ~~Would you please try a frontier model from OpenAI or Google and let us know if the issue persists?~~
Update: I just noticed you're using the OpenAIClient. Please use the OpenAIGenericClient when not using the OpenAI API: it adds the expected schema to the prompt, which isn't needed when using OpenAI Structured Outputs.
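For anyone landing here, a minimal sketch of wiring up `OpenAIGenericClient` against an OpenAI-compatible endpoint (the `base_url`, API key, and Neo4j credentials below are placeholders, and the import goes through the class's own module because of the export problem reported below). Also note that the 400 above shows list values (`['QwQ-32B']`, `[0.2]`) where the API expects a plain string and float, so make sure scalars are passed:

```python
from graphiti_core import Graphiti
from graphiti_core.llm_client.config import LLMConfig
from graphiti_core.llm_client.openai_generic_client import OpenAIGenericClient

# Plain scalars here -- the 400 error above shows lists being sent for
# 'model' and 'temperature', which the server rejects.
llm_config = LLMConfig(
    api_key="not-needed-for-local",       # placeholder
    base_url="http://localhost:8000/v1",  # your OpenAI-compatible endpoint
    model="QwQ-32B",
    temperature=0.2,
)

# OpenAIGenericClient injects the expected JSON schema into the prompt,
# since non-OpenAI backends don't support OpenAI Structured Outputs.
llm_client = OpenAIGenericClient(config=llm_config)

graphiti = Graphiti(
    "bolt://localhost:7687",  # Neo4j connection placeholders
    "neo4j",
    "password",
    llm_client=llm_client,
)
```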
OpenAIGenericClient is not exported
https://github.com/getzep/graphiti/blob/main/graphiti_core/llm_client/__init__.py#L6
I'm getting this error:
ImportError: cannot import name 'OpenAIGenericClient' from 'graphiti_core.llm_client' (/home/coder/graphiti/graphiti_core/llm_client/__init__.py)
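Until `OpenAIGenericClient` is re-exported from the package, a workaround is to import it from its defining module directly (path per the repo layout, assuming it hasn't moved):

```python
# Import straight from the module instead of the package,
# bypassing the missing re-export in llm_client/__init__.py.
from graphiti_core.llm_client.openai_generic_client import OpenAIGenericClient
```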