litaolaile
**Routine checks**

[//]: # (Delete the space inside the brackets and fill in an x)

+ [ ] I have confirmed there is no similar existing issue
+ [ ] I have confirmed I have upgraded to the latest version
+ [ ] I have fully read the project README, especially the FAQ section
+ [ ] I understand and am willing to follow up on this issue, assisting with testing and providing feedback
+ [ ] I understand and accept the above, and I understand the maintainers' time is limited; issues that do not follow the rules...

Is a private code repository independent of GitHub supported?
2025-04-07 16:25:08 - graphiti_core.llm_client.openai_client - ERROR - Error in generating LLM response: Error code: 400 - {'object': 'error', 'message': "[{'type': 'string_type', 'loc': ('body', 'model'), 'msg': 'Input should be a valid...
### Description

Running the client fails. My server has no problem; I can confirm this in the explorer.

client.py:

```python
import asyncio
from fastmcp import Client
from fastmcp.client.transports import SSETransport

sse_url = "http://localhost:8000/sse"  # Option...
```
### Description

I used the demo in the SDK, but when executing client.py an error occurs. I have confirmed that the server runs normally, and I checked it in Chrome like this  but when...
### 🐛 Describe the bug

I want to know whether memai supports using the chat model and the embedding model at different URLs. How can I configure this? Please give a clear example.
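A minimal sketch of what such a split-URL setup could look like, assuming "memai" refers to mem0 (the `mem0ai` package). The field names (`llm`, `embedder`, `openai_base_url`) and the model names below are assumptions based on mem0's config conventions; verify them against the mem0 documentation before use.

```python
# Hypothetical mem0-style config: chat (LLM) model and embedding model
# pointed at two different OpenAI-compatible base URLs.
# All hostnames and model names here are placeholders.
config = {
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o-mini",  # assumed chat model name
            "openai_base_url": "http://llm-host:8001/v1",  # chat endpoint
        },
    },
    "embedder": {
        "provider": "openai",
        "config": {
            "model": "text-embedding-3-small",  # assumed embedding model name
            "openai_base_url": "http://embed-host:8002/v1",  # embedding endpoint
        },
    },
}

# The config would then be consumed along the lines of:
#   from mem0 import Memory
#   m = Memory.from_config(config)
print(config["llm"]["config"]["openai_base_url"])
print(config["embedder"]["config"]["openai_base_url"])
```

The key point is that the LLM and the embedder each carry their own `config` section, so their base URLs can differ independently.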