How do I configure a locally deployed large model for use?
Feature description
Your Feature
Refer to the [ollama-api] section of the documentation: https://docs.deepwisdom.ai/main/en/guide/get_started/configuration/llm_api_configuration.html#ollama-api
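For reference, a minimal `config2.yaml` for a local Ollama deployment looks roughly like the sketch below (the model name `llama2` and the default port `11434` are assumptions; adjust them to your local setup):

```yaml
llm:
  api_type: "ollama"                    # tells MetaGPT to use the Ollama backend
  base_url: "http://127.0.0.1:11434/api"  # assumed default Ollama endpoint
  model: "llama2"                       # placeholder; use whichever model you pulled
```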
Finished call to 'metagpt.actions.action_node.ActionNode._aask_v1' after 123.000(s), this was the 6th time calling it. exp: RetryError[<Future at 0x1cfc23fd350 state=finished raised JSONDecodeError>] — calling a local model raises this error.
@liuxymm Do you still have the problem? What is the content of your config2.yaml?
@liuxymm The JSONDecodeError is usually due to a small LLM's limited ability to follow JSON output instructions (it can't produce complete JSON). You should probably try a larger model.
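To illustrate the failure mode: when the model truncates its JSON output (e.g. stops before the closing brace), the parser raises the `JSONDecodeError` that surfaces inside the `RetryError` above. A minimal sketch (the sample payload is hypothetical):

```python
import json

# A truncated response, as a small model might emit: the closing brace is missing.
truncated = '{"Language": "en_us", "Programming Language": "Python"'

try:
    json.loads(truncated)
except json.JSONDecodeError as exc:
    # Parsing fails because the JSON document is incomplete.
    print(f"JSONDecodeError: {exc.msg}")
```

MetaGPT retries the call several times (hence "the 6th time calling it"), but if the model can never emit well-formed JSON, every retry fails the same way.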