Theodore Teach
> Thank you. I thought it was directly searching for a match, but in the end I found that it was not.
Writing it as `openai/deepseek-v3` raises an error: File "/mnt/data/rz/project/AutoAgent/autoagent/core.py", line 108, in get_chat_completion assert litellm.supports_function_calling(model = create_model) == True, f"Model {create_model} does not support function calling, please set `FN_CALL=False` to use non-function calling mode" File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/litellm/utils.py",...
Writing it as `deepseek-v3` raises an error: File "/mnt/data/rz/project/AutoAgent/autoagent/cli.py", line 206, in main user_mode(model, context_variables, False) File "/mnt/data/rz/project/AutoAgent/autoagent/cli.py", line 269, in user_mode response = client.run(agent, messages, context_variables, debug=debug) File "/mnt/data/rz/project/AutoAgent/autoagent/core.py", line 384, in run completion =...
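The assertion in the first traceback comes from litellm's model registry: AutoAgent refuses to run in function-calling mode unless `litellm.supports_function_calling` returns `True` for the model string. A minimal sketch of that guard (assuming litellm is installed; the model name is taken from the report above and may need a provider prefix that litellm recognizes):

```python
# Hedged sketch of the check performed in autoagent/core.py before
# enabling FN_CALL. Unknown model strings can make litellm raise, so
# the call is wrapped; None here means "could not determine support".
def supports_fn_call(model: str):
    try:
        import litellm
        return litellm.supports_function_calling(model=model)
    except Exception:
        return None  # litellm missing, or model not in litellm's registry

# "deepseek-v3" is the spelling from the report; litellm's registry may
# instead expect a provider-prefixed name such as "deepseek/deepseek-chat".
print(supports_fn_call("deepseek-v3"))
```

If this prints `False` or `None` for your model string, setting `FN_CALL=False` (non-function-calling mode) is the workaround the assertion message suggests.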

> Hello, this may be related to your LLM's rate limit. You can set `DEBUG=True MC_MODE=False` to see the detailed message log for us to debug. Looking forward to your further feedback.

In my testing this turns out to be related to `FN_CALL`: if it is set to `False`, the base_url is correct, but if it is set to `True`, the base_url becomes OpenAI's. May I ask which models available in mainland China can reproduce the conversation in the README? I currently cannot do so with Alibaba's DashScope (灵积) platform.
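The maintainer's debugging suggestion can be applied as environment variables before launching AutoAgent (a sketch only; the spelling is corrected from "Fasle" in the original reply, and combining these with `FN_CALL=False` is this report's workaround, not an official recommendation):

```shell
# Verbose logging suggested by the maintainer for debugging:
export DEBUG=True
export MC_MODE=False
# Workaround from this thread for models without function-calling support:
export FN_CALL=False
# ...then launch AutoAgent as usual from the project directory.
```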
> Is your current issue that the program runs normally but the results are unsatisfactory? Agent performance is indeed tied to the base model's capability; you could try large-parameter models such as deepseek v3, deepseek r1, and qwen max. Claude-3.5 and gpt-4o are of course the first choice.

Tell me what do you want to create with `Agent Chain`? (type "exit" to quit, press "Enter" to continue):...