src.agent.custom_views.CustomAgentOutput() argument after ** must be a mapping, not list
I got an error while using a locally served DeepSeek R1 AWQ model: `src.agent.custom_views.CustomAgentOutput() argument after ** must be a mapping, not list`. I checked my model server's logs and the returned payload is not a list. Why is this error happening?
```
INFO     [agent] 🚀 Starting task: 打开百度搜索今天的金价 (open Baidu and search for today's gold price)
INFO     [src.agent.custom_agent] 📍 Step 1
ERROR    [agent] ❌ Result failed 1/3 times: src.agent.custom_views.CustomAgentOutput() argument after ** must be a mapping, not list
INFO     [src.agent.custom_agent] 📍 Step 1
ERROR    [agent] ❌ Result failed 2/3 times: src.agent.custom_views.CustomAgentOutput() argument after ** must be a mapping, not list
INFO     [src.agent.custom_agent] 📍 Step 1
ERROR    [agent] ❌ Result failed 3/3 times: src.agent.custom_views.CustomAgentOutput() argument after ** must be a mapping, not list
ERROR    [src.agent.custom_agent] ❌ Stopping due to 3 consecutive failures
INFO     [agent] Created GIF at agent_history.gif
```
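For context, the TypeError comes from Python's `**` unpacking: the agent parses the LLM reply as JSON and unpacks the result into `CustomAgentOutput(**parsed)`, so if the parsed reply is a JSON array instead of an object the call fails with exactly this message, even when the raw model output looks fine in the server logs. A minimal sketch of the failure mode (the pydantic model below is a simplified stand-in, not the project's real class):

```python
import json

from pydantic import BaseModel


# Simplified stand-in for src.agent.custom_views.CustomAgentOutput
class CustomAgentOutput(BaseModel):
    current_state: dict = {}
    action: list = []


def build_output(raw_reply: str) -> CustomAgentOutput:
    parsed = json.loads(raw_reply)
    # The parsed JSON is unpacked with **, so it must be a dict/object.
    return CustomAgentOutput(**parsed)


# A reply that is a JSON object works.
build_output('{"current_state": {}, "action": []}')

# A reply that is a JSON array (e.g. the answer wrapped in a list, or the
# JSON extraction grabbing the wrong fragment of an R1-style reply that is
# surrounded by reasoning text) reproduces the reported error.
try:
    build_output('[{"current_state": {}, "action": []}]')
except TypeError as exc:
    print(exc)  # ... argument after ** must be a mapping, not list
```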
Which model are you using?
Use ollama; you have incorrectly chosen "openai" as the LLM provider.
When I choose ollama, the request gets redirected. I set the address to http://192.168.1.2:8000/v1/chat/completions, but the request actually goes to http://192.168.1.2:8000/v1/chat/completions/api/chat on the vLLM server. Why does this happen?
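If it helps: the extra `/api/chat` is not a redirect from vLLM; it is appended by the Ollama client itself, which always posts chat requests to `{host}/api/chat` and keeps whatever path is already in the configured host. A rough sketch of the resulting URL, assuming the ollama provider goes through the official `ollama` Python client:

```python
from ollama import Client

# The client keeps whatever path you configure and appends its own endpoint,
# so a chat call would be POSTed to:
#   http://192.168.1.2:8000/v1/chat/completions/api/chat
# which is exactly the path seen above; vLLM has no route there.
client = Client(host="http://192.168.1.2:8000/v1/chat/completions")
# client.chat(...) would therefore fail with a 404 against a vLLM server.
```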
Is it because the framework currently supports only ollama and can't be used with vLLM?
I have the same problem. Has it been resolved?
Shouldn't the Base URL be just http://192.168.2.2:8000/? And since Ollama is selected, /v1/ isn't required; it's only needed when you choose openai, right?
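Roughly, yes; the two providers compose their URLs differently, so the Base URL has to match the provider. A sketch of the two working configurations, assuming the providers are wired through LangChain's `ChatOllama` / `ChatOpenAI` (the model names below are placeholders):

```python
from langchain_ollama import ChatOllama
from langchain_openai import ChatOpenAI

# ollama provider: only if an Ollama server is actually running (default port
# 11434). Point the base URL at the host only; the client adds /api/chat itself.
ollama_llm = ChatOllama(
    base_url="http://192.168.1.2:11434",
    model="deepseek-r1:32b",  # placeholder tag
)

# openai provider against vLLM: stop the base URL at /v1; the OpenAI client
# appends /chat/completions by itself.
vllm_llm = ChatOpenAI(
    base_url="http://192.168.1.2:8000/v1",
    api_key="EMPTY",  # vLLM accepts any key unless one was configured
    model="deepseek-r1-awq",  # placeholder: must match the name served by vLLM
)
```

So for a vLLM server, the "openai" provider with the base URL cut off at `/v1` should be the right combination.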
Try the new code.