
src.agent.custom_views.CustomAgentOutput() argument after ** must be a mapping, not list

Open · UsFaker opened this issue 9 months ago · 8 comments

I got an error using a locally served DeepSeek-R1-AWQ model: `src.agent.custom_views.CustomAgentOutput() argument after ** must be a mapping, not list`. I checked my model logs, and the returned value is not a list. Why is this error happening?

```
INFO  [agent] 🚀 Starting task: 打开百度搜索今天的金价 (Open Baidu and search for today's gold price)
INFO  [src.agent.custom_agent] 📍 Step 1
ERROR [agent] ❌ Result failed 1/3 times: src.agent.custom_views.CustomAgentOutput() argument after ** must be a mapping, not list
INFO  [src.agent.custom_agent] 📍 Step 1
ERROR [agent] ❌ Result failed 2/3 times: src.agent.custom_views.CustomAgentOutput() argument after ** must be a mapping, not list
INFO  [src.agent.custom_agent] 📍 Step 1
ERROR [agent] ❌ Result failed 3/3 times: src.agent.custom_views.CustomAgentOutput() argument after ** must be a mapping, not list
ERROR [src.agent.custom_agent] ❌ Stopping due to 3 consecutive failures
INFO  [agent] Created GIF at agent_history.gif
```

Attachment: dplog.txt

UsFaker · Mar 24 '25 10:03
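For context: the TypeError above means the JSON extracted from the model's reply parsed to a Python list rather than an object, so unpacking it with `**` into `CustomAgentOutput` fails. Some R1-style models wrap their JSON reply in an array or extra text even when the raw log looks fine. A minimal sketch of the failure and a hypothetical unwrap guard; `build_output` is a stand-in for `CustomAgentOutput`, not the project's actual code:

```python
import json

def build_output(**kwargs):
    """Stand-in for CustomAgentOutput(**parsed); only accepts keyword args."""
    return kwargs

raw = '[{"current_state": {}, "action": []}]'  # reply wrapped in a JSON array
parsed = json.loads(raw)

# Calling build_output(**parsed) here raises:
#   TypeError: build_output() argument after ** must be a mapping, not list

# Hypothetical workaround: unwrap a single-element list before unpacking.
if isinstance(parsed, list) and len(parsed) == 1 and isinstance(parsed[0], dict):
    parsed = parsed[0]

output = build_output(**parsed)  # now succeeds
print(output)
```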

which model?

warmshao · Mar 24 '25 22:03

> which model?

The model is DeepSeek-R1-AWQ.

The startup parameters are as follows:

[Screenshots: startup parameters]

UsFaker · Mar 25 '25 01:03

Use ollama; you have incorrectly chosen "openai" as the LLM provider.

evcharger · Mar 25 '25 07:03

> Use ollama; you have incorrectly chosen "openai" as the LLM provider.

When I choose ollama, the request gets redirected: I set the address to http://192.168.1.2:8000/v1/chat/completions, but the URL the vLLM server actually receives is http://192.168.1.2:8000/v1/chat/completions/api/chat. Why does this happen?

[Screenshots]

UsFaker · Mar 25 '25 08:03
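For what it's worth, the extra /api/chat is probably not a server-side redirect. Ollama clients talk to Ollama's native REST API, whose chat endpoint is POST {base_url}/api/chat, so the client appends that path to whatever base URL you enter. A minimal sketch of the URL construction, assuming the UI's Ollama provider builds its endpoint this way (which matches the path in the logs):

```python
# Assumed client-side endpoint construction for the Ollama provider:
base_url = "http://192.168.1.2:8000/v1/chat/completions"  # value entered in the UI
endpoint = f"{base_url.rstrip('/')}/api/chat"             # Ollama's native chat route

print(endpoint)
# http://192.168.1.2:8000/v1/chat/completions/api/chat  <- the URL seen in the logs
```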

Is it because the framework currently supports only Ollama and can't use vLLM?

UsFaker · Mar 25 '25 11:03

I have the same problem. Has it been solved?

jinlong2101 · Apr 10 '25 03:04

> > Use ollama; you have incorrectly chosen "openai" as the LLM provider.
>
> When I choose ollama, the request gets redirected: I set the address to http://192.168.1.2:8000/v1/chat/completions, but the URL the vLLM server actually receives is http://192.168.1.2:8000/v1/chat/completions/api/chat. Why does this happen?
>
> [Screenshots]

Shouldn't the Base URL be just http://192.168.2.2:8000/? And since Ollama is selected, isn't /v1/ unnecessary unless you choose openai?

Terramoto · Apr 16 '25 08:04
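Building on the point above: vLLM serves an OpenAI-compatible API, so the "openai" provider should work as long as the base URL stops at /v1 and the client is left to append /chat/completions itself. A minimal sketch using langchain_openai; the model name and API key are placeholders and must match how vLLM was launched:

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="http://192.168.1.2:8000/v1",  # stop at /v1, not .../v1/chat/completions
    api_key="EMPTY",           # vLLM accepts any key unless started with --api-key
    model="DeepSeek-R1-AWQ",   # must match the served model name
)

print(llm.invoke("ping").content)
```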

try the new code

warmshao · May 11 '25 03:05