web_surfer can't get a response with "arguments"
The web_surfer's get_llm_response() returns a response without the "arguments" field, and I printed the following error message:
How can I solve this? Is it a problem with my local LLM? My local LLM is OpenAI-compatible, but it is not OpenAI itself.
Hi @oras903! Have you made any modifications to the websurfer code before getting this error? Do you mind showing the full trace? Does your LLM support tool calling in the OpenAI format?
My AI colleague tells me the local LLM supports tool calling in the OpenAI format.
When I use an OpenAI LLM, the system works well, but when I use the local LLM, web_surfer doesn't work, so I added print() calls and exception handling to trace the output of some steps.
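Roughly the kind of tracing added (a minimal sketch, not the actual diff; the attribute names assume the response objects follow the OpenAI Python SDK shape, and `trace_tool_call` is a hypothetical helper, not part of magentic-ui):

```python
import json

def trace_tool_call(message) -> None:
    # Dump the tool-call payload (if any) before it is parsed, so a missing
    # or malformed "arguments" field shows up in the log instead of failing
    # silently deeper inside web_surfer.
    tool_calls = getattr(message, "tool_calls", None)
    if not tool_calls:
        print("no tool_calls on the model response:", message)
        return
    for call in tool_calls:
        print("tool name:", call.function.name)
        try:
            print("arguments:", json.loads(call.function.arguments))
        except (TypeError, json.JSONDecodeError) as exc:
            print("unparsable arguments:", call.function.arguments, repr(exc))
```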
Which model are you using?
the web_surfer
Hi @oras903, I have recently been playing around with local models as well and have seen similar issues. The problem is that local models tend to be smaller and less powerful since they have to fit on your personal computer.
A lot of small local models do not support tool calling or vision capabilities, which breaks the web surfer functionality; that is why magentic-ui works with OpenAI but not with your current model.
Following up on Hussein's question: what model (not agent) are you using? You can find this information in your config.yaml file; it may be something like Qwen, Llama, DeepSeek, etc.
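If you want to check tool-calling support directly, here is a minimal sketch against an OpenAI-compatible endpoint; the `base_url`, `api_key`, model name, and the `visit_url` tool are placeholders (not the real web_surfer schema), so substitute the values from your config.yaml:

```python
from openai import OpenAI

# Placeholders: point these at your local server and the model from config.yaml.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

tools = [{
    "type": "function",
    "function": {
        "name": "visit_url",  # illustrative tool, not magentic-ui's actual schema
        "description": "Navigate the browser to a URL.",
        "parameters": {
            "type": "object",
            "properties": {"url": {"type": "string"}},
            "required": ["url"],
        },
    },
}]

response = client.chat.completions.create(
    model="local-model",  # placeholder model name
    messages=[{"role": "user", "content": "Open https://example.com"}],
    tools=tools,
)

message = response.choices[0].message
if message.tool_calls:
    # A compliant server returns a function name plus a JSON string in "arguments".
    call = message.tool_calls[0]
    print(call.function.name, call.function.arguments)
else:
    print("No tool_calls in the response; the model did not emit one.")
```

If "arguments" comes back empty, or as plain text instead of JSON, that matches the error above and points at the server's tool-calling support rather than magentic-ui.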