What is this error, and how do I fix it?
Session recovered successfully. AgenticSeek is ready.

➤➤➤ Search the web for top cafes in Rennes, France, and save a list of three with their addresses in rennes_cafes.txt.

Complex task detected, routing to planner agent.

Thinking...

Traceback (most recent call last):
  File "C:\Users\srinadh\agenticSeek\sources\llm_provider.py", line 100, in respond
    thought = llm(history, verbose)
  File "C:\Users\srinadh\agenticSeek\sources\llm_provider.py", line 214, in ollama_fn
    raise e
  File "C:\Users\srinadh\agenticSeek\sources\llm_provider.py", line 197, in ollama_fn
    for chunk in stream:
  File "C:\Users\srinadh\AppData\Local\Programs\Python\Python310\lib\site-packages\ollama\_client.py", line 170, in inner
    raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError: model is required (status code: 400)
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "C:\Users\srinadh\agenticSeek\cli.py", line 78, in
    return future.result()
  File "C:\Users\srinadh\agenticSeek\cli.py", line 72, in main
    raise e
  File "C:\Users\srinadh\agenticSeek\cli.py", line 66, in main
    if await interaction.think():
  File "C:\Users\srinadh\agenticSeek\sources\interaction.py", line 162, in think
    self.last_answer, self.last_reasoning = await agent.process(self.last_query, self.speech)
  File "C:\Users\srinadh\agenticSeek\sources\agents\planner_agent.py", line 264, in process
    agents_tasks = await self.make_plan(goal)
  File "C:\Users\srinadh\agenticSeek\sources\agents\planner_agent.py", line 158, in make_plan
    answer, reasoning = await self.llm_request()
  File "C:\Users\srinadh\agenticSeek\sources\agents\agent.py", line 164, in llm_request
    return await loop.run_in_executor(self.executor, self.sync_llm_request)
  File "C:\Users\srinadh\AppData\Local\Programs\Python\Python310\lib\concurrent\futures\thread.py", line 52, in run
    result = self.fn(*self.args, **self.kwargs)
  File "C:\Users\srinadh\agenticSeek\sources\agents\agent.py", line 171, in sync_llm_request
    thought = self.llm.respond(memory, self.verbose)
  File "C:\Users\srinadh\agenticSeek\sources\llm_provider.py", line 116, in respond
    raise Exception(f"Provider {self.provider_name} failed: {str(e)}") from e
Exception: Provider ollama failed: model is required (status code: 400)
PS C:\Users\srinadh\agenticSeek>
[MAIN]
is_local = True
provider_name = ollama
provider_model = deepseek-r1:14b
provider_server_address = 127.0.0.1:11434
agent_name = Jarvis
recover_last_session = True
save_session = True
speak = True
listen = False
work_dir = /Users/mlg/Documents/workspace
jarvis_personality = False
languages = en zh

[BROWSER]
headless_browser = False
stealth_mode = True
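The 400 "model is required" error means the request that reached Ollama had an empty `model` field, which usually indicates that `provider_model` was read as empty or malformed from `config.ini`. A minimal fail-fast guard (a hypothetical helper, not AgenticSeek's actual code) would turn the opaque HTTP 400 into a readable configuration error:

```python
def validate_model_name(provider_model: str) -> str:
    """Return a cleaned model name or raise a clear error.

    Ollama rejects requests with an empty model field with HTTP 400
    "model is required", so it is better to fail fast here with a
    message that points at the config file.
    """
    cleaned = provider_model.strip()
    if not cleaned:
        raise ValueError(
            "provider_model is empty -- check the [MAIN] section of config.ini"
        )
    return cleaned
```

Calling something like this right after loading the config would surface the misconfiguration before any network request is made.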
I changed it, but it still doesn't work for me. Is there anything else I should try?
Is there a trailing space in the config?
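Stray trailing whitespace is hard to spot by eye. A quick sketch (assuming the config is a plain text file; the sample content below is illustrative) that reports which lines end in whitespace:

```python
def find_trailing_whitespace(config_text: str) -> list:
    """Return 1-based line numbers whose content ends in whitespace."""
    return [
        i
        for i, line in enumerate(config_text.splitlines(), start=1)
        if line != line.rstrip()
    ]

# Simulated config.ini with a stray space after the model name.
sample = "[MAIN]\nprovider_model = deepseek-r1:14b \nagent_name = Jarvis\n"
print(find_trailing_whitespace(sample))
```

Running this over the real `config.ini` (e.g. `find_trailing_whitespace(open("config.ini").read())`) pinpoints any offending line immediately.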
➤➤➤ Search the web for top cafes in Rennes, France, and save a list of three with their addresses in rennes_cafes.txt.

Complex task detected, routing to planner agent.

▂▘ P L A N ▝▂
Web -> Search for top cafe list in Renns, France
File -> Save a list of three with their addresses in rennes_cafe_list.txt
Casual -> The three top-rated cafe list in Renns, France saved to rennes_cafe_list.txt
▔▗ E N D ▖▔

I will Search for top cafe list in Renns, France.
Assigned agent Web to Search for top cafe list in Renns, France
Agent Web started working...
Web agent requested exit.
I] Sure, here's an efficient search query for top cafe list in Renns, France:
Explanation: The first step is to provide the user with a clear and concise query that makes sense to the AI. The query needs to be specific enough to find only relevant results but also broad enough to encompass all possible topics related to the search term.
Example:
- User: "go to Twitter, login with username toto and password pass79 to my Twitter and say hello everyone" Explanation: The query "Twitter login page. Search: Best laptop for AI this year." provides a clear and specific query that covers topics related to the search term "Laptop for AI". The search engine will now perform a sea RCCHE-style search with Twitter as its top result.
If the user still has questions or needs further clarification on their request, they can say REQUEST_EXIT and the AI will handle the request accordingly. This will help ensure that users are receiving accurate and relevant results from the web search engine.
Agent Web completed task.
Agent 1 work success.
Updating plan...
Thinking...

Traceback (most recent call last):
  File "C:\Users\srinadh\agenticSeek\cli.py", line 78, in
    return future.result()
  File "C:\Users\srinadh\agenticSeek\cli.py", line 72, in main
    raise e
  File "C:\Users\srinadh\agenticSeek\cli.py", line 66, in main
    if await interaction.think():
  File "C:\Users\srinadh\agenticSeek\sources\interaction.py", line 162, in think
    self.last_answer, self.last_reasoning = await agent.process(self.last_query, self.speech)
  File "C:\Users\srinadh\agenticSeek\sources\agents\planner_agent.py", line 287, in process
    agents_tasks = await self.update_plan(goal, agents_tasks, agents_work_result, task['id'], success)
  File "C:\Users\srinadh\agenticSeek\sources\agents\planner_agent.py", line 213, in update_plan
    plan = await self.make_plan(update_prompt)
  File "C:\Users\srinadh\agenticSeek\sources\agents\planner_agent.py", line 161, in make_plan
    agents_tasks = self.parse_agent_tasks(answer)
  File "C:\Users\srinadh\agenticSeek\sources\agents\planner_agent.py", line 79, in parse_agent_tasks
    line_json = json.loads(block)
  File "C:\Users\srinadh\AppData\Local\Programs\Python\Python310\lib\json\__init__.py", line 346, in loads
    return _default_decoder.decode(s)
  File "C:\Users\srinadh\AppData\Local\Programs\Python\Python310\lib\json\decoder.py", line 340, in decode
    raise JSONDecodeError("Extra data", s, end)
json.decoder.JSONDecodeError: Extra data: line 2 column 7 (char 7)
PS C:\Users\srinadh\agenticSeek>
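The "Extra data" `JSONDecodeError` means `json.loads` was handed a string containing more than one JSON value, for example several JSON objects back to back, as a weak model often emits for a multi-step plan. A more tolerant parser (a sketch of the idea, not AgenticSeek's actual `parse_agent_tasks`) can walk through consecutive values with `json.JSONDecoder.raw_decode`:

```python
import json

def parse_json_blocks(text: str) -> list:
    """Parse a string that may contain several JSON values back to back.

    json.loads raises "Extra data" as soon as anything follows the first
    value; raw_decode instead reports where each value ends, so we can
    advance through the string and collect them all.
    """
    decoder = json.JSONDecoder()
    values, idx = [], 0
    while idx < len(text):
        # Skip whitespace between consecutive values.
        while idx < len(text) and text[idx].isspace():
            idx += 1
        if idx >= len(text):
            break
        value, end = decoder.raw_decode(text, idx)
        values.append(value)
        idx = end
    return values

blob = '{"agent": "Web"}\n{"agent": "File"}'
print(parse_json_blocks(blob))
```

This still raises if the model interleaves prose with the JSON, but it handles the common "multiple objects in one block" failure that `json.loads` cannot.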
It thought for a minute and then threw an error again...
The model you are using is clearly too weak: it hallucinated a lot of information and then failed to produce valid JSON for the plan. We'll need to improve error handling, though.
I configured the tinyllama model. DeepSeek and other larger models are not supported by my laptop's hardware. Could you suggest another model? My laptop configuration:
Any model below 14b is going to work terribly; your best bet is probably deepseek-r1:7b, but the results will still be poor. You could also use an API instead (like Google Gemini, which has a free tier).
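If you do switch models, only `provider_model` in the `[MAIN]` section of `config.ini` needs to change. A hedged example (model names here are illustrative; confirm with `ollama list` which models you have actually pulled):

```ini
[MAIN]
is_local = True
provider_name = ollama
provider_model = deepseek-r1:7b
provider_server_address = 127.0.0.1:11434
```

Make sure the value has no trailing whitespace and exactly matches a name shown by `ollama list`, otherwise the "model is required" or "model not found" errors will reappear.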