
Problem with intent hitting multiple entities

Open richardsorensson opened this issue 1 year ago • 5 comments

I'm getting the response "Unexpected error during intent recognition" for broader questions such as "turn off all lights", while a narrower phrase like "turn off all office lights" does work.

I haven't done any deeper troubleshooting yet, but I've started by removing all groups (light / cover) to ensure no custom groupings are causing the issue.

Has anyone run into the same issue and found a solution?

Here are the raw log details.

Logger: homeassistant.components.assist_pipeline.pipeline
Source: components/assist_pipeline/pipeline.py:938
Integration: Assist pipeline (documentation, issues)
First occurred: 16:50:11 (2 occurrences)
Last logged: 16:53:23

Unexpected error during intent recognition
Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/components/assist_pipeline/pipeline.py", line 938, in recognize_intent
    conversation_result = await conversation.async_converse(
  File "/usr/src/homeassistant/homeassistant/components/conversation/__init__.py", line 467, in async_converse
    result = await agent.async_process(
  File "/config/custom_components/extended_openai_conversation/__init__.py", line 157, in async_process
    response = await self.query(user_input, messages, exposed_entities, 0)
  File "/config/custom_components/extended_openai_conversation/__init__.py", line 280, in query
    message = await self.execute_function_call(
  File "/config/custom_components/extended_openai_conversation/__init__.py", line 319, in execute_function
    arguments = json.loads(message["function_call"]["arguments"])
  File "/usr/local/lib/python3.11/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
  File "/usr/local/lib/python3.11/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/local/lib/python3.11/json/decoder.py", line 353, in raw_decode
    obj, end = self.scan_once(s, idx)
json.decoder.JSONDecodeError: Expecting property name enclosed in double quotes: line 26 column 6 (char 482)

richardsorensson avatar Nov 27 '23 15:11 richardsorensson

You can try increasing the response token limit. This happens because the response JSON is truncated by the default limit of 150 tokens.

Keep in mind that more tokens cost more.
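The traceback above is exactly what truncation produces: the model's function_call arguments get cut off mid-object, and `json.loads` then fails with "Expecting property name enclosed in double quotes". A minimal sketch (the entity IDs here are made up for illustration):

```python
import json

# A function_call "arguments" payload cut off mid-object, as happens
# when the response hits the token limit right after a comma.
truncated = '{\n  "entities": ["light.office", "light.hall"],\n  '

try:
    json.loads(truncated)
except json.JSONDecodeError as err:
    # After the comma, the parser expects the next property name,
    # but the string simply ends.
    print(err.msg)  # Expecting property name enclosed in double quotes
```

Broader commands like "turn off all lights" list more entities in the arguments, so they hit the limit sooner than narrow ones.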

jekalmin avatar Nov 27 '23 16:11 jekalmin

That did the trick, thanks a lot!

richardsorensson avatar Nov 27 '23 20:11 richardsorensson

Did some further testing with the token limit maxed out at 4096.

It still fails on some devices: the entity states are changed almost immediately (after just a second or two), but the chat keeps showing "..." for roughly 20-30 seconds and then responds with "Unexpected error during intent recognition".

Error below:

Logger: homeassistant.components.assist_pipeline.pipeline
Source: components/assist_pipeline/pipeline.py:938
Integration: Assist pipeline (documentation, issues)
First occurred: 22:38:15 (1 occurrences)
Last logged: 22:38:15

Unexpected error during intent recognition
Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/components/assist_pipeline/pipeline.py", line 938, in recognize_intent
    conversation_result = await conversation.async_converse(
  File "/usr/src/homeassistant/homeassistant/components/conversation/__init__.py", line 467, in async_converse
    result = await agent.async_process(
  File "/config/custom_components/extended_openai_conversation/__init__.py", line 157, in async_process
    response = await self.query(user_input, messages, exposed_entities, 0)
  File "/config/custom_components/extended_openai_conversation/__init__.py", line 280, in query
    message = await self.execute_function_call(
  File "/config/custom_components/extended_openai_conversation/__init__.py", line 332, in execute_function
    return await self.query(user_input, messages, exposed_entities, n_requests)
  File "/config/custom_components/extended_openai_conversation/__init__.py", line 280, in query
    message = await self.execute_function_call(
  File "/config/custom_components/extended_openai_conversation/__init__.py", line 332, in execute_function
    return await self.query(user_input, messages, exposed_entities, n_requests)
  File "/config/custom_components/extended_openai_conversation/__init__.py", line 280, in query
    message = await self.execute_function_call(
  File "/config/custom_components/extended_openai_conversation/__init__.py", line 332, in execute_function
    return await self.query(user_input, messages, exposed_entities, n_requests)
  File "/config/custom_components/extended_openai_conversation/__init__.py", line 280, in query
    message = await self.execute_function_call(
  File "/config/custom_components/extended_openai_conversation/__init__.py", line 319, in execute_function
    arguments = json.loads(message["function_call"]["arguments"])
  File "/usr/local/lib/python3.11/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
  File "/usr/local/lib/python3.11/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/local/lib/python3.11/json/decoder.py", line 353, in raw_decode
    obj, end = self.scan_once(s, idx)
json.decoder.JSONDecodeError: Expecting property name enclosed in double quotes: line 2 column 3 (char 4)

richardsorensson avatar Nov 27 '23 21:11 richardsorensson

I should mention I was running gpt-4-1106-preview and changed to gpt-3.5-turbo-1106.

Now it works without errors again, with a proper response.

richardsorensson avatar Nov 27 '23 21:11 richardsorensson

It seems to be the same error (token limit) as above, but with functions being called multiple times.

  • Add logging and see which functions are called and with what arguments.
  • If you want to limit the number of function calls, set Maximum function calls to a small number such as 1.
  • How many functions get called depends on how you phrased the request and how the model behaves. If you want to change how the model behaves on a particular question, tell it to do so in the prompt.
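The repeated `query` → `execute_function_call` frames in the second traceback show this recursion: each function call result is fed back to the model, which may call another function. A minimal sketch of the logging and call-limit guard suggested above (the helper name and structure are hypothetical, not the integration's actual code):

```python
import json
import logging

_LOGGER = logging.getLogger(__name__)
MAX_FUNCTION_CALLS = 1  # mirrors the "Maximum function calls" option


def parse_function_args(raw: str, n_requests: int) -> dict:
    """Log the raw arguments, enforce the call limit, then parse.

    Hypothetical helper for illustration; the real integration does
    this work inside execute_function.
    """
    # Log before parsing, so truncated/malformed payloads are visible.
    _LOGGER.debug("function_call arguments (%d chars): %s", len(raw), raw)

    if n_requests >= MAX_FUNCTION_CALLS:
        raise RuntimeError("maximum number of function calls reached")

    try:
        return json.loads(raw)
    except json.JSONDecodeError as err:
        # Surface a clearer message than the bare JSONDecodeError.
        raise ValueError(f"model returned malformed JSON: {err}") from err
```

With debug logging enabled, the raw arguments show up in the log right before any parse failure, which makes it easy to see whether the payload was truncated.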

jekalmin avatar Nov 28 '23 08:11 jekalmin

Closing the issue. Feel free to reopen it.

jekalmin avatar Nov 12 '24 13:11 jekalmin