`submit_tool_outputs` closes a thread instead of changing its run status
Confirm this is an issue with the Python library and not an underlying OpenAI API
- [X] This is an issue with the Python library
Describe the bug
When using the client.beta.threads.runs.submit_tool_outputs method in the Python SDK, the thread closes and exits instead of updating the run status.
To Reproduce
Steps to reproduce:
1. Initialize a new run using client.beta.threads.runs.create_and_poll:
run = client.beta.threads.runs.create_and_poll(
    thread_id=thread.id,
    assistant_id=business.assistant_id,
)
2. Submit tool outputs using client.beta.threads.runs.submit_tool_outputs:
run = client.beta.threads.runs.submit_tool_outputs(
    thread_id=thread.id,
    run_id=run.id,
    tool_outputs=[{"tool_call_id": tool_call.id, "output": output}],
)
3. Observe that the thread closes and exits instead of updating the run status (see the polling sketch below).
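A minimal way to watch the status after step 2, as a rough diagnostic sketch (not part of the original snippet; client, thread, and run are the same objects as in the steps above):

import time

# Poll the run until it leaves a non-terminal state, then print the result.
while run.status in ("queued", "in_progress"):
    time.sleep(1)
    run = client.beta.threads.runs.retrieve(run.id, thread_id=thread.id)

print(run.status)  # expected "completed"; in this report it never gets there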
Code snippets
No response
OS
Ubuntu 22.04.4 LTS
Python version
Python v3.11.3
Library version
openai v1.31.1
Hi! I help work on the SDK and wanted to understand your issue better.
Can you provide a bit more information on what you mean by 'closes and exits instead of updating the run status'? In particular, I am not sure what you mean by 'closes and exits'. The thread can reach a terminal state like 'completed', and there is also the state of the 'run' and the 'run step'. There are a couple of components with a status/state, so it would be helpful if you could clarify which of these is changing (or not changing) in a way you think is unexpected.
You can see some more information on the lifecycle here: https://platform.openai.com/docs/assistants/how-it-works/runs-and-run-steps
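For example, a quick way to print both of those statuses, as a rough sketch (run_id and thread_id stand in for whatever ids you already have):

# Inspect the overall run status and the status of each run step.
run = client.beta.threads.runs.retrieve(run_id, thread_id=thread_id)
print("run:", run.status)  # e.g. queued / in_progress / requires_action / completed

steps = client.beta.threads.runs.steps.list(run_id, thread_id=thread_id)
for step in steps:
    print("run step:", step.type, step.status)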
@pstern-sl Thanks for replying to my question. I understand there wasn't a lot of information in my initial message. I've included the code in this reply for more context.
When I call the handle_incoming_message function, it starts a thread, creates a message on it, and then calls the create_and_poll function.
When run.status is requires_action, the submit_tool_outputs function executes correctly, and I get the new run. However, what happens after that is unclear to me. As soon as submit_tool_outputs executes, the run that started from handle_incoming_message exits. This means that the run returned by submit_tool_outputs never reaches run.status == "completed". As a result, my user doesn't receive the booking confirmation message.
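Simplified, the flow inside the task is roughly this (a paraphrased sketch rather than the exact code; book_appointment and send_whatsapp are my own helpers):

# Paraphrased sketch of the huey task, not the exact code.
def handle_incoming_message(thread, business, user_text):
    client.beta.threads.messages.create(
        thread_id=thread.id, role="user", content=user_text
    )
    run = client.beta.threads.runs.create_and_poll(
        thread_id=thread.id,
        assistant_id=business.assistant_id,
    )
    print(run.status)

    if run.status == "requires_action":
        tool_outputs = []
        for tool_call in run.required_action.submit_tool_outputs.tool_calls:
            print(tool_call.function.name)        # e.g. book_appointment
            output = book_appointment(tool_call)  # my own handler
            tool_outputs.append({"tool_call_id": tool_call.id, "output": output})

        run = client.beta.threads.runs.submit_tool_outputs(
            thread_id=thread.id,
            run_id=run.id,
            tool_outputs=tool_outputs,
        )
        print(run.status)  # prints "queued", then the task simply ends here

    if run.status == "completed":
        send_whatsapp(thread)  # confirmation back to the user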
Here is the log that I got from the huey scheduler.
(venv) simon@VDL217:~/Workspace/getappointment$ python manage.py run_huey
[2024-06-14 04:16:42,996] INFO:huey.consumer:MainThread:Huey consumer started with 1 thread, PID 1203714 at 2024-06-14 04:16:42.996742
[2024-06-14 04:16:42,996] INFO:huey.consumer:MainThread:Scheduler runs every 1 second(s).
[2024-06-14 04:16:42,996] INFO:huey.consumer:MainThread:Periodic tasks are enabled.
[2024-06-14 04:16:42,996] INFO:huey.consumer:MainThread:The following commands are available:
+ getappointment.core.tasks.send_whatsapp
+ getappointment.core.tasks.assistant_create
+ getappointment.core.tasks.assistant_update
+ getappointment.core.tasks.handle_incoming_message
[2024-06-14 04:17:38,858] INFO:huey:Worker-1:Added task a71c65a9-52f2-43b1-a200-171d0880867e to schedule, eta 2024-06-14 04:17:40.856764
[2024-06-14 04:17:41,001] INFO:huey:Worker-1:Executing getappointment.core.tasks.handle_incoming_message: a71c65a9-52f2-43b1-a200-171d0880867e @2024-06-14 04:17:40.856764
completed
[2024-06-14 04:17:45,224] INFO:huey:Worker-1:getappointment.core.tasks.handle_incoming_message: a71c65a9-52f2-43b1-a200-171d0880867e @2024-06-14 04:17:40.856764 executed in 4.222s
[2024-06-14 04:17:45,225] INFO:huey:Worker-1:Executing getappointment.core.tasks.send_whatsapp: a3a55b33-4c54-40a9-903a-7301ea9571a5
[2024-06-14 04:17:48,503] INFO:huey:Worker-1:getappointment.core.tasks.send_whatsapp: a3a55b33-4c54-40a9-903a-7301ea9571a5 executed in 3.278s
[2024-06-14 04:19:00,107] INFO:huey:Worker-1:Added task 6e2e2c7c-2c28-42c3-8641-4158ea2f6f3b to schedule, eta 2024-06-14 04:19:02.106976
[2024-06-14 04:19:03,001] INFO:huey:Worker-1:Executing getappointment.core.tasks.handle_incoming_message: 6e2e2c7c-2c28-42c3-8641-4158ea2f6f3b @2024-06-14 04:19:02.106976
completed
[2024-06-14 04:19:07,215] INFO:huey:Worker-1:getappointment.core.tasks.handle_incoming_message: 6e2e2c7c-2c28-42c3-8641-4158ea2f6f3b @2024-06-14 04:19:02.106976 executed in 4.214s
[2024-06-14 04:19:07,217] INFO:huey:Worker-1:Executing getappointment.core.tasks.send_whatsapp: 3b07479d-ddf5-4a9a-8c2c-77c8aa58b046
[2024-06-14 04:19:08,215] INFO:huey:Worker-1:getappointment.core.tasks.send_whatsapp: 3b07479d-ddf5-4a9a-8c2c-77c8aa58b046 executed in 0.999s
[2024-06-14 04:19:33,068] INFO:huey:Worker-1:Added task 5200dd10-efaf-4117-9b9e-148a97cc68e8 to schedule, eta 2024-06-14 04:19:35.067797
[2024-06-14 04:19:36,001] INFO:huey:Worker-1:Executing getappointment.core.tasks.handle_incoming_message: 5200dd10-efaf-4117-9b9e-148a97cc68e8 @2024-06-14 04:19:35.067797
requires_action
book_appointment
queued
run queued status will be changed to in-progress immediately.
[2024-06-14 04:19:39,821] INFO:huey:Worker-1:getappointment.core.tasks.handle_incoming_message: 5200dd10-efaf-4117-9b9e-148a97cc68e8 @2024-06-14 04:19:35.067797 executed in 3.819s
As you can see, each time run.status is completed the user receives a message from the bot. I also attached the WhatsApp conversation for your reference. When book_appointment gets called, run.status changes to queued, but then it exits the thread instead of changing the status to completed.
Do you think this is a problem with the huey scheduler? I chose huey because it's lightweight. Do you think I should try this with Celery instead?
@rajasimon -- did you find a solution for this? I think I'm seeing the same issue, and what I've noticed is that I can swap the model to gpt-4o-mini, and it works using client.beta.threads.runs.submit_tool_outputs_and_poll. However, with other gpt models, it does not.
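For reference, the call I'm describing is roughly this (simplified; tool_call and output come from my own tool handling):

# Simplified sketch of the call whose behaviour differs per model for me.
run = client.beta.threads.runs.submit_tool_outputs_and_poll(
    thread_id=thread.id,
    run_id=run.id,
    tool_outputs=[{"tool_call_id": tool_call.id, "output": output}],
)
print(run.status)  # reaches "completed" with gpt-4o-mini, but not with the other models I tried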
Facing the same issue here: submitting the tool outputs is not changing the run status.
Thanks for reporting!
This sounds like an issue with the underlying OpenAI API and not the SDK, so I'm going to go ahead and close this issue.
Would you mind reposting at community.openai.com?