
Assistants API: Unexpected `tool_call` type in `on_tool_call_created`

Open ekassos opened this issue 9 months ago • 2 comments

Confirm this is an issue with the Python library and not an underlying OpenAI API

  • [x] This is an issue with the Python library

Describe the bug

When our implementation of the `on_tool_call_created` method of `AsyncAssistantEventHandler` is invoked, `tool_call` is a plain `dict` rather than a `ToolCall`. In our testing, this occurs only when the tool call being created is File Search, not Code Interpreter (we do not use Function Calling), and only when using `AsyncOpenAI`, not `AsyncAzureOpenAI`.

See definition of on_tool_call_created: https://github.com/openai/openai-python/blob/d6bb8c14e66605ad2b7ed7bd62951014cd21b576/src/openai/lib/streaming/_assistants.py#L186-L187
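
For reference, the linked declaration annotates the parameter as `ToolCall`, which is why a Pydantic model is expected here; paraphrased below (see the link for the exact wording):

async def on_tool_call_created(self, tool_call: ToolCall) -> None:
    # Declared to receive a ToolCall model, not a plain dict.
    ...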

To Reproduce

  1. Implement on_tool_call_created as
async def on_tool_call_created(self, tool_call) -> None:
    self.enqueue(
        {
            "type": "tool_call_created",
            "tool_call": tool_call.model_dump(),
        }
    )
  2. When a File Search tool call is created, `on_tool_call_created` fails with the error `'dict' object has no attribute 'model_dump'` (see the workaround sketch below).
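
A minimal defensive sketch for the handler above, assuming the raw dict payload mirrors the `ToolCall` fields and can be enqueued directly:

async def on_tool_call_created(self, tool_call) -> None:
    # Work around the bug: File Search tool calls currently arrive as a plain
    # dict, while Code Interpreter tool calls arrive as a ToolCall model.
    payload = tool_call if isinstance(tool_call, dict) else tool_call.model_dump()
    self.enqueue(
        {
            "type": "tool_call_created",
            "tool_call": payload,
        }
    )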

Code snippets

# Imports assumed by this snippet (not shown in the original report).
import io
from typing import Dict

import openai
import orjson
from openai.types.beta.threads import ImageFile


class BufferedStreamHandler(openai.AsyncAssistantEventHandler):
    """Buffer assistant stream events as newline-delimited JSON."""
    def __init__(self, file_names: dict[str, str], *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.__buffer = io.BytesIO()
        self.file_names = file_names

    def enqueue(self, data: Dict) -> None:
        self.__buffer.write(orjson.dumps(data))
        self.__buffer.write(b"\n")

    def flush(self) -> bytes:
        value = self.__buffer.getvalue()
        self.__buffer.truncate(0)
        self.__buffer.seek(0)
        return value

    async def on_image_file_done(self, image_file: ImageFile) -> None:
        self.enqueue(
            {
                "type": "image_file_done",
                "file_id": image_file.file_id,
            }
        )

    async def on_message_created(self, message) -> None:
        self.enqueue(
            {
                "type": "message_created",
                "role": "assistant",
                "message": message.model_dump(),
            }
        )

    async def on_message_delta(self, delta, snapshot) -> None:
        self.enqueue(
            {
                "type": "message_delta",
                "delta": delta.model_dump(),
            }
        )

    async def on_tool_call_created(self, tool_call) -> None:
        self.enqueue(
            {
                "type": "tool_call_created",
                "tool_call": tool_call.model_dump(),
            }
        )

    async def on_tool_call_delta(self, delta, snapshot) -> None:
        self.enqueue(
            {
                "type": "tool_call_delta",
                "delta": delta.model_dump(),
            }
        )

    async def on_timeout(self) -> None:
        self.enqueue(
            {
                "type": "error",
                "detail": "Stream timed out waiting for response",
            }
        )

    async def on_done(self, run) -> None:
        self.enqueue({"type": "done"})

    async def on_exception(self, exception) -> None:
        self.enqueue(
            {
                "type": "error",
                "detail": str(exception),
            }
        )
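
For context, a sketch of how this handler could be driven with the Assistants streaming helper (the thread and assistant IDs are placeholders, and `file_names` is left empty for brevity):

import asyncio

import openai


async def main() -> None:
    client = openai.AsyncOpenAI()
    handler = BufferedStreamHandler(file_names={})
    # Placeholder IDs; event_handler receives the callbacks defined above.
    async with client.beta.threads.runs.stream(
        thread_id="thread_...",
        assistant_id="asst_...",
        event_handler=handler,
    ) as stream:
        await stream.until_done()
    # Drain the buffered newline-delimited JSON events.
    print(handler.flush().decode())


asyncio.run(main())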

OS

macOS

Python version

Python v3.11.10

Library version

openai v1.65.3

ekassos avatar Mar 05 '25 15:03 ekassos

Found the problem, had to set a custom query param, you can close this issue

gmirc12 avatar Mar 05 '25 16:03 gmirc12

> Found the problem, had to set a custom query param, you can close this issue

@gmirc12 I think you’re referencing another issue (#2161); this issue remains unresolved.

ekassos avatar Mar 05 '25 16:03 ekassos