
TypeError in the `Using_GPT4_Vision_With_Function_Calling.ipynb` example

Open Saidiibrahim opened this issue 1 year ago • 3 comments

When I get to this section of the notebook:

from typing import Union

# extract the tool call from the response
ORDER_ID = "12345"  # Placeholder order ID for testing
INSTRUCTION_PROMPT = "You are a customer service assistant for a delivery service, equipped to analyze images of packages. If a package appears damaged in the image, automatically process a refund according to policy. If the package looks wet, initiate a replacement. If the package appears normal and not damaged, escalate to agent. For any other issues or unclear images, escalate to agent. You must always use tools!"

def delivery_exception_support_handler(test_image: str):
    payload = {
        "model": MODEL,
        "response_model": Iterable[RefundOrder | ReplaceOrder | EscalateToAgent],
        "tool_choice": "auto",  # automatically select the tool based on the context
        "temperature": 0.0,  # for less diversity in responses
        "seed": 123,  # Set a seed for reproducibility
    }
    payload["messages"] = [
        {
            "role": "user",
            "content": INSTRUCTION_PROMPT,
        },
        {
            "role": "user",
            "content": [
                {
                    "type": "image_url",
                    "image_url": {
                        "url": f"data:image/jpeg;base64,{image_data[test_image]}"
                    }
                },
            ],
        }
    ]
    function_calls = instructor.from_openai(
        OpenAI(), mode=instructor.Mode.PARALLEL_TOOLS
    ).chat.completions.create(**payload)
    for tool in function_calls:
        print(f"- Tool call: {tool.action} for provided img: {test_image}")
        print(f"- Parameters: {tool}")
        print(f">> Action result: {tool(ORDER_ID)}")
        return tool


print("Processing delivery exception support for different package images...")

print("\n===================== Simulating user message 1 =====================")
assert delivery_exception_support_handler("damaged_package").action == "refund_order"

print("\n===================== Simulating user message 2 =====================")
assert delivery_exception_support_handler("normal_package").action == "escalate_to_agent"

print("\n===================== Simulating user message 3 =====================")
assert delivery_exception_support_handler("wet_package").action == "replace_order"

I get this TypeError:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[14], line 45
     42 print("Processing delivery exception support for different package images...")
     44 print("\n===================== Simulating user message 1 =====================")
---> 45 assert delivery_exception_support_handler("damaged_package").action == "refund_order"
     47 print("\n===================== Simulating user message 2 =====================")
     48 assert delivery_exception_support_handler("normal_package").action == "escalate_to_agent"

Cell In[14], line 32, in delivery_exception_support_handler(test_image)
      8 payload = {
      9     "model": MODEL,
     10     "response_model": Iterable[RefundOrder | ReplaceOrder | EscalateToAgent],
   (...)
     13     "seed": 123,  # Set a seed for reproducibility
     14 }
     15 payload["messages"] = [
     16     {
     17         "role": "user",
   (...)
     30     }
     31 ]
---> 32 function_calls = instructor.from_openai(
     33     OpenAI(), mode=instructor.Mode.PARALLEL_TOOLS
     34 ).chat.completions.create(**payload)
     35 for tool in function_calls:
     36     print(f"- Tool call: {tool.action} for provided img: {test_image}")

File ~/Desktop/Builds/Python_Builds/openai-devday/.venv/lib/python3.12/site-packages/instructor/client.py:74, in Instructor.create(self, response_model, messages, max_retries, validation_context, **kwargs)
     64 def create(
     65     self,
     66     response_model: Type[T],
   (...)
     70     **kwargs,
     71 ) -> T:
     72     kwargs = self.handle_kwargs(kwargs)
---> 74     return self.create_fn(
     75         response_model=response_model,
     76         messages=messages,
     77         max_retries=max_retries,
     78         validation_context=validation_context,
     79         **kwargs,
     80     )

File ~/Desktop/Builds/Python_Builds/openai-devday/.venv/lib/python3.12/site-packages/instructor/patch.py:138, in patch.<locals>.new_create_sync(response_model, validation_context, max_retries, *args, **kwargs)
    130 @wraps(func)
    131 def new_create_sync(
    132     response_model: Type[T_Model] = None,
   (...)
    136     **kwargs: T_ParamSpec.kwargs,
    137 ) -> T_Model:
---> 138     response_model, new_kwargs = handle_response_model(
    139         response_model=response_model, mode=mode, **kwargs
    140     )
    141     response = retry_sync(
    142         func=func,
    143         response_model=response_model,
   (...)
    148         mode=mode,
    149     )
    150     return response

File ~/Desktop/Builds/Python_Builds/openai-devday/.venv/lib/python3.12/site-packages/instructor/process_response.py:204, in handle_response_model(response_model, mode, **kwargs)
    200 if mode == Mode.PARALLEL_TOOLS:
    201     assert (
    202         new_kwargs.get("stream", False) is False
    203     ), "stream=True is not supported when using PARALLEL_TOOLS mode"
---> 204     new_kwargs["tools"] = handle_parallel_model(response_model)
    205     new_kwargs["tool_choice"] = "auto"
    207     # This is a special case for parallel models

File ~/Desktop/Builds/Python_Builds/openai-devday/.venv/lib/python3.12/site-packages/instructor/dsl/parallel.py:73, in handle_parallel_model(typehint)
     70 def handle_parallel_model(typehint: Type[Iterable[T]]) -> List[Dict[str, Any]]:
     71     the_types = get_types_array(typehint)
     72     return [
---> 73         {"type": "function", "function": openai_schema(model).openai_schema}
     74         for model in the_types
     75     ]

File ~/Desktop/Builds/Python_Builds/openai-devday/.venv/lib/python3.12/site-packages/instructor/function_calls.py:212, in openai_schema(cls)
    211 def openai_schema(cls: Type[BaseModel]) -> OpenAISchema:
---> 212     if not issubclass(cls, BaseModel):
    213         raise TypeError("Class must be a subclass of pydantic.BaseModel")
    215     return wraps(cls, updated=())(
    216         create_model(
    217             cls.__name__ if hasattr(cls, "__name__") else str(cls),
    218             __base__=(cls, OpenAISchema),
    219         )
    220     )

File <frozen abc>:123, in __subclasscheck__(cls, subclass)

TypeError: issubclass() arg 1 must be a class

What am I missing here?

Thanks.

Saidiibrahim avatar Apr 25 '24 23:04 Saidiibrahim

Hitting the same here. Will let you know if I find a fix.

jack4git avatar Apr 27 '24 13:04 jack4git

The second example works fine for me ...

jack4git avatar Apr 27 '24 13:04 jack4git

OK, though I don't fully understand why, these changes seem to fix it:

In the import cell, add: from typing import Union

In the final cell, change the response_model part of payload to be: "response_model": Iterable[Union[RefundOrder, ReplaceOrder, EscalateToAgent]],
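Put together, the change looks like this. This is a minimal, self-contained sketch: the three empty classes are hypothetical stand-ins for the notebook's pydantic tool models, and `MODEL` and the rest of the payload are omitted.

```python
from typing import Iterable, Union

# Hypothetical stand-ins for the notebook's pydantic tool models
class RefundOrder: ...
class ReplaceOrder: ...
class EscalateToAgent: ...

# Before (PEP 604 `|` union inside Iterable, which triggered the error):
# response_model = Iterable[RefundOrder | ReplaceOrder | EscalateToAgent]

# After (typing.Union spelling):
response_model = Iterable[Union[RefundOrder, ReplaceOrder, EscalateToAgent]]
```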

jack4git avatar Apr 28 '24 13:04 jack4git

Interesting 🤔. That hasn't worked for me.

Saidiibrahim avatar Apr 28 '24 23:04 Saidiibrahim

Yeah, in Python 3.12 these two lines should be equivalent, so I suspect some other side effect of making that change is actually fixing things for me.

That said, when I went back to the original, it broke again, and when I made the changes again, it worked again ... so dunno ...

Does the second example work for you?

jack4git avatar Apr 29 '24 00:04 jack4git

Yeah, that works for me now. Your suggested change above is correct; I just had to restart my Jupyter kernel for it to take effect. Thanks 🙏

Saidiibrahim avatar Apr 30 '24 09:04 Saidiibrahim