
OpenAILike LLM errors out with 422

Open renkelvin opened this issue 3 months ago • 18 comments

Hi there, I'm exploring using phidata with a custom LLM that is OpenAI-compatible.

Background

I've confirmed that the BASE_URL and API_KEY work properly with both curl and the OpenAI Python client:

import openai

# Plain OpenAI client pointed at the custom, OpenAI-compatible endpoint
client = openai.OpenAI(
    base_url=BASE_URL,
    api_key=API_KEY,
)

# This call succeeds against the custom endpoint
completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
)

Problem

However, when I try to plug it into phidata, neither of the following two approaches works for me.

OpenAILike

llm=OpenAILike(
    model="gpt-3.5-turbo",
    api_key=API_KEY,
    base_url=BASE_URL,
),

OpenAIChat with openai_client

client = openai.OpenAI(
    base_url=BASE_URL,
    api_key=API_KEY,
)

assistant = Assistant(
    ...
    llm=OpenAIChat(
        model="gpt-3.5-turbo",
        base_url=BASE_URL,
        api_key=API_KEY,
        openai_client=client,  # pass the pre-configured client
    ),
    ...
)

Error

Both of them return the same error:

UnprocessableEntityError: Error code: 422 - {'detail': [{'type': 'string_type', 'loc': ['body', 'tools', 0, 'function', 'parameters', 'properties', 'num_chats', 'type'], 'msg': 'Input should be a valid string', 'input': ['number', 'null'], 'url': 'https://errors.pydantic.dev/2.6/v/string_type'}]}
Traceback:
File "/usr/local/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 584, in _run_script
    exec(code, module.__dict__)
File "/usr/local/app/app/pages/Coffee_Assistant.py", line 204, in <module>
    main()
File "/usr/local/app/app/pages/Coffee_Assistant.py", line 120, in main
    for delta in coffee_assistant.run(question):
File "/usr/local/lib/python3.11/site-packages/phi/assistant/assistant.py", line 529, in _run
    for chunk in current_task.run(message=current_task_message, stream=True, **kwargs):
File "/usr/local/lib/python3.11/site-packages/phi/task/llm/llm_task.py", line 598, in _run
    for response_chunk in self.llm.response_stream(messages=messages):
File "/usr/local/lib/python3.11/site-packages/phi/llm/openai/chat.py", line 436, in response_stream
    for response in self.invoke_stream(messages=messages):
File "/usr/local/lib/python3.11/site-packages/phi/llm/openai/chat.py", line 174, in invoke_stream
    yield from self.client.chat.completions.create(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/openai/_utils/_utils.py", line 277, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/openai/resources/chat/completions.py", line 579, in create
    return self._post(
           ^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/openai/_base_client.py", line 1240, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/openai/_base_client.py", line 921, in request
    return self._request(
           ^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/openai/_base_client.py", line 1020, in _request
    raise self._make_status_error_from_response(err.response) from None

It seems the type of num_chats is not considered valid.
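
Decoding the loc path in the 422 above, the piece the server rejects looks roughly like this (reconstructed from the error message, not an actual dump of the request body):

# body.tools[0].function.parameters.properties.num_chats, per the error's loc path
rejected_property = {
    "num_chats": {
        # a list of JSON Schema types; this backend only accepts a single string here
        "type": ["number", "null"],
    }
}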

Could anyone help with this issue? Thanks!

renkelvin avatar May 02 '24 00:05 renkelvin

@renkelvin here to help. Can you share the log when you run with debug_mode=True, e.g. Assistant(debug_mode=True)?

This seems like a type error somewhere which I would like to fix asap :)

ashpreetbedi avatar May 02 '24 10:05 ashpreetbedi

Thanks @ashpreetbedi. I've set debug_mode=True, but the output looks identical. Did I miss anything?

UnprocessableEntityError: Error code: 422 - {'detail': [{'type': 'string_type', 'loc': ['body', 'tools', 0, 'function', 'parameters', 'properties', 'num_chats', 'type'], 'msg': 'Input should be a valid string', 'input': ['number', 'null'], 'url': 'https://errors.pydantic.dev/2.6/v/string_type'}]}
Traceback:
File "/usr/local/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 584, in _run_script
    exec(code, module.__dict__)
File "/usr/local/app/app/pages/Coffee_Assistant.py", line 199, in <module>
    main()
File "/usr/local/app/app/pages/Coffee_Assistant.py", line 115, in main
    for delta in coffee_assistant.run(question):
File "/usr/local/lib/python3.11/site-packages/phi/assistant/assistant.py", line 529, in _run
    for chunk in current_task.run(message=current_task_message, stream=True, **kwargs):
File "/usr/local/lib/python3.11/site-packages/phi/task/llm/llm_task.py", line 598, in _run
    for response_chunk in self.llm.response_stream(messages=messages):
File "/usr/local/lib/python3.11/site-packages/phi/llm/openai/chat.py", line 436, in response_stream
    for response in self.invoke_stream(messages=messages):
File "/usr/local/lib/python3.11/site-packages/phi/llm/openai/chat.py", line 174, in invoke_stream
    yield from self.client.chat.completions.create(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/openai/_utils/_utils.py", line 277, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/openai/resources/chat/completions.py", line 579, in create
    return self._post(
           ^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/openai/_base_client.py", line 1240, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/openai/_base_client.py", line 921, in request
    return self._request(
           ^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/openai/_base_client.py", line 1020, in _request
    raise self._make_status_error_from_response(err.response) from None

renkelvin avatar May 02 '24 14:05 renkelvin

Hey @ashpreetbedi, any updates on this issue?

renkelvin avatar May 06 '24 16:05 renkelvin

Sorry @renkelvin, I missed this, looking now.

ashpreetbedi avatar May 06 '24 16:05 ashpreetbedi

trying to find where we use num_chats

ashpreetbedi avatar May 06 '24 16:05 ashpreetbedi

Seems it's only used here: https://github.com/phidatahq/phidata/blob/1302da0de264e3419a2c40883ed8e53de8ed794b/phi/assistant/assistant.py#L1227

renkelvin avatar May 06 '24 23:05 renkelvin

Hi @ashpreetbedi, any updates on this issue? Or anything I can help with?

renkelvin avatar May 08 '24 16:05 renkelvin

@renkelvin I tried everything but can't reproduce; we don't even use num_chats anywhere.

ashpreetbedi avatar May 08 '24 16:05 ashpreetbedi

Can you do a quick test with OpenAI GPT-4?

ashpreetbedi avatar May 08 '24 16:05 ashpreetbedi

This will help us isolate whether it's a model/API issue or a phidata issue.

ashpreetbedi avatar May 08 '24 16:05 ashpreetbedi

Sure, do you mean test OpenAILike with GPT-4?

renkelvin avatar May 08 '24 16:05 renkelvin

Yes please @renkelvin, can you test with just OpenAIChat & OpenAILike?

ashpreetbedi avatar May 08 '24 16:05 ashpreetbedi

OpenAIChat works fine with GPT-4.

llm=OpenAIChat(
    model="gpt-4",
    max_tokens=ai_settings.default_max_tokens,
    temperature=ai_settings.default_temperature,
),

OpenAILike failed with 404, which is expected since there is no "gpt-4" model served at the BASE_URL.

llm=OpenAILike(
    model="gpt-4",
    api_key=API_KEY,
    base_url=BASE_URL,
),

NotFoundError: Error code: 404 - {'detail': 'Not Found'}
Traceback:
File "/usr/local/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 584, in _run_script
    exec(code, module.__dict__)
File "/usr/local/app/app/pages/Coffee_Assistant.py", line 198, in <module>
    main()
File "/usr/local/app/app/pages/Coffee_Assistant.py", line 114, in main
    for delta in coffee_assistant.run(question):
File "/usr/local/lib/python3.11/site-packages/phi/assistant/assistant.py", line 529, in _run
    for chunk in current_task.run(message=current_task_message, stream=True, **kwargs):
File "/usr/local/lib/python3.11/site-packages/phi/task/llm/llm_task.py", line 598, in _run
    for response_chunk in self.llm.response_stream(messages=messages):
File "/usr/local/lib/python3.11/site-packages/phi/llm/openai/chat.py", line 436, in response_stream
    for response in self.invoke_stream(messages=messages):
File "/usr/local/lib/python3.11/site-packages/phi/llm/openai/chat.py", line 174, in invoke_stream
    yield from self.client.chat.completions.create(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/openai/_utils/_utils.py", line 277, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/openai/resources/chat/completions.py", line 579, in create
    return self._post(
           ^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/openai/_base_client.py", line 1240, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/openai/_base_client.py", line 921, in request
    return self._request(
           ^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/openai/_base_client.py", line 1020, in _request
    raise self._make_status_error_from_response(err.response) from None
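
As a side note, a quick way to check which model names the custom endpoint actually serves is to list them with the same client (assuming the endpoint implements the standard /models route):

import openai

client = openai.OpenAI(base_url=BASE_URL, api_key=API_KEY)

# Print the model ids exposed by the custom endpoint
for model in client.models.list():
    print(model.id)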

renkelvin avatar May 08 '24 17:05 renkelvin

Quoting my earlier comment:

Seems it's only used here:
https://github.com/phidatahq/phidata/blob/1302da0de264e3419a2c40883ed8e53de8ed794b/phi/assistant/assistant.py#L1227

btw, it seems num_chats is used there.

renkelvin avatar May 08 '24 17:05 renkelvin

Hi @ashpreetbedi, I found that the issue is on the API side. The platform we use doesn't support ["number", "null"] as a function-call parameter type. num_chats has a default value, so it's converted to ["number", "null"] as part of the get_chat_history tool definition. I also tested that OpenAI supports such a parameter type, so I think no change is needed on the phidata side. Thanks!
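
To illustrate the mechanism (a rough sketch, not phidata's actual code): a tool parameter with a None default is typically advertised to the model as accepting either a number or null, which many JSON Schema generators express as a two-element type list.

from typing import Optional

# Hypothetical signature; the real get_chat_history lives in phi/assistant/assistant.py
def get_chat_history(num_chats: Optional[int] = None) -> str:
    ...

# One common way schema generators encode Optional[int] in the tool definition:
num_chats_schema = {"type": ["number", "null"]}
# OpenAI accepts this form, but the custom backend requires "type" to be a single string.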

renkelvin avatar May 09 '24 22:05 renkelvin

@renkelvin you can turn this function off :) Currently you'll have read_chat_history=True, but if you remove that, this tool won't be added. Let me know if this works, and then we can create a custom tool for you to read the chat history, roughly along the lines sketched below.
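
A rough sketch of such a custom tool, assuming the Assistant accepts plain Python callables via its tools argument, that the import paths match the layout in the tracebacks above, and using the same BASE_URL/API_KEY placeholders; the storage behind it is hypothetical:

import json

from phi.assistant import Assistant
from phi.llm.openai.like import OpenAILike

# Hypothetical in-app store of past chat messages; replace with your own storage
chat_log: list[dict] = []

def get_recent_chats(num_chats: int) -> str:
    """Return the last num_chats messages as a JSON string."""
    return json.dumps(chat_log[-num_chats:])

assistant = Assistant(
    llm=OpenAILike(model="gpt-3.5-turbo", api_key=API_KEY, base_url=BASE_URL),
    read_chat_history=False,    # keep the built-in get_chat_history tool out of the request
    tools=[get_recent_chats],   # num_chats has no None default, so its schema type stays a single string
)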

ashpreetbedi avatar May 09 '24 22:05 ashpreetbedi

Thanks @ashpreetbedi, setting use_tools=False solves the issue for me. I noticed that read_chat_history is overridden here (https://github.com/phidatahq/phidata/blob/0278000a9196e7e6771da22144aa9a2f3b5036ea/phi/assistant/assistant.py#L279) if use_tools is set to True. I wonder if that's intended?
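
For anyone else hitting this against a strict OpenAI-compatible backend, the workaround boils down to something like (same BASE_URL/API_KEY placeholders as above):

assistant = Assistant(
    llm=OpenAILike(model="gpt-3.5-turbo", api_key=API_KEY, base_url=BASE_URL),
    use_tools=False,  # keeps the built-in get_chat_history tool and its ["number", "null"] schema out of the request
)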

renkelvin avatar May 10 '24 17:05 renkelvin