
[Bug]: local.ERROR: 'json_object' is not of type 'object' - 'response_format' {"exception":"[object] (OpenAI\Exceptions\ErrorException(code: 0): 'json_object' is not of type 'object' - 'response_format' at /app/vendor/openai-php/client/src/Transporters/HttpTransporter.php:131)

aaftre opened this issue 10 months ago • 6 comments

Description

local.ERROR: 'json_object' is not of type 'object' - 'response_format' {"exception":"[object] (OpenAI\Exceptions\ErrorException(code: 0): 'json_object' is not of type 'object' - 'response_format' at /app/vendor/openai-php/client/src/Transporters/HttpTransporter.php:131)

Steps To Reproduce

Works after commenting out the response_format line:

            $response = $openai_client->chat()->create([
                'model' => "gpt-3.5-turbo-0125",
                'response_format' => 'json_object',
                'messages' => [
                    [
                        'role' => 'system',
                        'content' => $system_content
                    ],
                    [
                        'role' => 'user', 
                        'content' => $user_content
                    ]
                ],
            ]);
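Presumably the string is the problem: the error says 'json_object' is not of type 'object', so passing response_format as an array (mirroring the Python call below) would be the likely fix; a sketch, not re-tested against 0.8.4:

    'response_format' => ['type' => 'json_object'], // array, not the bare string 'json_object'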

The same setup works in Python:

model="gpt-3.5-turbo-0125",
response_format={ "type": "json_object" },

OpenAI PHP Client Version

0.8.4

PHP Version

PHP 8.3.4

Notes

No response

aaftre • Apr 06 '24

Just FYI, json_object is not supported for the gpt-4 model anyway:

Error code: 400 - {'error': {'message': "Invalid parameter: 'response_format' of type 'json_object' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'response_format', 'code': None}}

    response = client.chat.completions.create(
        messages = [
            {
                "role": "user",
                "content": prompt,
            }
        ],
        model="gpt-4",
        response_format = {
            "type": "json_object",
        }
    )

horaceho • Apr 09 '24

For the gpt-4 model, the following prompt produces JSON output:

    response = client.chat.completions.create(
        messages = [
            {
                "role": "user",
                "content": "What are the first five letters in English? Answer in JSON format",
            }
        ],
        model="gpt-4",
    )
% python test.py
What are the first five letters in English? Answer in JSON format
{
"1": "A",
"2": "B",
"3": "C",
"4": "D",
"5": "E"
}

horaceho • Apr 09 '24

Just FYI, json_object is not supported for the gpt-4 model anyway:

This is not true:

To prevent these errors and improve model performance, when calling gpt-4-turbo-preview or gpt-3.5-turbo-0125, you can set response_format to { "type": "json_object" } to enable JSON mode. When JSON mode is enabled, the model is constrained to only generate strings that parse into valid JSON object. https://platform.openai.com/docs/guides/text-generation/json-mode

RHosono • Apr 13 '24

@RHosono Did you run the code with different model(s), or did you just quote the doc? The error message was returned from the OpenAI API. The model used was model="gpt-4".

horaceho • Apr 15 '24

Just FYI, json_object is not supported for the gpt-4 model anyway:

Error code: 400 - {'error': {'message': "Invalid parameter: 'response_format' of type 'json_object' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'response_format', 'code': None}}

    response = client.chat.completions.create(
        messages = [
            {
                "role": "user",
                "content": prompt,
            }
        ],
        model="gpt-4",
        response_format = {
            "type": "json_object",
        }
    )

Not sure why gpt-4 is being mentioned; the model used in the original post was gpt-3.5-turbo-0125. This is the exact model used in the OpenAI documentation: https://platform.openai.com/docs/guides/text-generation/json-mode

This works just by using a Guzzle HTTP client:

$client = new \GuzzleHttp\Client();

$response = $client->request('POST', 'https://api.openai.com/v1/chat/completions', [
    'headers' => [
        'Content-Type' => 'application/json',
        'Authorization' => 'Bearer ' . getenv('OPENAI_API_KEY'),
    ],
    'json' => [
        'messages' => [
            ['role' => 'system', 'content' => $system_content],
            ['role' => 'user', 'content' => $address_json],
        ],
        'model' => 'gpt-3.5-turbo',
        'response_format' => ['type' => 'json_object'],
    ],
]);
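To pull the model's JSON out of that response, the body can be decoded in two steps; a sketch, assuming the standard chat completions response shape:

    // Decode the HTTP response body, then the JSON string produced by the model
    $payload = json_decode((string) $response->getBody(), true);
    $content = $payload['choices'][0]['message']['content'];
    $data    = json_decode($content, true); // the model's JSON object as a PHP array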

aaftre • Apr 15 '24

@RHosono Did you run the code with different model(s), or did you just quote the doc? The error message was returned from the OpenAI API. The model used was model="gpt-4".

So you mean the very old gpt-4 model; yes, that is correct. But both the new gpt-4-turbo and the older gpt-4-turbo-preview support it. I do not think anyone is using the more expensive and older gpt-4 model. I was just making a general point, not one about the code.

Because you said:

"json_object is not supported for gtp-4 model anyway:"

RHosono • Apr 17 '24

Hi @aaftre

This does not look like an error related to the package.

If you do not agree, feel free to reopen the issue.

gehrisandro • May 28 '24