
[Bug]: $acceptedPredictionTokens must be of type int, null given

Open fgrueninger opened this issue 10 months ago • 4 comments

Description

Hi, I'm using v0.10.3 with Azure and with GPT-4o, all was working fine.

After switching to o1-mini, I'm getting this error:

```
PHP Fatal error: Uncaught TypeError: OpenAI\Responses\Chat\CreateResponseUsageCompletionTokensDetails::__construct(): Argument #3 ($acceptedPredictionTokens) must be of type int, null given, called in /var/www/ai.local/vendor/openai-php/client/src/Responses/Chat/CreateResponseUsageCompletionTokensDetails.php on line 21 and defined in /var/www/ai.local/vendor/openai-php/client/src/Responses/Chat/CreateResponseUsageCompletionTokensDetails.php:9
Stack trace:
#0 /var/www/ai.local/vendor/openai-php/client/src/Responses/Chat/CreateResponseUsageCompletionTokensDetails.php(21): OpenAI\Responses\Chat\CreateResponseUsageCompletionTokensDetails->__construct()
#1 /var/www/ai.local/vendor/openai-php/client/src/Responses/Chat/CreateResponseUsage.php(27): OpenAI\Responses\Chat\CreateResponseUsageCompletionTokensDetails::from()
#2 /var/www/ai.local/vendor/openai-php/client/src/Responses/Chat/CreateResponse.php(59): OpenAI\Responses\Chat\CreateResponseUsage::from()
#3 /var/www/ai.local/vendor/openai-php/client/src/Resources/Chat.php(35): OpenAI\Responses\Chat\CreateResponse::from()
#4 /var/www/ai.local/index.php(53): OpenAI\Resources\Chat->create()
#5 {main}
  thrown in /var/www/ai.local/vendor/openai-php/client/src/Responses/Chat/CreateResponseUsageCompletionTokensDetails.php on line 9, referer: https://ai.local/
```

Somebody knows what this could be caused by, or how I could work around this? Thank you :)

Steps To Reproduce

Try to use o1-mini on Azure, with any message.
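
For context, a minimal reproduction sketch, assuming the Azure factory setup from the client README; the resource name, deployment name, API key variable, and API version below are placeholders:

```php
<?php

require 'vendor/autoload.php';

// Azure OpenAI configuration as documented in the openai-php/client README.
// All endpoint values are placeholders for this sketch.
$client = OpenAI::factory()
    ->withBaseUri('my-resource.openai.azure.com/openai/deployments/o1-mini')
    ->withHttpHeader('api-key', getenv('AZURE_OPENAI_API_KEY'))
    ->withQueryParam('api-version', '2024-08-01-preview')
    ->make();

// Any message triggers the TypeError when the usage payload lacks
// completion_tokens_details.accepted_prediction_tokens.
$response = $client->chat()->create([
    'model' => 'o1-mini',
    'messages' => [
        ['role' => 'user', 'content' => 'Hello'],
    ],
]);

echo $response->choices[0]->message->content;
```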

OpenAI PHP Client Version

v0.10.3

PHP Version

8.2.26

Notes

No response

fgrueninger avatar Feb 12 '25 20:02 fgrueninger

I have the same issue. Maybe you should use createStreamed.

zhaozhenxiang avatar Feb 13 '25 03:02 zhaozhenxiang
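
For anyone trying that suggestion, here is a minimal sketch of the createStreamed workaround using the client's documented streaming API; whether o1-mini allows streaming on Azure, and whether this actually avoids the usage-parsing path, may depend on your deployment and client version:

```php
<?php

// Sketch of the createStreamed() workaround; $client is the Azure-configured
// client from the reproduction above. Content arrives as per-chunk deltas.
$stream = $client->chat()->createStreamed([
    'model' => 'o1-mini',
    'messages' => [
        ['role' => 'user', 'content' => 'Hello'],
    ],
]);

$text = '';
foreach ($stream as $chunk) {
    // delta->content can be null on role/finish chunks, so coalesce it.
    $text .= $chunk->choices[0]->delta->content ?? '';
}

echo $text;
```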

I think OpenAI modified the HTTP response structure. Before, the usage payload included accepted_prediction_tokens; now it does not.

zhaozhenxiang avatar Feb 13 '25 03:02 zhaozhenxiang
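
To make that claim concrete, a rough sketch of the two usage shapes as PHP arrays; the field names come from the error message and OpenAI's published usage object, the numbers are placeholders, and the exact Azure payload is exactly what is being asked for below:

```php
<?php

// Shape the client's completion_tokens_details parser expects
// (placeholder counts):
$usageWithPredictionCounters = [
    'completion_tokens_details' => [
        'reasoning_tokens' => 0,
        'accepted_prediction_tokens' => 0,
        'rejected_prediction_tokens' => 0,
    ],
];

// Shape Azure reportedly returns for o1-mini, which leaves
// $acceptedPredictionTokens null when the client constructs the value object:
$usageFromAzureO1Mini = [
    'completion_tokens_details' => [
        'reasoning_tokens' => 0,
    ],
];
```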

Same issue here.

weijunken avatar Apr 17 '25 06:04 weijunken

Does anyone have a raw payload from the service? There are a few proposed fixes, each attacking this in a different way, and I just want to see a raw sample so I can identify the source issue myself.

iBotPeaches avatar Apr 17 '25 15:04 iBotPeaches
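
If it helps, a raw sample can be captured without the client's response parsing in the way by hitting the Azure chat completions endpoint directly; a sketch with placeholder resource, deployment, key, and API version:

```php
<?php

// Dump the raw chat completions payload from Azure so the
// usage.completion_tokens_details shape can be inspected directly.
$url = 'https://my-resource.openai.azure.com/openai/deployments/o1-mini/'
     . 'chat/completions?api-version=2024-08-01-preview';

$ch = curl_init($url);
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POST           => true,
    CURLOPT_HTTPHEADER     => [
        'Content-Type: application/json',
        'api-key: ' . getenv('AZURE_OPENAI_API_KEY'),
    ],
    CURLOPT_POSTFIELDS     => json_encode([
        'messages' => [
            ['role' => 'user', 'content' => 'Hello'],
        ],
    ]),
]);

$body = curl_exec($ch);
curl_close($ch);

// Pretty-print the body and look at usage.completion_tokens_details.
echo json_encode(json_decode($body, true), JSON_PRETTY_PRINT), PHP_EOL;
```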

Looks like I forgot to close this after it was fixed in: https://github.com/openai-php/client/pull/560

iBotPeaches avatar May 27 '25 11:05 iBotPeaches
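
For reference, the kind of null-tolerant parsing that resolves this class of error looks roughly like the sketch below. It is illustrative only and not necessarily what https://github.com/openai-php/client/pull/560 actually changed; the real class in openai-php/client has more fields and a different constructor order.

```php
<?php

// Illustrative only: a simplified value object that tolerates a usage payload
// whose completion_tokens_details omits some counters (as Azure does for
// o1-mini). Missing keys become null instead of failing the int type check.
final class CompletionTokensDetails
{
    public function __construct(
        public readonly ?int $reasoningTokens,
        public readonly ?int $acceptedPredictionTokens,
        public readonly ?int $rejectedPredictionTokens,
    ) {
    }

    /** @param array<string, int> $attributes */
    public static function from(array $attributes): self
    {
        return new self(
            $attributes['reasoning_tokens'] ?? null,
            $attributes['accepted_prediction_tokens'] ?? null,
            $attributes['rejected_prediction_tokens'] ?? null,
        );
    }
}
```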