
OpenAI max_tokens is deprecated

Open cescoffier opened this issue 5 months ago • 3 comments

max_tokens (Deprecated):  The maximum number of [tokens](https://platform.openai.com/tokenizer) that can be generated in the chat completion. This value can be used to control [costs](https://openai.com/api/pricing/) for text generated via API.

This value is now deprecated in favor of max_completion_tokens, and is not compatible with [o1 series models](https://platform.openai.com/docs/guides/reasoning).

We would need to switch to the new parameter, but only when the provider is actually OpenAI, since other OpenAI-compatible providers may not support max_completion_tokens yet. A rough sketch of what that conditional switch could look like is shown below.
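
The following is a minimal, illustrative sketch (not the extension's actual code) of choosing between the two request fields. The endpoint URL and the max_tokens / max_completion_tokens field names come from the OpenAI API documentation; the model name, token limit, and the providerIsOpenAi flag are purely hypothetical placeholders.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class MaxCompletionTokensSketch {

    public static void main(String[] args) throws Exception {
        // Assumption: in the real extension this would be derived from configuration.
        boolean providerIsOpenAi = true;

        // Use the new parameter for OpenAI; keep max_tokens for other
        // OpenAI-compatible providers that may not accept it yet.
        String limitField = providerIsOpenAi ? "max_completion_tokens" : "max_tokens";

        String body = """
                {
                  "model": "gpt-4o-mini",
                  "messages": [{"role": "user", "content": "Hello"}],
                  "%s": 128
                }
                """.formatted(limitField);

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://api.openai.com/v1/chat/completions"))
                .header("Content-Type", "application/json")
                .header("Authorization", "Bearer " + System.getenv("OPENAI_API_KEY"))
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}
```

In practice the decision would live in the OpenAI-specific request builder rather than at the call site, so that non-OpenAI providers keep sending max_tokens unchanged.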

cescoffier · Sep 16 '24 07:09