
Why doesn't the OpenAI API throw an error when we pass a value to `frequency_penalty` outside its valid range?

Open nbro10 opened this issue 1 year ago • 1 comments

Describe the bug

The documentation says

Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim.

However, I can pass values like 2.3 to this parameter and the API doesn't raise anything, so either this is a bug in the API or the documentation is wrong. Note that this doesn't happen with the `presence_penalty` parameter: setting it to e.g. 2.2 makes the API return the error `openai.error.InvalidRequestError: 2.2 is greater than the maximum of 2 - 'presence_penalty'`. Even if I set `frequency_penalty=10000`, the API doesn't err, and the same holds for large negative values like -1000. So maybe this parameter is supposed to accept any floating-point number and the documentation is wrong? Or maybe the API is buggy?
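Until the API enforces the documented range server-side, a caller can guard the value before sending the request. A minimal sketch (the `check_penalty` helper is hypothetical, not part of the `openai` library):

```python
def check_penalty(name: str, value: float) -> None:
    """Reject penalty values outside the documented [-2.0, 2.0] range."""
    if not -2.0 <= value <= 2.0:
        raise ValueError(f"{name} must be between -2.0 and 2.0, got {value}")

check_penalty("frequency_penalty", 1.5)   # within range: no error
# check_penalty("frequency_penalty", 2.3) # would raise ValueError
```

A check like this mirrors what the API already does for `presence_penalty`, so both parameters fail the same way on bad input.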

To Reproduce

Make any call to the completions endpoint with valid values for the other parameters and an out-of-range `frequency_penalty`; the error you are supposed to get is never raised.

Code snippets

import os

import openai

# Read the API key from the environment rather than hard-coding it
openai.api_key = os.environ["OPENAI_API_KEY"]

# frequency_penalty is far outside the documented [-2.0, 2.0] range,
# yet the request succeeds without an error
completions = openai.Completion.create(model="text-davinci-003",
                                       prompt="hello",
                                       frequency_penalty=1000)
print(completions)

OS

macos monterey (12.5.1)

Python version

Python 3.8.13

Library version

0.27.0

nbro10 avatar Mar 08 '23 14:03 nbro10

Will investigate later this week.

logankilpatrick avatar Mar 13 '23 16:03 logankilpatrick

Thanks for reporting!

This sounds like an issue with the underlying OpenAI API and not the Python library, so I'm going to go ahead and close this issue.

If this is still an issue, would you mind reposting at community.openai.com?

rattrayalex avatar Dec 31 '23 00:12 rattrayalex

@rattrayalex OK, but this should be fixed. At the very least, the package's documentation should warn users about the actual behaviour, or it should be handled at the library level. Ignoring this issue, like you're doing, is just reckless. After almost a year, this is the best you could do: ignore it.

nbro10 avatar Jan 03 '24 08:01 nbro10

Thank you for your feedback. The appropriate place to fix this is in the API, and I've bumped this to the API team for you. They do not monitor this repo.

rattrayalex avatar Jan 03 '24 13:01 rattrayalex

@rattrayalex Thanks. It would be great to get an update once this gets fixed.

nbro10 avatar Jan 03 '24 13:01 nbro10

Working on this!

logankilpatrick avatar Jan 04 '24 16:01 logankilpatrick

This issue was fixed (by @logankilpatrick) in January. Marking the issue as completed for posterity.

brianz-openai avatar Mar 29 '24 21:03 brianz-openai