
[Bug] Can't use gpt-o1 because it requires MaxCompletionTokens

Open · piffy76 opened this issue 8 months ago · 0 comments

Context / Scenario

If I try to use Kernel Memory with gpt-o1, the text generation throws an exception.

HTTP 400 (invalid_request_error: unsupported_parameter) Parameter: max_tokens

Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.

It is known that o1 requires max_completion_tokens instead of max_tokens. AzureOpenAIPromptExecutionSettings includes MaxCompletionTokens, which I can use with Semantic Kernel ChatCompletion.
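
For comparison, here is a rough sketch of how the model can be called through Semantic Kernel chat completion directly, using AzureOpenAIPromptExecutionSettings with the MaxCompletionTokens property mentioned above (the deployment name, endpoint, and API key are placeholders):

    // Sketch (not the Kernel Memory code path): calling an o1 deployment through
    // Semantic Kernel chat completion with AzureOpenAIPromptExecutionSettings.
    using Microsoft.SemanticKernel;
    using Microsoft.SemanticKernel.ChatCompletion;
    using Microsoft.SemanticKernel.Connectors.AzureOpenAI;

    var kernel = Kernel.CreateBuilder()
        .AddAzureOpenAIChatCompletion(
            deploymentName: "o1",                               // placeholder
            endpoint: "https://my-resource.openai.azure.com/",  // placeholder
            apiKey: Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY")!)
        .Build();

    var chat = kernel.GetRequiredService<IChatCompletionService>();

    var settings = new AzureOpenAIPromptExecutionSettings
    {
        // Sent as max_completion_tokens, which o1 accepts; max_tokens is rejected.
        MaxCompletionTokens = 2048
        // Temperature is deliberately not set: o1 rejects it too.
    };

    var history = new ChatHistory();
    history.AddUserMessage("Summarize this document.");
    var reply = await chat.GetChatMessageContentAsync(history, settings, kernel);
    Console.WriteLine(reply.Content);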

The issue is that Kernel Memory uses OpenAIPromptExecutionSettings for text generation in OpenAITextGenerator.cs, line 132:

        // Current code: MaxTokens (sent as max_tokens) and Temperature are always set
        var skOptions = new OpenAIPromptExecutionSettings
        {
            MaxTokens = options.MaxTokens,
            Temperature = options.Temperature,
            FrequencyPenalty = options.FrequencyPenalty,
            PresencePenalty = options.PresencePenalty,
            TopP = options.NucleusSampling
        };

Also, o1 does not accept Temperature, so if it is set, the request throws an exception as well.

The code will need to be updated to use AzureOpenAIPromptExecutionSettings and to offer an option to set each property only when the target model supports it; see the sketch below.
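
A rough sketch of what the updated settings construction could look like; the UseMaxCompletionTokens and SetTemperature flags are purely hypothetical placeholders for whatever options Kernel Memory ends up exposing, and MaxCompletionTokens is used as described above:

    var skOptions = new AzureOpenAIPromptExecutionSettings
    {
        FrequencyPenalty = options.FrequencyPenalty,
        PresencePenalty = options.PresencePenalty,
        TopP = options.NucleusSampling
    };

    // Hypothetical option: send max_completion_tokens for o1-style models,
    // max_tokens for everything else.
    if (this._config.UseMaxCompletionTokens)
    {
        skOptions.MaxCompletionTokens = options.MaxTokens;
    }
    else
    {
        skOptions.MaxTokens = options.MaxTokens;
    }

    // Hypothetical option: only send temperature when the model accepts it.
    if (this._config.SetTemperature)
    {
        skOptions.Temperature = options.Temperature;
    }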

What happened?

I expect to be able to use gpt-o1 without the text generation throwing an exception.

Importance

I cannot use Kernel Memory

Platform, Language, Versions

C#, Kernel Memory 0.98.250324.1, Semantic Kernel, Visual Studio, .NET 8

Relevant log output

HTTP 400 (invalid_request_error: unsupported_parameter)
Parameter: max_tokens

Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.

piffy76 · Apr 30 '25 20:04