Important Error Message from Azure OpenAI service is ignored
Describe the bug In OpenAIClientAbstract.ExecutePostRequestAsync, when I send a large prompt, there is an error message in the response:
{ "error": { "message": "This model's maximum context length is 4097 tokens, however you requested 7326 tokens (3230 in your prompt; 4096 for the completion). Please reduce your prompt; or completion length.", "type": "invalid_request_error", "param": null, "code": null } }
In ExecutePostRequestAsync, you throw new AIException(AIException.ErrorCodes.InvalidRequest, $"The request is not valid, HTTP status: {response.StatusCode:G}"). The important information in error.message is lost; I can only see it by debugging into the code.
The error message should be propagated to the caller.
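A minimal sketch of what would help, assuming the two-argument AIException constructor quoted above (the helper name, namespace, and JSON parsing below are illustrative, not the actual Semantic Kernel code): read the response body before throwing, and include the service's error.message in the exception.

```csharp
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.SemanticKernel.AI; // assumed namespace; AIException is the type quoted in this report

internal static class ErrorHelper
{
    // Hypothetical helper: surface the service error body instead of only the HTTP status.
    public static async Task ThrowIfNotSuccessAsync(HttpResponseMessage response)
    {
        if (response.IsSuccessStatusCode) { return; }

        // The body may contain { "error": { "message": "...", ... } } as shown above.
        string body = await response.Content.ReadAsStringAsync();
        string detail = body;

        // Best-effort extraction of the "error.message" field.
        try
        {
            using var doc = JsonDocument.Parse(body);
            if (doc.RootElement.TryGetProperty("error", out var error) &&
                error.TryGetProperty("message", out var message))
            {
                detail = message.GetString() ?? body;
            }
        }
        catch (JsonException)
        {
            // Body was not JSON; fall back to the raw content.
        }

        throw new AIException(
            AIException.ErrorCodes.InvalidRequest,
            $"The request is not valid, HTTP status: {response.StatusCode:G}. Service error: {detail}");
    }
}
```

With something like this, the "maximum context length" message above would reach the caller directly instead of requiring a debugger.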
To Reproduce Steps to reproduce the behavior:
- Give a very large ASK when running the sample code.
Expected behavior The error message from the OpenAI service should be propagated to the caller; otherwise users will never be able to find the root cause.
Screenshots N/A
Desktop (please complete the following information):
- OS: Windows
- IDE: Visual Studio
- NuGet Package Version: 0.8.11.1-preview
Additional context N/A
This is a great callout! Thanks for bringing this up. We'll add a backlog item to get this fixed!
100%, I had a similar thought when I saw the issues in the PR to bring in sample 4. Error messages are great, but they are still too opaque.
@masonzhang, we are tracking this with the project team to add better error handling.