Support for Multi-Model Detailed Response (Max Tokens, Prompt Tokens, etc.)
Motivation and Context
Addresses a common feature request and a missing capability: acquiring the actual result details from the models we integrate with.
Resolves #802
Resolves #618
Partially resolves #351
Provides an alternative to #693 for getting the results
Description
This change introduces a new property, `SKContext.LastPromptResults` (typed as `object?`), which contains a list of the result details for each prompt.
The Connectors namespace also provides extension methods that convert this object into the concrete class you can use to serialize, or to read, the detailed information about the model result.
Normal suggested usage:
var textResult = await excuseFunction.InvokeAsync("I missed the F1 final race");
Getting model result as json:
var modelResultJson = JsonSerializer.Serialize(textResult.LastPromptResults);
Getting model result as a traversable object (using a connector extension):
var totalTokens = textResult.GetOpenAILastPromptResult()?.Usage.TotalTokens;
Contribution Checklist
- [ ] The code builds cleanly without any errors or warnings
- [ ] The PR follows SK Contribution Guidelines (https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md)
- [ ] The code follows the .NET coding conventions (https://learn.microsoft.com/dotnet/csharp/fundamentals/coding-style/coding-conventions) verified with `dotnet format`
- [ ] All unit tests pass, and I have added new tests where possible
- [ ] I didn't break anyone :smile:
var modelResultJson = JsonSerializer.Serialize(textResult.LastPromptResults);
Can we just offer this as something like textResult.LastPromptResults.AsJson()? Or is there a good reason to have you bring your own serializer?
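If we did want that convenience, it could be a thin extension over System.Text.Json, along these lines (a hypothetical sketch, not part of this PR; the `AsJson` name comes from the comment above):

```csharp
using System.Text.Json;

public static class ModelResultExtensions
{
    // Hypothetical helper: serializes the detailed results object to JSON
    // with default options, so callers don't have to bring their own serializer.
    public static string AsJson(this object? results) =>
        JsonSerializer.Serialize(results);
}
```

Callers would then write `textResult.LastPromptResults.AsJson()`; the trade-off is that a bring-your-own-serializer approach lets callers control `JsonSerializerOptions` (naming policy, indentation) without us adding overloads.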
Good to see this! I just took a quick glance, but I'll spend some time trying it out tomorrow.