
Support logprobs for ChatCompletions

l-eugine opened this issue 10 months ago · 0 comments

Here is the description of the feature from OpenAI: https://cookbook.openai.com/examples/using_logprobs

Basically, it requires adding two additional parameters to ChatCompletionCreateRequest:

    logprobs:
        Whether to return log probabilities of the output tokens or not.
        If true, returns the log probabilities of each output token returned in the content of message.
        This option is currently not available on the gpt-4-vision-preview model.
    top_logprobs:
        An integer between 0 and 5 specifying the number of most likely tokens to return at each token position,
        each with an associated log probability. logprobs must be set to true if this parameter is used.

This part can be done with the existing library by deriving from ChatCompletionCreateRequest and adding these parameters:

    public class ChatCompletionCreateRequest2 : ChatCompletionCreateRequest
    {
        /// <summary>
        ///     Whether to return log probabilities of the output tokens or not.
        /// If true, returns the log probabilities of each output token returned in the content of message.
        /// This option is currently not available on the gpt-4-vision-preview model.
        /// </summary>
        [JsonPropertyName("logprobs")]
        public bool? LogProbs { get; set; }


        /// <summary>
        /// An integer between 0 and 5 specifying the number of most likely tokens to return at each token position, each with an associated log probability.
        /// logprobs must be set to true if this parameter is used.
        /// </summary>
        [JsonPropertyName("top_logprobs")]
        public int? TopLogprobs { get; set; }
    }
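
For what it's worth, a quick way to sanity-check that the derived request carries the new fields is to serialize it with System.Text.Json. This is only a sketch; the namespaces and the ChatMessage.FromUser helper are assumed from the library version I have locally and may differ:

    // Illustrative only: confirms the extra fields end up in the request JSON.
    using System;
    using System.Collections.Generic;
    using System.Text.Json;
    using OpenAI.ObjectModels.RequestModels;

    var request = new ChatCompletionCreateRequest2
    {
        Model = "gpt-4",
        Messages = new List<ChatMessage> { ChatMessage.FromUser("Hello") },
        LogProbs = true,
        TopLogprobs = 2
    };

    // Serializing the derived type emits the inherited properties plus
    // "logprobs": true and "top_logprobs": 2.
    Console.WriteLine(JsonSerializer.Serialize(request));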

But I could not find an easy way to extend the parsing of the response. It is hardcoded to parse the server response into ChatCompletionCreateResponse, which in turn decodes ChatChoiceResponse, which doesn't have a property to capture this part of the server response:

  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "NavigateParent()"
      },
      "logprobs": {
        "content": [
          {
            "token": "Navigate",
            "logprob": -6.704273e-7,
            "bytes": [
              78,
              97,
              118,
              105,
              103,
              97,
              116,
              101
            ],
            "top_logprobs": []
          },

So, the suggestion is to add the two optional parameters mentioned above to ChatCompletionCreateRequest, and to add a LogProbs property to ChatChoiceResponse (the same way it is declared in ChoiceResponse):

    [JsonPropertyName("logprobs")]
    public LogProbsResponse LogProbs { get; set; }
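
For completeness, the chat-style logprobs payload shown above would need a model shaped roughly like the one below if the existing LogProbsResponse (written for the completions endpoint) doesn't match it. All names here are illustrative, not taken from the library:

    using System.Collections.Generic;
    using System.Text.Json.Serialization;

    // Sketch of a model matching the chat "logprobs" object shown above.
    public class ChatLogProbsResponse
    {
        [JsonPropertyName("content")]
        public List<ChatLogProbContent> Content { get; set; }
    }

    public class ChatLogProbContent
    {
        [JsonPropertyName("token")]
        public string Token { get; set; }

        [JsonPropertyName("logprob")]
        public double LogProb { get; set; }

        [JsonPropertyName("bytes")]
        public List<int> Bytes { get; set; }

        // Each entry repeats token/logprob/bytes, so the same class is reused here.
        [JsonPropertyName("top_logprobs")]
        public List<ChatLogProbContent> TopLogprobs { get; set; }
    }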

l-eugine · Apr 15 '24 09:04