
Null message on StreamChatEnumerableAsync

Open · fgilde opened this issue on Mar 17, 2023 · 4 comments

Hi, I use your package (v1.6.0) and have an issue. I hope you or someone else can help me.

I have this request, for example:

        var chatRequest = new ChatRequest()
        {
            Model = Model.ChatGPTTurbo,
            Temperature = 0.1,
            MaxTokens = 50,
            Messages = new[]
            {
                new ChatMessage(ChatMessageRole.User, "How to play chess?")
            }
        };

And if I then use

        var response = await Api.Chat.CreateChatCompletionAsync(chatRequest);
        foreach (var c in response.Choices)
        {
            Console.WriteLine(c.Message.Content);
        }

then it works in general: I get a result and everything is fine.

But if I use the StreamChatEnumerableAsync method instead

        await foreach (var result in Api.Chat.StreamChatEnumerableAsync(chatRequest))
        {
            Console.Write(result);
        }

then I still get a ChatResult, but all Message contents are null. Hope somebody can help me.

fgilde · Mar 17, 2023

Api.Chat.StreamCompletionAsync and Api.Chat.StreamChatAsync exhibit identical behavior. It appears ChatResult.Choices[x].Message is null and not being returned.

BTW, it would be great if we could stream Conversation responses.

Thank you for such an awesome library BTW!

Tronald · Mar 21, 2023

Same problem here. Running this code:

    var api = new OpenAI_API.OpenAIAPI("key");

    ChatRequest chatreq = new ChatRequest();
    chatreq.Messages = new List<ChatMessage>();
    chatreq.Messages.Add(new ChatMessage { Role = ChatMessageRole.System, Content = "sys msg" });
    chatreq.Messages.Add(new ChatMessage { Role = ChatMessageRole.User, Content = "instruction" });

    await foreach (var token in api.Chat.StreamChatEnumerableAsync(chatreq))
    {
        if (token.Choices[0].Message != null)
        {
            // stuff
        }
    }

but token.Choices[0].Message is always null.

Jenscaasen · Mar 21, 2023

From another issue (https://github.com/OkGoDoIt/OpenAI-API-dotnet/issues/80#issuecomment-1469465862), the following can be used instead:

    await foreach (var token in api.Chat.StreamChatEnumerableAsync(chatreq))
    {
        if (token?.Choices[0]?.Delta?.Content != null)
        {
            response += token.Choices[0].Delta.Content;
        }
    }
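
Putting that together, here is a self-contained sketch of the workaround (based on the v1.6.0 API shown above; the prompt text and variable names are just placeholders). The key point is that while streaming, Message stays null and the text arrives incrementally in Choices[0].Delta.Content:

    using System;
    using System.Text;
    using OpenAI_API;
    using OpenAI_API.Chat;
    using OpenAI_API.Models;

    var api = new OpenAIAPI("key");

    var chatRequest = new ChatRequest()
    {
        Model = Model.ChatGPTTurbo,
        Temperature = 0.1,
        MaxTokens = 50,
        Messages = new[]
        {
            new ChatMessage(ChatMessageRole.User, "How to play chess?")
        }
    };

    // While streaming, each token.Choices[0].Message is null;
    // the incremental text is in Delta.Content instead.
    var response = new StringBuilder();
    await foreach (var token in api.Chat.StreamChatEnumerableAsync(chatRequest))
    {
        var delta = token?.Choices?[0]?.Delta?.Content;
        if (delta != null)
        {
            Console.Write(delta);
            response.Append(delta);
        }
    }

The accumulated string in response should match what Message.Content contains in the non-streaming call.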

eptevember · Mar 27, 2023

@eptevember This works well, thank you!

If you don't use streaming, you can get token usage for the last response with:

    ChatResult responseInfo = chat.MostResentAPIResult;
    Debug.WriteLine(responseInfo.Usage.CompletionTokens);
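
For context, chat in that snippet is a Conversation object. A minimal sketch of the surrounding flow (assuming the library's usual Conversation helpers; the MostResentAPIResult property name is copied from the snippet above):

    using System;
    using OpenAI_API;
    using OpenAI_API.Chat;

    var api = new OpenAIAPI("key");

    // The Conversation helper keeps the message history for you.
    var chat = api.Chat.CreateConversation();
    chat.AppendSystemMessage("You are a helpful assistant.");
    chat.AppendUserInput("How to play chess?");

    // Non-streaming call; the raw ChatResult of the last response stays available afterwards.
    string answer = await chat.GetResponseFromChatbotAsync();
    Console.WriteLine(answer);

    ChatResult responseInfo = chat.MostResentAPIResult;
    Console.WriteLine(responseInfo.Usage.CompletionTokens);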

I can't figure out how to get the same information with streaming. token.Usage is there but always null. Any ideas?

jonjonsson · Mar 27, 2023
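
Regarding the token-usage question above: since Usage is always null on the streamed results, one rough workaround is to count the streamed deltas, because each non-empty delta usually corresponds to roughly one token. This is only an approximation, and prompt tokens would still have to be estimated separately. A sketch, reusing the api and chatreq objects from the earlier snippets:

    // Approximation only: streamed results carry no Usage, but each non-empty
    // delta is typically about one token, so counting them estimates CompletionTokens.
    int approxCompletionTokens = 0;
    var streamed = new StringBuilder();

    await foreach (var token in api.Chat.StreamChatEnumerableAsync(chatreq))
    {
        var delta = token?.Choices?[0]?.Delta?.Content;
        if (string.IsNullOrEmpty(delta)) continue;

        streamed.Append(delta);
        approxCompletionTokens++;
    }

    Console.WriteLine($"~{approxCompletionTokens} completion tokens (approximate)");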