
.Net Bug: In Blazor - Unhandled Exception: System.NotSupportedException: Synchronous reads are not supported, use ReadAsync instead.

Open AshD opened this issue 1 year ago • 2 comments

Describe the bug

In .NET 8 this works fine, but under Blazor, Microsoft.SemanticKernel.Connectors.OpenAI.ClientCore.GetStreamingChatMessageContentsAsync throws an exception.

Unhandled Exception: System.NotSupportedException: Synchronous reads are not supported, use ReadAsync instead.
   at System.Net.Http.WasmHttpReadStream.Read(Byte[] buffer, Int32 offset, Int32 count)
   at System.IO.DelegatingStream.Read(Byte[] buffer, Int32 offset, Int32 count)
   at Azure.Core.Pipeline.ReadTimeoutStream.Read(Byte[] buffer, Int32 offset, Int32 count)
   at System.IO.Stream.CopyTo(Stream destination, Int32 bufferSize)
   at System.IO.Stream.CopyTo(Stream destination)
   at Azure.RequestFailedException.BufferResponseIfNeeded(Response response)
   at Azure.RequestFailedException.GetRequestFailedExceptionContent(Response response, RequestFailedDetailsParser parser)
   at Azure.RequestFailedException..ctor(Response response, Exception innerException, RequestFailedDetailsParser detailsParser)
   at Azure.RequestFailedException..ctor(Response response, Exception innerException)
   at Azure.RequestFailedException..ctor(Response response)
   at Azure.Core.HttpPipelineExtensions.ProcessMessageAsync(HttpPipeline pipeline, HttpMessage message, RequestContext requestContext, CancellationToken cancellationToken)
   at Azure.AI.OpenAI.OpenAIClient.GetChatCompletionsStreamingAsync(ChatCompletionsOptions chatCompletionsOptions, CancellationToken cancellationToken)
   at Microsoft.SemanticKernel.Connectors.OpenAI.ClientCore.<RunRequestAsync>d__55`1[[Azure.AI.OpenAI.StreamingResponse`1[[Azure.AI.OpenAI.StreamingChatCompletionsUpdate, Azure.AI.OpenAI, Version=1.0.0.0, Culture=neutral, PublicKeyToken=92742159e12e44c8]], Azure.AI.OpenAI, Version=1.0.0.0, Culture=neutral, PublicKeyToken=92742159e12e44c8]].MoveNext()
   at Microsoft.SemanticKernel.Connectors.OpenAI.ClientCore.GetStreamingChatMessageContentsAsync(ChatHistory chat, PromptExecutionSettings executionSettings, Kernel kernel, CancellationToken cancellationToken)+MoveNext()
   at Microsoft.SemanticKernel.Connectors.OpenAI.ClientCore.GetStreamingChatMessageContentsAsync(ChatHistory chat, PromptExecutionSettings executionSettings, Kernel kernel, CancellationToken cancellationToken)+System.Threading.Tasks.Sources.IValueTaskSource<System.Boolean>.GetResult()

To Reproduce

Call Microsoft.SemanticKernel.Connectors.OpenAI.ClientCore.GetStreamingChatMessageContentsAsync while running as a Blazor WASM app.
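A minimal sketch of the kind of call involved (the model name, key, and messages are placeholders, not taken from the failing app):

var chatCompletionService = new OpenAIChatCompletionService("gpt-4o", "<api key>");

var chatHistory = new ChatHistory();
chatHistory.AddSystemMessage("You are a helpful assistant.");
chatHistory.AddUserMessage("Hello, how are you?");

// Under WASM the exception surfaces while enumerating the streaming response.
await foreach (var update in chatCompletionService.GetStreamingChatMessageContentsAsync(chatHistory))
{
    Console.Write(update.Content);
}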

Expected behavior

Works like it does in regular .NET 8.

Platform

  • OS: Windows 11 Version 10.0.22631 Build 22631
  • IDE: Visual Studio 17.10.3
  • Language: C#, Blazor WASM app
  • Source: Microsoft.SemanticKernel Nuget 1.15.0

Additional context

I passed it a new HttpClient like this:

var httpClient = new HttpClient { BaseAddress = new Uri(builder.HostEnvironment.BaseAddress) };

AshD · Jun 27 '24 21:06

I am creating the chat completion service for OpenAI like this. It works in a Windows WPF app but throws an exception in a Blazor WASM app.

var httpClient = new HttpClient { BaseAddress = new Uri(builder.HostEnvironment.BaseAddress) };
chatCompletionService = new OpenAIChatCompletionService(AIService.ModelName, AIService.Key, null, httpClient);

AshD · Jun 27 '24 21:06

Some more info: the Azure.AI.OpenAI 1.0.0-beta.17 package is not sending the correct JSON to the OpenAI endpoint under Blazor. The content field is missing from the messages it sends.

{"messages":[{"role":"system"},{"role":"user"}],"max_tokens":30000,"temperature":0,"top_p":1,"n":1,"stop":["user:","User:","Question:","ZZ"],"presence_penalty":0,"frequency_penalty":0,"stream":true,"model":"gpt-4o"}

AshD · Jun 28 '24 15:06

I'm seeing exactly the same issue using WASM / AvaloniaUI. The desktop build works; the WASM build is missing message content.

pkellyuk · Aug 27 '24 16:08

Same here. Let us know when the fix will be released.

SH2015 · Sep 10 '24 02:09

I tried to reproduce this error in the latest version of Semantic Kernel (1.33) and was not able to reproduce the problem above.

Here's what I have done:

  1. Created a new Blazor WebAssembly Standalone App for .NET 8.
  2. Installed the Microsoft.SemanticKernel package.
  3. Added the Microsoft.SemanticKernel and Microsoft.SemanticKernel.ChatCompletion namespaces to _Imports.razor.
  4. Changed the Counter.razor implementation to the one provided below.
  5. Ran the application and got the expected chat response from the LLM.

Counter.razor

@page "/counter"
@using Microsoft.SemanticKernel.Connectors.OpenAI

<PageTitle>Counter</PageTitle>

<h1>Counter</h1>

<button class="btn btn-primary" @onclick="GetLLMResponseAsync">Ask LLM</button>

<p role="status">LLM Response: @llmResponse</p>

@code {
    private string llmResponse = string.Empty;

    private async Task GetLLMResponseAsync()
    {
        var modelId = "gpt-4o";
        var apiKey = "<redacted>";

        var chatCompletion = new OpenAIChatCompletionService(modelId, apiKey);
        var response = await chatCompletion.GetChatMessageContentAsync("Hello, how are you?");

        llmResponse = response.ToString();
    }
}
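Note that the original report is against the streaming API rather than GetChatMessageContentAsync. A streaming variant of the same handler, which could be added inside the @code block above, would look roughly like this (a sketch on the same assumptions, not part of the repro steps):

    private async Task GetStreamingLLMResponseAsync()
    {
        var chatCompletion = new OpenAIChatCompletionService("gpt-4o", "<redacted>");

        var chatHistory = new ChatHistory();
        chatHistory.AddUserMessage("Hello, how are you?");

        llmResponse = string.Empty;
        // Append each streamed chunk to the bound field as it arrives.
        await foreach (var update in chatCompletion.GetStreamingChatMessageContentsAsync(chatHistory))
        {
            llmResponse += update.Content;
            StateHasChanged();
        }
    }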

For AzureOpenAI the result was successful as well:

@page "/counter"
@using Microsoft.SemanticKernel.Connectors.AzureOpenAI

<PageTitle>Counter</PageTitle>

<h1>Counter</h1>

<button class="btn btn-primary" @onclick="GetLLMResponseAsync">Ask LLM</button>

<p role="status">LLM Response: @llmResponse</p>

@code {
    private string llmResponse = string.Empty;

    private async Task GetLLMResponseAsync()
    {
        var modelId = "gpt-4o";
        var apiKey = "<redacted>";
        var endpoint = "<redacted>";
        var chatCompletion = new AzureOpenAIChatCompletionService(modelId, endpoint, apiKey);
        var response = await chatCompletion.GetChatMessageContentAsync("Hello, how are you?");

        llmResponse = response.ToString();
    }
}

rogerbarreto · Jan 16 '25 10:01