
Service Bus message actions throwing Grpc.Core.RpcException when sending multiple requests

Open · MathBouma opened this issue 1 year ago · 14 comments

Description

This occurs with a C# isolated function, using .NET 8 and the latest versions of all libraries. For some reason, when using message actions with batched Service Bus messages, some of them will fail with the following error: Call failed with gRPC error status. Status code: 'Unimplemented', Message: 'Service is unimplemented.'

This looks similar to Azure/azure-functions-dotnet-worker#1974; however, it only occurs after multiple requests have been made within one function execution.

My best guess is that some kind of limit is being hit, or connections/pooling are not being handled correctly.

Steps to reproduce

  • Have a function app that receives a large batch of messages (experienced with ~100).
  • Use message actions to dead-letter, abandon, or complete a large number of the messages.
  • Have multiple function executions occurring within a short time of each other.
    • A shortcut could be to fill Service Bus with a decent number of messages, then call abandon on each message with a large retry count configured.

Some of the requests will then fail with the gRPC error.

My specific scenario was a function which:

  • Receives a batch of messages.
  • Tries to bulk insert the messages into Cosmos.
  • Completes successful requests and abandons failed requests.
    • This is because Cosmos can hit rate limits and only allow a portion of the records to be created. Simply letting the function retry or complete everything would cause issues, as requests would either be duplicated or missed.

Happy to provide more info / source code as needed.
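
For reference, the pattern looks roughly like the sketch below. This is a minimal, hypothetical repro: the queue name, connection setting name, and the Cosmos call are placeholders, and it assumes a Worker.Extensions.ServiceBus version where IsBatched/AutoCompleteMessages and ServiceBusMessageActions are available.

using System;
using System.Threading;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;

public class BatchConsumer
{
    private readonly ILogger<BatchConsumer> _logger;

    public BatchConsumer(ILogger<BatchConsumer> logger) => _logger = logger;

    // Batched trigger with auto-complete disabled, so every message is settled
    // explicitly through ServiceBusMessageActions (the calls that throw the RpcException).
    [Function(nameof(BatchConsumer))]
    public async Task Run(
        [ServiceBusTrigger("my-queue", Connection = "ServiceBusConnection", IsBatched = true, AutoCompleteMessages = false)]
        ServiceBusReceivedMessage[] messages,
        ServiceBusMessageActions messageActions,
        CancellationToken cancellationToken)
    {
        foreach (var message in messages)
        {
            try
            {
                await InsertIntoCosmosAsync(message, cancellationToken); // placeholder for the bulk insert
                await messageActions.CompleteMessageAsync(message, cancellationToken);
            }
            catch (Exception ex) // e.g. Cosmos 429s
            {
                _logger.LogWarning(ex, "Insert failed, abandoning message {MessageId}", message.MessageId);
                await messageActions.AbandonMessageAsync(message, cancellationToken: cancellationToken);
            }
        }
    }

    // Stand-in for the real Cosmos bulk insert.
    private static Task InsertIntoCosmosAsync(ServiceBusReceivedMessage message, CancellationToken ct)
        => Task.CompletedTask;
}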

MathBouma - Mar 17 '24 21:03

Came looking for an answer on something very similar. We're getting bursts of RpcExceptions when we're processing a large number of concurrent messages. The error message isn't the same, but it's an RpcException on a call to CompleteMessageAsync. We're also on .NET 8, isolated functions, and the latest version of all libraries (5.17.0 of Worker.Extensions.ServiceBus).

The message we get, though, is: Grpc.Core.RpcException: Status(StatusCode="Unavailable", Detail="Error starting gRPC call. HttpRequestException: An error occurred while sending the request. IOException: The request was aborted. IOException: Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host.. SocketException: An existing connection was forcibly closed by the remote host.", DebugException="System.Net.Http.HttpRequestException: An error occurred while sending the request.")

kanetik - Mar 18 '24 19:03

Also seeing this in the logs of a batched Service Bus-triggered function that uses message actions to manually complete/abandon/dead-letter all messages. Our logging makes it difficult to track down where exactly the issue is cropping up.

.NET 8 isolated Service Bus-triggered batch Azure Function.

Partial stack trace:


Result: Failure
Exception: Grpc.Core.RpcException: Status(StatusCode="Unimplemented", Detail="Service is unimplemented.")

at Microsoft.Azure.Functions.Worker.FunctionsApplication.InvokeFunctionAsync(FunctionContext context) in D:\a\_work\1\s\src\DotNetWorker.Core\FunctionsApplication.cs:line 77
at Microsoft.Azure.Functions.Worker.Handlers.InvocationHandler.InvokeAsync(InvocationRequest request) in D:\a\_work\1\s\src\DotNetWorker.Grpc\Handlers\InvocationHandler.cs:line 88

zmerdev - Mar 27 '24 19:03

We have also seen this with low volumes of single messages. Running some tests yesterday (4/16), I saw 2 failures out of 20 test messages right after deploying a minor update in our development environment. Since then I've been unable to reproduce the exception intentionally after running several thousand messages through the function.

We just upgraded the function to .NET 8 isolated and are running the latest release versions of all libraries (screenshot of package versions omitted).

Example Full Stack Trace:

Exception while executing function: Functions.ServiceContracts Result: Failure
Exception: Grpc.Core.RpcException: Status(StatusCode="Unimplemented", Detail="Service is unimplemented.")
   at Microsoft.Azure.Functions.Worker.ServiceBusMessageActions.CompleteMessageAsync(ServiceBusReceivedMessage message, CancellationToken cancellationToken) in D:\a\_work\1\s\extensions\Worker.Extensions.ServiceBus\src\ServiceBusMessageActions.cs:line 78
   at Datahub.MessageBroker.Function.Consumers.MidmarkServiceBusConsumerBase.HandleMessageExceptionAsync(Exception ex, ServiceBusReceivedMessage message, ServiceBusMessageActions messageActions) in /home/vsts/work/1/s/self/Datahub.MessageBroker.Function/Consumers/MidmarkServiceBusConsumerBase.cs:line 182
   at Datahub.MessageBroker.Function.Consumers.MidmarkServiceBusConsumerBase.HandleIncomingServiceBusMessage(ServiceBusReceivedMessage message, ServiceBusMessageActions messageActions, String processName, String topicName, String subscriptionName) in /home/vsts/work/1/s/self/Datahub.MessageBroker.Function/Consumers/MidmarkServiceBusConsumerBase.cs:line 96
   at Datahub.MessageBroker.Function.Consumers.ASBConsumer.ServiceContractsFunctionAsync(ServiceBusReceivedMessage message, ServiceBusMessageActions messageActions) in /home/vsts/work/1/s/self/Datahub.MessageBroker.Function/Consumers/ASBConsumer.cs:line 34
   at Datahub.MessageBroker.Function.DirectFunctionExecutor.ExecuteAsync(FunctionContext context) in /home/vsts/work/1/s/self/Datahub.MessageBroker.Function/Microsoft.Azure.Functions.Worker.Sdk.Generators/Microsoft.Azure.Functions.Worker.Sdk.Generators.FunctionExecutorGenerator/GeneratedFunctionExecutor.g.cs:line 38
   at Microsoft.Azure.Functions.Worker.OutputBindings.OutputBindingsMiddleware.Invoke(FunctionContext context, FunctionExecutionDelegate next) in D:\a\_work\1\s\src\DotNetWorker.Core\OutputBindings\OutputBindingsMiddleware.cs:line 13
   at Microsoft.Azure.Functions.Worker.Extensions.Http.AspNetCore.FunctionsHttpProxyingMiddleware.Invoke(FunctionContext context, FunctionExecutionDelegate next) in D:\a\_work\1\s\extensions\Worker.Extensions.Http.AspNetCore\src\FunctionsMiddleware\FunctionsHttpProxyingMiddleware.cs:line 34
   at Microsoft.Azure.Functions.Worker.FunctionsApplication.InvokeFunctionAsync(FunctionContext context) in D:\a\_work\1\s\src\DotNetWorker.Core\FunctionsApplication.cs:line 77
   at Microsoft.Azure.Functions.Worker.Handlers.InvocationHandler.InvokeAsync(InvocationRequest request) in D:\a\_work\1\s\src\DotNetWorker.Grpc\Handlers\InvocationHandler.cs:line 88
Stack:    at Microsoft.Azure.Functions.Worker.ServiceBusMessageActions.CompleteMessageAsync(ServiceBusReceivedMessage message, CancellationToken cancellationToken) in D:\a\_work\1\s\extensions\Worker.Extensions.ServiceBus\src\ServiceBusMessageActions.cs:line 78
   at Datahub.MessageBroker.Function.Consumers.MidmarkServiceBusConsumerBase.HandleMessageExceptionAsync(Exception ex, ServiceBusReceivedMessage message, ServiceBusMessageActions messageActions) in /home/vsts/work/1/s/self/Datahub.MessageBroker.Function/Consumers/MidmarkServiceBusConsumerBase.cs:line 182
   at Datahub.MessageBroker.Function.Consumers.MidmarkServiceBusConsumerBase.HandleIncomingServiceBusMessage(ServiceBusReceivedMessage message, ServiceBusMessageActions messageActions, String processName, String topicName, String subscriptionName) in /home/vsts/work/1/s/self/Datahub.MessageBroker.Function/Consumers/MidmarkServiceBusConsumerBase.cs:line 96
   at Datahub.MessageBroker.Function.Consumers.ASBConsumer.ServiceContractsFunctionAsync(ServiceBusReceivedMessage message, ServiceBusMessageActions messageActions) in /home/vsts/work/1/s/self/Datahub.MessageBroker.Function/Consumers/ASBConsumer.cs:line 34
   at Datahub.MessageBroker.Function.DirectFunctionExecutor.ExecuteAsync(FunctionContext context) in /home/vsts/work/1/s/self/Datahub.MessageBroker.Function/Microsoft.Azure.Functions.Worker.Sdk.Generators/Microsoft.Azure.Functions.Worker.Sdk.Generators.FunctionExecutorGenerator/GeneratedFunctionExecutor.g.cs:line 38
   at Microsoft.Azure.Functions.Worker.OutputBindings.OutputBindingsMiddleware.Invoke(FunctionContext context, FunctionExecutionDelegate next) in D:\a\_work\1\s\src\DotNetWorker.Core\OutputBindings\OutputBindingsMiddleware.cs:line 13
   at Microsoft.Azure.Functions.Worker.Extensions.Http.AspNetCore.FunctionsHttpProxyingMiddleware.Invoke(FunctionContext context, FunctionExecutionDelegate next) in D:\a\_work\1\s\extensions\Worker.Extensions.Http.AspNetCore\src\FunctionsMiddleware\FunctionsHttpProxyingMiddleware.cs:line 34
   at Microsoft.Azure.Functions.Worker.FunctionsApplication.InvokeFunctionAsync(FunctionContext context) in D:\a\_work\1\s\src\DotNetWorker.Core\FunctionsApplication.cs:line 77
   at Microsoft.Azure.Functions.Worker.Handlers.InvocationHandler.InvokeAsync(InvocationRequest request) in D:\a\_work\1\s\src\DotNetWorker.Grpc\Handlers\InvocationHandler.cs:line 88

dmaples-midmark - Apr 17 '24 13:04

There seems to be a possible correlation between the RpcExceptions and our function initializing. Since we are on a Consumption plan, our function is often not running during periods of downtime. The following telemetry is a 6-hour snapshot from Azure showing RpcExceptions overlaid with Process CPU (screenshot omitted).

What is interesting here is that all of the exceptions overlap with periods where the function was not reporting Process CPU yet, likely because it was initializing. There may be some time variation between the reporting of these individual metrics that could invalidate this. However, I thought it might be significant enough to share here.

dmaples-midmark - Apr 18 '24 18:04

I'm experiencing the exact same behavior, and as if that wasn't enough, I also get a ton of ObjectDisposedExceptions.

These ObjectDisposedExceptions are apparently an issue that can't be fixed. I have spent a tremendous amount of time trying to understand the problem. Others have the same issue: https://github.com/Azure/azure-sdk-for-net/issues/19731

If I keep everything else the same but switch from the SB queue to a topic, everything works fine. What is going on?

It seems very unprofessional from you .NET guys that .NET 7.0 isolated support ends on May 10, 2024, and we are left with this major bug in .NET 8 isolated.

I can't imagine how much money is being wasted around the world due to this.

System.ObjectDisposedException: Cannot access a disposed object. Object name: 'System.Net.WebSockets.ClientWebSocket'.
   at System.Net.WebSockets.ClientWebSocket.get_ConnectedWebSocket()
   at System.Net.WebSockets.ClientWebSocket.SendAsync(ArraySegment`1 buffer, WebSocketMessageType messageType, Boolean endOfMessage, CancellationToken cancellationToken)
   at Microsoft.Azure.Amqp.Transport.WebSocketTransport.WriteAsync(TransportAsyncCallbackArgs args)
   at Microsoft.Azure.Amqp.AsyncIO.AsyncBufferWriter.Write(TransportAsyncCallbackArgs args)
--- End of stack trace from previous location ---
   at Microsoft.Azure.Amqp.AsyncResult.End[TAsyncResult](IAsyncResult result)
   at Microsoft.Azure.Amqp.Transport.AmqpTransportInitiator.<>c.<ConnectAsync>b__17_1(IAsyncResult r)
   at System.Threading.Tasks.TaskFactory`1.FromAsyncCoreLogic(IAsyncResult iar, Func`2 endFunction, Action`1 endAction, Task`1 promise, Boolean requiresSynchronization)
--- End of stack trace from previous location ---
   at Azure.Messaging.ServiceBus.Amqp.AmqpConnectionScope.CreateAndOpenConnectionAsync(Version amqpVersion, Uri serviceEndpoint, Uri connectionEndpoint, ServiceBusTransportType transportType, IWebProxy proxy, String scopeIdentifier, TimeSpan timeout)
   at Microsoft.Azure.Amqp.FaultTolerantAmqpObject`1.OnCreateAsync(TimeSpan timeout, CancellationToken cancellationToken)
   at Microsoft.Azure.Amqp.Singleton`1.GetOrCreateAsync(TimeSpan timeout, CancellationToken cancellationToken)
   at Microsoft.Azure.Amqp.Singleton`1.GetOrCreateAsync(TimeSpan timeout, CancellationToken cancellationToken)
   at Azure.Messaging.ServiceBus.Amqp.AmqpConnectionScope.OpenReceiverLinkAsync(String identifier, String entityPath, TimeSpan timeout, UInt32 prefetchCount, ServiceBusReceiveMode receiveMode, String sessionId, Boolean isSessionReceiver, CancellationToken cancellationToken)
   at Azure.Messaging.ServiceBus.Amqp.AmqpReceiver.OpenReceiverLinkAsync(TimeSpan timeout, UInt32 prefetchCount, ServiceBusReceiveMode receiveMode, String identifier, CancellationToken cancellationToken)

janus007 - Apr 26 '24 06:04

We are also experiencing the same issue when executing an Azure Function via a Service Bus trigger (.NET 8 isolated) when Cosmos returns a number of 429s.

Messages are not placed on the DLQ, but an RpcException is thrown.

russaram-bham - Apr 26 '24 07:04

I am also experiencing this issue. I have an Azure Function writing messages in an Azure Service Bus Queue and another Azure Function is triggered by those messages. The exception is thrown on both sides, but most often happens when executing: await messageActions.CompleteMessageAsync(message);

Both Functions are written in .NET 8 and run in an Azure Function App in isolated mode. Both functions connect to Azure Service Bus with AMQP over TCP. Both functions depend on these packages:

<FrameworkReference Include="Microsoft.AspNetCore.App" />
<PackageReference Include="Azure.Messaging.ServiceBus" Version="7.17.4" />
<PackageReference Include="Microsoft.Azure.Functions.Worker" Version="1.21.0" />
<PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.Http" Version="3.1.0" />
<PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.Http.AspNetCore" Version="1.2.1" />
<PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.ServiceBus" Version="5.17.0" />
<PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.Timer" Version="4.3.0" />
<PackageReference Include="Microsoft.Azure.Functions.Worker.Sdk" Version="1.17.1" />
<PackageReference Include="Microsoft.ApplicationInsights.WorkerService" Version="2.22.0" />
<PackageReference Include="Microsoft.Azure.Functions.Worker.ApplicationInsights" Version="1.2.0" />
<PackageReference Include="Microsoft.Extensions.Azure" Version="1.7.2" />

I do see it occasionally in Application Insights. What is the reason? What is the resolution?

Stack Trace:

Result: Failure
Exception: Grpc.Core.RpcException: Status(StatusCode="Unimplemented", Detail="Service is unimplemented.")
at Microsoft.Azure.Functions.Worker.ServiceBusMessageActions.CompleteMessageAsync(ServiceBusReceivedMessage message, CancellationToken cancellationToken) in D:\a\_work\1\s\extensions\Worker.Extensions.ServiceBus\src\ServiceBusMessageActions.cs:line 78
at *******.Run(ServiceBusReceivedMessage message, ServiceBusMessageActions messageActions) in /opt/atlassian/pipelines/agent/build/src/********/*******.cs:line 47
at *******.DirectFunctionExecutor.ExecuteAsync(FunctionContext context) in /opt/atlassian/pipelines/agent/build/src/*******/Microsoft.Azure.Functions.Worker.Sdk.Generators/Microsoft.Azure.Functions.Worker.Sdk.Generators.FunctionExecutorGenerator/GeneratedFunctionExecutor.g.cs:line 42
at Microsoft.Azure.Functions.Worker.OutputBindings.OutputBindingsMiddleware.Invoke(FunctionContext context, FunctionExecutionDelegate next) in D:\a\_work\1\s\src\DotNetWorker.Core\OutputBindings\OutputBindingsMiddleware.cs:line 13
at Microsoft.Azure.Functions.Worker.Extensions.Http.AspNetCore.FunctionsHttpProxyingMiddleware.Invoke(FunctionContext context, FunctionExecutionDelegate next) in D:\a\_work\1\s\extensions\Worker.Extensions.Http.AspNetCore\src\FunctionsMiddleware\FunctionsHttpProxyingMiddleware.cs:line 34
at Microsoft.Azure.Functions.Worker.FunctionsApplication.InvokeFunctionAsync(FunctionContext context) in D:\a\_work\1\s\src\DotNetWorker.Core\FunctionsApplication.cs:line 77
at Microsoft.Azure.Functions.Worker.Handlers.InvocationHandler.InvokeAsync(InvocationRequest request) in D:\a\_work\1\s\src\DotNetWorker.Grpc\Handlers\InvocationHandler.cs:line 88
Stack: at Microsoft.Azure.Functions.Worker.ServiceBusMessageActions.CompleteMessageAsync(ServiceBusReceivedMessage message, CancellationToken cancellationToken) in D:\a\_work\1\s\extensions\Worker.Extensions.ServiceBus\src\ServiceBusMessageActions.cs:line 78
at *******.Run(ServiceBusReceivedMessage message, ServiceBusMessageActions messageActions) in /opt/atlassian/pipelines/agent/build/src/*******/*******.cs:line 47
at *******.DirectFunctionExecutor.ExecuteAsync(FunctionContext context) in /opt/atlassian/pipelines/agent/build/src/*******/Microsoft.Azure.Functions.Worker.Sdk.Generators/Microsoft.Azure.Functions.Worker.Sdk.Generators.FunctionExecutorGenerator/GeneratedFunctionExecutor.g.cs:line 42
at Microsoft.Azure.Functions.Worker.OutputBindings.OutputBindingsMiddleware.Invoke(FunctionContext context, FunctionExecutionDelegate next) in D:\a\_work\1\s\src\DotNetWorker.Core\OutputBindings\OutputBindingsMiddleware.cs:line 13
at Microsoft.Azure.Functions.Worker.Extensions.Http.AspNetCore.FunctionsHttpProxyingMiddleware.Invoke(FunctionContext context, FunctionExecutionDelegate next) in D:\a\_work\1\s\extensions\Worker.Extensions.Http.AspNetCore\src\FunctionsMiddleware\FunctionsHttpProxyingMiddleware.cs:line 34
at Microsoft.Azure.Functions.Worker.FunctionsApplication.InvokeFunctionAsync(FunctionContext context) in D:\a\_work\1\s\src\DotNetWorker.Core\FunctionsApplication.cs:line 77
at Microsoft.Azure.Functions.Worker.Handlers.InvocationHandler.InvokeAsync(InvocationRequest request) in D:\a\_work\1\s\src\DotNetWorker.Grpc\Handlers\InvocationHandler.cs:line 88

nkalfov - Apr 26 '24 14:04

As others have mentioned, I've also been encountering this issue for a couple of weeks. Whether I'm using CompleteMessageAsync, DeadLetterMessageAsync, or letting the function auto-complete messages without using ServiceBusMessageActions, I'm encountering these RpcExceptions with status "Unimplemented" or "Unavailable".

Because my functions create side effects (database record updates, file creation, etc.), not being able to rely on message completion/dead-lettering after processing causes multiple issues. I'd appreciate any help 🙂.

Vixan - May 08 '24 13:05

As others have mentioned, I am also experiencing this issue when using CompleteMessageAsync and DeadLetterMessageAsync. We are using Azure Functions on a Consumption plan with .NET 8. Would appreciate any help.

wtrombly - May 20 '24 20:05

We are also experiencing this issue after upgrading to .NET 8 and the isolated worker model.

AlexEngblom - May 23 '24 10:05

We're experiencing a very similar issue too. Our Azure Function uses the .NET 8 isolated worker model and runs on a Consumption plan. We do not batch messages. The function works fine when the load is low (say, 20 messages per minute), but when the function receives more messages than that, it sometimes fails due to the Grpc.Core.RpcException being thrown by the CompleteMessageAsync or DeferMessageAsync methods. Our function processes over 3000 messages daily, and we encounter this issue once or twice almost every day. Although it's not consistent, there are times when everything works properly, which might be related to slightly lower load on some days.

Stack trace:

Result: Failure
Exception: Grpc.Core.RpcException: Status(StatusCode="Unimplemented", Detail="Service is unimplemented.")
   at Microsoft.Azure.Functions.Worker.ServiceBusMessageActions.CompleteMessageAsync(ServiceBusReceivedMessage message, CancellationToken cancellationToken) in D:\a\_work\1\s\extensions\Worker.Extensions.ServiceBus\src\ServiceBusMessageActions.cs:line 78
   at *.*.*.Run(ServiceBusReceivedMessage message, ServiceBusMessageActions messageActions, CancellationToken cancellationToken) in Z:\*\*\*\FUNCTION_FILE.cs:line 87
   at *.*.DirectFunctionExecutor.ExecuteAsync(FunctionContext context) in Z:\*\*\*\Microsoft.Azure.Functions.Worker.Sdk.Generators\Microsoft.Azure.Functions.Worker.Sdk.Generators.FunctionExecutorGenerator\GeneratedFunctionExecutor.g.cs:line 38
   at Microsoft.Azure.Functions.Worker.OutputBindings.OutputBindingsMiddleware.Invoke(FunctionContext context, FunctionExecutionDelegate next) in D:\a\_work\1\s\src\DotNetWorker.Core\OutputBindings\OutputBindingsMiddleware.cs:line 13
   at Microsoft.Azure.Functions.Worker.Extensions.Http.AspNetCore.FunctionsHttpProxyingMiddleware.Invoke(FunctionContext context, FunctionExecutionDelegate next) in D:\a\_work\1\s\extensions\Worker.Extensions.Http.AspNetCore\src\FunctionsMiddleware\FunctionsHttpProxyingMiddleware.cs:line 34
   at Microsoft.Azure.Functions.Worker.FunctionsApplication.InvokeFunctionAsync(FunctionContext context) in D:\a\_work\1\s\src\DotNetWorker.Core\FunctionsApplication.cs:line 77
   at Microsoft.Azure.Functions.Worker.Handlers.InvocationHandler.InvokeAsync(InvocationRequest request) in D:\a\_work\1\s\src\DotNetWorker.Grpc\Handlers\InvocationHandler.cs:line 88
Stack:    at Microsoft.Azure.Functions.Worker.ServiceBusMessageActions.CompleteMessageAsync(ServiceBusReceivedMessage message, CancellationToken cancellationToken) in D:\a\_work\1\s\extensions\Worker.Extensions.ServiceBus\src\ServiceBusMessageActions.cs:line 78
   at *.*.*.Run(ServiceBusReceivedMessage message, ServiceBusMessageActions messageActions, CancellationToken cancellationToken) in Z:\*\*\*\FUNCTION_FILE.cs:line 87
   at *.*.DirectFunctionExecutor.ExecuteAsync(FunctionContext context) in Z:\*\*\*\Microsoft.Azure.Functions.Worker.Sdk.Generators\Microsoft.Azure.Functions.Worker.Sdk.Generators.FunctionExecutorGenerator\GeneratedFunctionExecutor.g.cs:line 38
   at Microsoft.Azure.Functions.Worker.OutputBindings.OutputBindingsMiddleware.Invoke(FunctionContext context, FunctionExecutionDelegate next) in D:\a\_work\1\s\src\DotNetWorker.Core\OutputBindings\OutputBindingsMiddleware.cs:line 13
   at Microsoft.Azure.Functions.Worker.Extensions.Http.AspNetCore.FunctionsHttpProxyingMiddleware.Invoke(FunctionContext context, FunctionExecutionDelegate next) in D:\a\_work\1\s\extensions\Worker.Extensions.Http.AspNetCore\src\FunctionsMiddleware\FunctionsHttpProxyingMiddleware.cs:line 34
   at Microsoft.Azure.Functions.Worker.FunctionsApplication.InvokeFunctionAsync(FunctionContext context) in D:\a\_work\1\s\src\DotNetWorker.Core\FunctionsApplication.cs:line 77
   at Microsoft.Azure.Functions.Worker.Handlers.InvocationHandler.InvokeAsync(InvocationRequest request) in D:\a\_work\1\s\src\DotNetWorker.Grpc\Handlers\InvocationHandler.cs:line 88

I would appreciate any help. If there's something more I could do to help, please let me know.

MichalDulski - Jun 18 '24 15:06

We are also experiencing this issue after upgrading to .NET 8 and using isolated Functions. We are also using batched mode and message actions to dead-letter, abandon, or complete a message.

Exception message: One or more errors occurred. (Status(StatusCode="Unimplemented", Detail="Service is unimplemented."))

Stack trace from app insights:

   at System.Threading.Tasks.Task.ThrowIfExceptional(Boolean includeTaskCanceledExceptions)
   at System.Threading.Tasks.Task`1.GetResultCore(Boolean waitCompletionNotification)
   at Microsoft.Azure.Functions.Worker.Invocation.DefaultFunctionInvoker`2.<>c.<InvokeAsync>b__6_0(Task`1 t) in D:\a\_work\1\s\src\DotNetWorker.Core\Invocation\DefaultFunctionInvoker.cs:line 32
   at System.Threading.Tasks.ContinuationResultTaskFromResultTask`2.InnerInvoke()
   at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state)
--- End of stack trace from previous location ---
   at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state)
   at System.Threading.Tasks.Task.ExecuteWithThreadLocal(Task& currentTaskSlot, Thread threadPoolThread)
--- End of stack trace from previous location ---
   at Microsoft.Azure.Functions.Worker.Invocation.DefaultFunctionExecutor.ExecuteAsync(FunctionContext context) in D:\a\_work\1\s\src\DotNetWorker.Core\Invocation\DefaultFunctionExecutor.cs:line 49
   at ****.Function.DirectFunctionExecutor.ExecuteAsync(FunctionContext context) in D:\a\1\s\src\****\****.Function\src\****.Function\obj\Release\net8.0\Microsoft.Azure.Functions.Worker.Sdk.Generators\Microsoft.Azure.Functions.Worker.Sdk.Generators.FunctionExecutorGenerator\GeneratedFunctionExecutor.g.cs:line 49
   at Microsoft.Azure.Functions.Worker.OutputBindings.OutputBindingsMiddleware.Invoke(FunctionContext context, FunctionExecutionDelegate next) in D:\a\_work\1\s\src\DotNetWorker.Core\OutputBindings\OutputBindingsMiddleware.cs:line 13
   at Microsoft.Azure.Functions.Worker.FunctionsApplication.InvokeFunctionAsync(FunctionContext context) in D:\a\_work\1\s\src\DotNetWorker.Core\FunctionsApplication.cs:line 89
   at Microsoft.Azure.Functions.Worker.Handlers.InvocationHandler.InvokeAsync(InvocationRequest request) in D:\a\_work\1\s\src\DotNetWorker.Grpc\Handlers\InvocationHandler.cs:line 88

Packages:

<FrameworkReference Include="Microsoft.AspNetCore.App" />
<PackageReference Include="Azure.Messaging.ServiceBus" Version="7.17.5" />
<PackageReference Include="Microsoft.Azure.Functions.Worker.ApplicationInsights" Version="1.2.0" />
<PackageReference Include="Microsoft.ApplicationInsights.WorkerService" Version="2.22.0" />
<PackageReference Include="Microsoft.Azure.Functions.Worker" Version="1.22.0" />
<PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.ServiceBus" Version="5.19.0" />
<PackageReference Include="Microsoft.Azure.Functions.Worker.Sdk" Version="1.17.2" />

Errors regularly occurring in App Insights (screenshot omitted).

RubenDelange - Jun 25 '24 11:06

Apologies for the delayed response here.

Thank you for the information shared so far; this gives us a good idea of what the potential issue is. We will be investigating and posting updates here as we make progress.

This is looking like a host problem, so we'll be transferring this issue to that repo.

fabiocav - Jun 26 '24 20:06

I am also experiencing 2 similar issues related to gRPC:

Error 1:

Result: Cancelled
Exception: Grpc.Core.RpcException: Status(StatusCode="Cancelled", Detail="Call canceled by the client.", DebugException="System.OperationCanceledException: The operation was canceled.")
 ---> System.OperationCanceledException: The operation was canceled.
   --- End of inner exception stack trace ---
   at Microsoft.Azure.Functions.Worker.ServiceBusMessageActions.DeadLetterMessageAsync(ServiceBusReceivedMessage message, Dictionary`2 propertiesToModify, String deadLetterReason, String deadLetterErrorDescription, CancellationToken cancellationToken)

Error 2:

Result: Failure
Exception: Grpc.Core.RpcException: Status(StatusCode="Unimplemented", Detail="Service is unimplemented.")
   at Microsoft.Azure.Functions.Worker.ServiceBusMessageActions.DeadLetterMessageAsync(ServiceBusReceivedMessage message, Dictionary`2 propertiesToModify, String deadLetterReason, String deadLetterErrorDescription, CancellationToken cancellationToken)

Kalyan-Ad-Shell - Jul 01 '24 06:07

Hello. Is the issue fixed? Can someone confirm, please? I am facing this issue in production.

Kalyan-Ad-Shell - Jul 04 '24 12:07

After 8 years of Service Bus hell, I just stopped.

I stopped using Service Bus queues/topics and just developed my own queue using SQL Server with an Event Grid topic as the trigger. Now I don't face the endless polling for queue messages; I can see exactly what is in the queue table, retry on demand, and isolate messages, and sessions work perfectly.

I know exactly what happens and when.

I will try to avoid this weird technology in the future, it gives me a headache. 8 years was probably enough 😁😁😁

janus007 - Jul 04 '24 14:07

Is this issue fixed? I am still getting these errors. Do I need to upgrade any versions?

Kalyan-Ad-Shell - Jul 11 '24 11:07

The fix has been merged; we will be including it in the next release.

jviau - Jul 11 '24 15:07

When is this planned to happen?

rafallopatka - Jul 15 '24 11:07

@belfaster, who joined July 2024?

janus007 - Jul 15 '24 14:07

Is the fix released? If yes, please mention which package to update and to which version.

Kalyan-Ad-Shell - Jul 25 '24 17:07

@jviau, can you please help confirm which release version fixed this issue?

I'm still facing the same problem with the libraries below.

<FrameworkReference Include="Microsoft.AspNetCore.App" />
<PackageReference Include="Microsoft.Azure.Functions.Worker" Version="1.23.0" />
<PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.Http" Version="3.2.0" />
<PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.Http.AspNetCore" Version="1.3.2" />
<PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.ServiceBus" Version="5.21.0" />
<PackageReference Include="Microsoft.Azure.Functions.Worker.Sdk" Version="1.17.4" />

vaaan - Aug 13 '24 08:08

@fabiocav and @jviau - is release 4.1035.1 containing #10255 rolled out in production yet? If not, do you have a schedule for this action?

We still see this "ProblemId" Microsoft.Azure.WebJobs.Script.Workers.Rpc.RpcException popping up in our App Insights logs daily; see my previous comment for an example call stack.

We use multiple .NET 8 functions in a single isolated Function App in combination with Service Bus queues and topics. In App Insights, we can find the Service Bus queue message IDs that get impacted, but the logs of our functions during execution are not persisted for the impacted messages.

We have reason to believe that the actual logic (for example, saving a record in a database) within our functions is still being executed in the background when the Function App is hit by this error. As a result of this exception, the message is retried on the queue (max number of deliveries = 2) with DeliveryCount = 2, resulting in duplicate executions of our logic.

All of this is quite annoying...
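
Until the host fix is rolled out, one way to blunt the duplicate executions described above is to make the handler idempotent. A minimal sketch, assuming a hypothetical dedupe guard keyed on MessageId; a real implementation would use a durable store shared across instances rather than process memory:

using System.Collections.Concurrent;
using Azure.Messaging.ServiceBus;

// Hypothetical in-process dedupe guard; swap for a database or cache lookup in production.
public static class ProcessedMessages
{
    private static readonly ConcurrentDictionary<string, byte> _seen = new();

    // Returns true the first time a MessageId is observed, false on a redelivery.
    public static bool TryMarkProcessed(ServiceBusReceivedMessage message)
        => _seen.TryAdd(message.MessageId, 0);
}

In the function body, the side-effecting work would be skipped (and the message still completed) whenever TryMarkProcessed returns false.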

RubenDelange - Aug 28 '24 10:08

Facing the same issue: the logic of the Azure Function is getting executed twice. The first time, I see the exception below in Application Insights, and then the message is retried with DeliveryCount = 2. But behind the scenes, it is getting executed successfully even on the first try.

Exception: Grpc.Core.RpcException: Status(StatusCode="Unimplemented", Detail="Service is unimplemented.")

krishna-pandey-git - Aug 29 '24 08:08

As a temporary solution until this is fully released, we pinned the Azure Functions host to 4.1035.2 using the FUNCTIONS_EXTENSION_VERSION environment variable.

This didn't fully fix the exceptions; we are still getting them, but much less often than before.

Note that the pinning method is different for Windows and Linux, so be sure to read the correct documentation:

https://learn.microsoft.com/en-us/azure/azure-functions/set-runtime-version?tabs=azure-portal&pivots=platform-windows#manual-version-updates-on-linux

tonibgd - Sep 02 '24 10:09

> As a temporary solution until this is fully released, we pinned the Azure Functions host to 4.1035.2 using the FUNCTIONS_EXTENSION_VERSION environment variable.
>
> This didn't fully fix the exceptions; we are still getting them, but much less often than before.
>
> Note that the pinning method is different for Windows and Linux, so be sure to read the correct documentation:
>
> https://learn.microsoft.com/en-us/azure/azure-functions/set-runtime-version?tabs=azure-portal&pivots=platform-windows#manual-version-updates-on-linux

The documentation says: "If you specify only the major version (~4), the function app is automatically updated to new minor versions of the runtime as they become available." So I would assume that the latest version update (incl. this fix) would be picked up using ~4 once it is actually rolled out. It's not like a newer version introduced this annoying behavior and we want to roll back to a previous version.

So long story short, I don't understand why pinning v4.1035.2 or any more recent version could help in this case. Unless I'm misreading the documentation of course.

RubenDelange - Sep 03 '24 11:09

> The fix has been merged; we will be including it in the next release.

I'm using ~4 and still getting this problem occasionally, so it doesn't appear that the fix on the host side worked. Do we know if the RpcException is transient, so that we can at least mitigate the issue by wrapping the call with a retry policy?
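
If the exception does turn out to be transient, a simple retry wrapper around the settlement call could be one mitigation. This is only a sketch under that assumption (Grpc.Core.RpcException comes in transitively through the worker's gRPC packages; the attempt count and delay are arbitrary), not an officially recommended pattern:

using System;
using System.Threading;
using System.Threading.Tasks;
using Grpc.Core;

public static class SettlementRetry
{
    // Retries a settlement delegate (Complete/Abandon/DeadLetter) a few times
    // when the worker-to-host gRPC call fails with an RpcException.
    public static async Task WithRetryAsync(Func<Task> settle, int maxAttempts = 3, CancellationToken ct = default)
    {
        for (var attempt = 1; ; attempt++)
        {
            try
            {
                await settle();
                return;
            }
            catch (RpcException) when (attempt < maxAttempts)
            {
                // Back off briefly before retrying the same settlement call.
                await Task.Delay(TimeSpan.FromMilliseconds(250 * attempt), ct);
            }
        }
    }
}

Usage would be something like: await SettlementRetry.WithRetryAsync(() => messageActions.CompleteMessageAsync(message, ct), ct: ct); Note that if the message lock has already been lost on the host side, retrying the settlement may still fail.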

joeyeng - Sep 19 '24 22:09

Still no fix/workaround? I am occasionally facing this issue when calling DeadLetterMessageAsync; internally, the error is raised from DeadletterAsync in the SettlementClient.

MHalchenko - Sep 24 '24 13:09

Is it fixed? It's really breaking our app in production. Do you know in which version this problem does not occur? Thanks.

petrkasnal - Sep 25 '24 07:09

Please advise if this is fixed. We are still getting this error.

prasangastha7 - Oct 01 '24 23:10