ApplicationInsights-dotnet
Major Performance Issues
- List of NuGet packages and version that you are using: Microsoft.ApplicationInsights.AspNetCore 2.18.0
- Runtime version (e.g. net461, net48, netcoreapp2.1, netcoreapp3.1, etc. You can find this information from the *.csproj file): netcoreapp3.1
- Hosting environment (e.g. Azure Web App, App Service on Linux, Windows, Ubuntu, etc.): IIS 8.5 on Windows Server 2012 R2
What are you trying to achieve? We're having massive performance slowdowns from Application Insights in production for one specific web application. Certain requests go from approximately 2 seconds without AppInsights to 10-12 seconds with it turned on. This Web API gets about 2 requests / second and we suspect it is flushing the data to Azure too frequently.
Frustratingly, we are only encountering this in prod (our dev/uat environments don't experience the same kind of load).
What have you tried so far? Our AppInsights is set up using services.AddApplicationInsightsTelemetry(), so I'm assuming it's using the ServerTelemetryChannel based on the documentation. I've explored the many options on ServerTelemetryChannel, but there are so many of them, and they all seem so similar, that I'm not sure which ones to try tweaking to improve this scenario.
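For context, the setup is just the default registration, roughly like the sketch below (a simplified illustration assuming a standard Startup.ConfigureServices, not our exact code):

```csharp
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        // Registers Application Insights with all defaults; per the docs, the
        // ASP.NET Core SDK uses ServerTelemetryChannel as the telemetry channel.
        services.AddApplicationInsightsTelemetry();

        services.AddControllers();
    }
}
```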
I'm hoping for some guidance as to which settings to try tweaking to fix the performance issue App Insights is causing on our prod server.
Note: We are not manually calling Flush() anywhere.
Thanks.
we suspect it is flushing the data to Azure too frequently
Is there any reason why you believe so? The actual transmission of telemetry to Azure is not done in the user request thread, so it won't impact response time.
Any other ideas what it could be? We can literally toggle AppInsights on/off via a config file and see the massive perf difference.
We've been facing this slowdown issue at random times during the day, and it started a few weeks ago. When it starts to slow down, we simply restart the app. We didn't find a solution, so we've turned off AppInsights by deleting the instrumentationKey in config, and now the slowdowns are gone. Our application runs on Azure App Service. We've tried updating the NuGet package, scaling up our app, and updating net5 to net6, but none of them solved this issue. Also, we don't send any custom telemetry, only the standard setup of AppInsights. Our App Services are located in North Europe on Azure.
Can you share a minimal repro app for us to take a look?
I would share an app for this, but our apps can be reached only via VPN, so I'll open an Azure ticket and reference this GitHub issue.
I cannot provide a repro. It only happens in our Production environment. We don't know the cause. If I could repro I probably wouldn't need help. :)
I'm really looking for some guidance/best practices as to what settings in https://github.com/microsoft/ApplicationInsights-dotnet/blob/develop/BASE/src/ServerTelemetryChannel/ServerTelemetryChannel.cs are worth messing around with. I can't imagine you guys exposed all those things and never intended anyone to use any of them.
Check the document on the most commonly used channel settings: https://docs.microsoft.com/en-us/azure/azure-monitor/app/telemetry-channels#configurable-settings-in-channels
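For illustration, a hedged sketch of how those channel settings can be set in ASP.NET Core, following the pattern in that doc of registering the channel before AddApplicationInsightsTelemetry(); the specific values and storage path below are placeholders, not recommendations:

```csharp
using System;
using Microsoft.ApplicationInsights.Channel;
using Microsoft.ApplicationInsights.WindowsServer.TelemetryChannel;
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        // Register a customized channel before AddApplicationInsightsTelemetry()
        // so the SDK picks it up instead of creating its own.
        services.AddSingleton(typeof(ITelemetryChannel), new ServerTelemetryChannel
        {
            // How many items to buffer in memory before creating a transmission.
            MaxTelemetryBufferCapacity = 1000,
            // How long to wait before sending a partially filled buffer.
            MaxTelemetryBufferDelay = TimeSpan.FromSeconds(30),
            // Where to persist telemetry that could not be sent immediately.
            StorageFolder = @"D:\ApplicationInsightsStorage"
        });

        services.AddApplicationInsightsTelemetry();
    }
}
```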
From the description you gave (latency increase from 2 seconds to 12 seconds), and given that Flush() is not called, I don't think it is a channel issue.
You could selectively disable telemetry modules/features and see if that impacts the perf (https://docs.microsoft.com/en-us/azure/azure-monitor/app/asp-net-core#using-applicationinsightsserviceoptions) to narrow down the issue.
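A sketch of that approach, assuming the options-object overload shown in the linked doc; which modules to disable first is only a starting guess for narrowing down the cause:

```csharp
using Microsoft.ApplicationInsights.AspNetCore.Extensions;
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        var aiOptions = new ApplicationInsightsServiceOptions
        {
            // Toggle auto-collection features off one at a time and compare
            // response times to see which module contributes to the slowdown.
            EnableDependencyTrackingTelemetryModule = false,
            EnablePerformanceCounterCollectionModule = false,
            EnableQuickPulseMetricStream = false,
            EnableAdaptiveSampling = false
        };

        services.AddApplicationInsightsTelemetry(aiOptions);
    }
}
```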
This issue is stale because it has been open 300 days with no activity. Remove stale label or this will be closed in 7 days. Commenting will instruct the bot to automatically remove the label.