Operation Request Count is Inexplicably High

Open dracos1993 opened this issue 3 years ago • 26 comments

  • List of NuGet packages and versions that you are using: Microsoft.Extensions.Logging.ApplicationInsights 2.14.0, Microsoft.ApplicationInsights.AspNetCore 2.14.0

  • Runtime version (e.g. net461, net48, netcoreapp2.1, netcoreapp3.1, etc. You can find this information from the *.csproj file): netcoreapp3.1

  • Hosting environment (e.g. Azure Web App, App Service on Linux, Windows, Ubuntu, etc.): App Service on Windows

What are you trying to achieve? We are currently monitoring a web API using the data on the Performance page of Application Insights to track the number of requests received per operation.

The architecture of our API solution uses APIM as the frontend and an App Service as the backend. Both instances have App Insights enabled, and we don't see a reasonable correlation between the number of requests to APIM and the requests to the App Service. The discrepancy is most noticeable in only a couple of operations.

For example, the Apim-GetUsers operation has a count of 60,000 requests per day (APIM's AI instance).

The AS-GetUsers operation has a count of 3,000,000 requests per day (the App Service's AI instance).

Apim-GetUsers routes the request to AS-GetUsers, and Apim-GetUsers is the only operation that can call AS-GetUsers.

Given this, I would expect to see ~60,000 requests for that operation on the App Service's AI Performance page; instead we see 3,000,000.

I looked into this issue a little and learned about sampling, and that some App Insights features sum the itemCount property to reconstruct the true number of requests. In summary:

  • Is my expectation correct, and if so, what could cause this? Also, would disabling adaptive sampling and using a fixed sampling rate give me the expected result? (See the sketch at the end of this post.)
  • Is my expectation wrong, and if so, what is a good way to get the expected result? Should I not use the Performance page for that metric?

What have you tried so far? Not much yet, as I don't have access to change the settings until I can propose a viable solution, but I looked into sampling and the itemCount property, as mentioned above.

I ran a query in Log Analytics on the requests table. When I used a plain row count, I got a number close to the one I see in APIM; but when I used a sum of itemCount, as suggested by some MS docs, I got the huge number seen on the Performance page.
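
For reference, the comparison I ran amounts to the following; here it is as a sketch using the Azure.Monitor.Query SDK, with the workspace ID as a placeholder and the operation name taken from the example above:

    using System;
    using Azure.Identity;
    using Azure.Monitor.Query;

    // Compare raw stored rows against the sampling-adjusted count for one
    // operation over the last day. "<workspace-id>" is a placeholder.
    var client = new LogsQueryClient(new DefaultAzureCredential());
    var result = await client.QueryWorkspaceAsync(
        "<workspace-id>",
        @"requests
          | where name == 'AS-GetUsers'
          | summarize StoredRows = count(), AdjustedCount = sum(itemCount)",
        new QueryTimeRange(TimeSpan.FromDays(1)));

StoredRows is the number I see matching APIM; AdjustedCount is the huge one from the Performance page.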

Sampling is set to 100% on APIM.
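
To make the first bullet above concrete, here is the sketch I mentioned: disabling adaptive sampling and switching to fixed-rate sampling inside Startup.ConfigureServices of an ASP.NET Core app. I haven't been able to try this yet, so treat it as a rough outline rather than a verified fix:

    using Microsoft.ApplicationInsights.Extensibility;
    using Microsoft.ApplicationInsights.WindowsServer.TelemetryChannel;
    using Microsoft.Extensions.DependencyInjection;

    public void ConfigureServices(IServiceCollection services)
    {
        // Turn off the SDK's default adaptive sampling...
        services.AddApplicationInsightsTelemetry(options =>
            options.EnableAdaptiveSampling = false);

        // ...and put a fixed-rate processor in the chain instead.
        // UseSampling(100) keeps every item, so itemCount should stay 1;
        // a lower value trades cost for a known, fixed sampling ratio.
        services.Configure<TelemetryConfiguration>(config =>
        {
            var processorBuilder =
                config.DefaultTelemetrySink.TelemetryProcessorChainBuilder;
            processorBuilder.UseSampling(100);
            processorBuilder.Build();
        });
    }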

dracos1993 avatar Aug 02 '21 15:08 dracos1993

Before I knew of this issue, I opened Azure support ticket 2110270010004247 with the title "Implausibly high request count in Application Insights" on 27 October:

The Performance | Operations blade showed a spike of almost 1 million requests for a particular operation, POST [redacted].

Load balancer logging shows a maximum of 1,706 requests to that route during the 2:52pm minute (local time), with maximum traffic of 177 req/sec.

We use the default Adaptive Sampling from the Microsoft.ApplicationInsights .NET SDK v2.18. Per https://docs.microsoft.com/en-us/azure/azure-monitor/app/sampling, 'Metric counts such as request rate and exception rate are adjusted to compensate for the sampling rate, so that they show approximately correct values in Metric Explorer.'

However, this is not 'approximately correct'; our best information indicates that it's off by a factor of roughly 1,000.

For now, our team has disabled the AdaptiveSamplingTelemetryProcessor. Since it's probably too costly to maintain a non-sampling configuration, we would like to perform sampling; however, we currently have low confidence in the accuracy of using adaptive sampling (which is on by default for the .NET SDK).
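
Concretely, 'disabled' means we removed the AdaptiveSamplingTelemetryProcessor node from ApplicationInsights.config. Doing the equivalent in code would look roughly like the following sketch for the classic .NET SDK (fixed-rate sampling at 100%, i.e. effectively no sampling):

    using Microsoft.ApplicationInsights.Extensibility;
    using Microsoft.ApplicationInsights.WindowsServer.TelemetryChannel;

    // Rebuild the default processor chain with a fixed 100% rate in place
    // of adaptive sampling; every item is then sent and itemCount stays 1.
    var builder = TelemetryConfiguration.Active
        .DefaultTelemetrySink.TelemetryProcessorChainBuilder;
    builder.UseSampling(100);
    builder.Build();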

bgrainger avatar Nov 20 '21 16:11 bgrainger

This issue is stale because it has been open 300 days with no activity. Remove stale label or this will be closed in 7 days. Commenting will instruct the bot to automatically remove the label.

github-actions[bot] avatar Sep 17 '22 00:09 github-actions[bot]

Any feedback from the Azure team on the accuracy of AdaptiveSamplingTelemetryProcessor?

bgrainger avatar Sep 17 '22 05:09 bgrainger

Any updates on this?

We're experiencing the same issue. Our request counts became ~10 times higher than before after upgrading one of the services from .NET Core 3 to .NET 6. These dashboard numbers are driven by high itemCount values. Previously, this service's telemetry wasn't sampled at all (i.e. itemCount == 1). Default sampling settings are used (i.e. adaptive sampling is on, Application Insights ingestion sampling is disabled). count() is pretty much the same as it was before, which makes it look as if no sampling actually happens but the SDK mistakenly reports high itemCount numbers for some reason.

delight-by avatar Oct 07 '22 05:10 delight-by

+1 Seeing the same issue here as well.

mattramsay-ice avatar Nov 02 '22 15:11 mattramsay-ice

We're experiencing the same issue. The reported operation request count is >5x higher than the actual request count: Application Insights Logs shows 3,392 items and sum(itemCount) = 37,202, while the actual number of requests is 6,916.

I cannot tell when this started, or even whether it has ever been correct. We are still on .NET Core 3.1, using Microsoft.ApplicationInsights.AspNetCore 2.17.0. Application Insights reports sdkVersion: aspnet5c:2.17.0+c9d95e701e2474b7eb3b46ae7953b6c7570356ab

Sjaaky avatar Nov 14 '22 19:11 Sjaaky

I believe that my team is seeing this issue also. I used JMeter to send 200 requests and am seeing 504 requests in AI using sum(itemCount).

ethanturk avatar Nov 28 '22 15:11 ethanturk

Same issue here!

PieterWillemen avatar Nov 29 '22 08:11 PieterWillemen

We're having the same issue. Using .NET Framework 4.7.2 and App Insights v2.18, with default settings.

As an example for one operation, AppInsights Failures view shows for the last 4 hours:

  • Count (total): 110.44k
  • Count (failed): 2.08k

Looking at the same operation from Log workspace shows:

  • Total operations: 212
  • Failed operations: 4

That's off by ~500x.

laurilarjo avatar Jan 15 '23 18:01 laurilarjo

Hi guys,

Also seeing the same situation for a web app that is called from APIM: APIM shows 1,000 dependency requests, while on the web app I see ~2.8k requests by count and only ~260 sampled items.

Anyone knows how to fix this?

djpirra avatar Jan 19 '23 12:01 djpirra

@laurilarjo Are you using APIM as well? I'm trying to see whether everyone facing this has APIM in the mix. Also, what exactly is the "Log workspace"? Is that your alternate source of truth?

cijothomas avatar Jan 19 '23 19:01 cijothomas

I believe that my team is seeing this issue also. I used JMeter to send 200 requests and am seeing 504 requests in AI using sum(itemCount).

What about the count (not itemCount)?

cijothomas avatar Jan 19 '23 19:01 cijothomas

My team is seeing this as well.

dj185057 avatar Jan 20 '23 14:01 dj185057

My team is seeing this as well.

Can you elaborate on what exactly the behavior is that you are observing? Does it involve APIM? I am looking to gather as many details as possible to narrow this down.

cijothomas avatar Jan 20 '23 15:01 cijothomas

My team is also encountering this issue in a .NET Framework MVC app. We are also using adaptive sampling, and are not using APIM. We have a couple of operations where Application Insights recorded ~75k samples (which is very close to what our local 100%-sampled logging outside of AI shows), but 11.1 million sum(itemCount).

@cijothomas If you have a good Kusto query to analyze relevant data (sample count/rate, itemCount, etc), I'd be happy to provide an anonymized data set if that would be helpful.

srichards-tb avatar Feb 02 '23 01:02 srichards-tb

Since this issue is being noticed by more teams, I've been meaning to share a proposed resolution but kept forgetting. Back when we noticed this issue in our project, we also raised an Azure support ticket; I'm sharing a summary of the discussion.


Symptom: You noticed a discrepancy in your Azure Web App instance where the operation call count shows in the millions, while in the APIM service the operation calls only show in the range of about 77k.

Cause: The sampling rate is set to 2.96%. There is a known issue where, when the sampling percentage is below 10%, the reported metric can sometimes be inflated.

Resolution: Raise the sampling percentage to 15-20%.

We actually made this change, but other things took precedence and we haven't gone back to verify that it resolved the issue. I'll try to confirm, but I'm adding this here FWIW.
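
For anyone wanting to apply that resolution in code rather than in portal settings, flooring adaptive sampling at 15% would look roughly like this sketch for the classic .NET SDK (the settings object and the 15% value mirror the ticket's suggestion; adjust to taste):

    using Microsoft.ApplicationInsights.Extensibility;
    using Microsoft.ApplicationInsights.WindowsServer.Channel.Implementation;
    using Microsoft.ApplicationInsights.WindowsServer.TelemetryChannel;

    // Keep adaptive sampling, but don't let the effective rate drop
    // below 15%, per the support ticket's suggested workaround.
    var builder = TelemetryConfiguration.Active
        .DefaultTelemetrySink.TelemetryProcessorChainBuilder;
    builder.UseAdaptiveSampling(
        new SamplingPercentageEstimatorSettings { MinSamplingPercentage = 15 },
        null);
    builder.Build();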

dracos1993 avatar Feb 02 '23 02:02 dracos1993

@cijothomas Yes, we're using Azure APIM in front of the service. By Log Workspace, I meant running manual log search queries in the Log Analytics workspace connected to the same App Insights resource.

laurilarjo avatar Feb 17 '23 09:02 laurilarjo

My team is also encountering this issue in a .NET Framework MVC app. We are also using adaptive sampling, and are not using APIM. We have a couple of operations where Application Insights recorded ~75k samples (which is very close to what our local 100%-sampled logging outside of AI shows), but 11.1 million sum(itemCount).

@cijothomas If you have a good Kusto query to analyze relevant data (sample count/rate, itemCount, etc), I'd be happy to provide an anonymized data set if that would be helpful.

This could very well be normal/expected if the sampling ratio is quite low. The best way to see accurate counts is to look at the standard metrics in the Metrics view of the portal. There is a standard metric for requests/count, which should be the actual request count, unaffected by sampling.

In all the other cases APIM seems to be involved, which could be the issue. I suspect it's related to (maybe the same as) https://github.com/microsoft/ApplicationInsights-dotnet/issues/2742, where modifying Activity.TraceFlags affects adaptive sampling. Need to check whether APIM is doing that intentionally or accidentally.
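
For example, the pre-aggregated standard metric can be pulled with the Azure.Monitor.Query SDK; this is a sketch, and the resource ID is a placeholder:

    using Azure.Identity;
    using Azure.Monitor.Query;

    // "requests/count" is the pre-aggregated standard request metric,
    // which is not affected by sampling, so it can serve as the
    // cross-check against sum(itemCount) from the requests table.
    var client = new MetricsQueryClient(new DefaultAzureCredential());
    var response = await client.QueryResourceAsync(
        "/subscriptions/<sub>/resourceGroups/<rg>/providers/microsoft.insights/components/<app>",
        new[] { "requests/count" });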

cijothomas avatar Feb 17 '23 20:02 cijothomas

I'm not aware of anything changing in an ASP.NET Web API app that was on .NET Framework 4.8 and SDK 2.20.0, but we have started seeing a major bump (8-10x) in sum(itemCount) on requests after upgrading the primary consuming application from 4.8 to .NET 6.0...

However, the request count chart on the Application Insights Overview page does seem to correlate with the consuming app's dependency sum(itemCount) for the target.

krompaco avatar Mar 30 '23 14:03 krompaco

Reproducible for Azure Functions .NET 7 isolated worker as well; itemCount is constantly growing.

    <PackageReference Include="Microsoft.Azure.Functions.Worker" Version="1.13.0" />
    <PackageReference Include="Microsoft.Azure.Functions.Worker.ApplicationInsights" Version="1.0.0-preview4" />
    <PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.Timer" Version="4.2.0" />
    <PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.Http" Version="3.0.13" />
    <PackageReference Include="Microsoft.Azure.Functions.Worker.Sdk" Version="1.9.0" />

timaiv avatar Apr 04 '23 12:04 timaiv

https://www.nuget.org/packages/Microsoft.ApplicationInsights/2.22.0-beta3 has a fix for #2742. @dracos1993 Could you try this out to see if it resolves your issue as well?

vishweshbankwar avatar Apr 24 '23 17:04 vishweshbankwar

I also recently noticed a very high, inaccurate itemCount (5× greater than expected) after setting 'Data sampling' to 25%.

seanksullivan avatar May 10 '23 21:05 seanksullivan

My data sampling is 100% and my itemCount looks massively wrong.

Will-Bill avatar Jun 10 '23 19:06 Will-Bill

Just wanted to mention that I too am seeing extremely high request counts. We're on ASP.NET MVC with .NET 4.8. We are not sampling our data in the Application Insights Azure portal settings, and our code uses the defaults (i.e. we're not modifying anything in web.config).

Hallmanac avatar Sep 07 '23 21:09 Hallmanac

We had this issue for a long time on both a .NET 5 and a .NET 7 App Service. Upgrading the Microsoft.ApplicationInsights and Microsoft.ApplicationInsights.AspNetCore packages to version 2.22.0-beta3 fixed the issue on our side!

Janaza avatar Sep 22 '23 14:09 Janaza