[Bug] MeasureDurationResult incorrect when running in linux
Library version used
4.60.3
.NET version
C# .net 8.0
Scenario
ManagedIdentityClient - managed identity
Is this a new or an existing app?
The app is in production; I haven't upgraded MSAL but have started seeing this issue
Issue description and reproduction steps
When our unit tests are run on a Linux distro, AuthenticationResult->AuthenticationResultMetadata->DurationTotalInMs returns an incorrect value. This is because Stopwatch.Frequency is 10,000,000 on Windows but 1,000,000,000 on Linux, and the calculation does not take this difference into account.
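A quick way to confirm the platform difference described above (the printed frequency is platform-dependent, so no expected output is shown for it):

```csharp
using System;
using System.Diagnostics;

class FrequencyCheck
{
    static void Main()
    {
        // Stopwatch.Frequency is the number of Stopwatch ticks per second.
        // It is typically 10,000,000 on Windows and 1,000,000,000 on Linux,
        // while TimeSpan.TicksPerMillisecond is always 10,000 (100 ns ticks).
        Console.WriteLine($"Stopwatch.Frequency:          {Stopwatch.Frequency}");
        Console.WriteLine($"TimeSpan.TicksPerMillisecond: {TimeSpan.TicksPerMillisecond}");
    }
}
```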
Relevant code snippets
public MeasureDurationResult(long ticks)
{
    // Assumes 'ticks' are TimeSpan ticks (10,000 per millisecond), but the
    // value passed in comes from Stopwatch, whose tick length is
    // 1 / Stopwatch.Frequency seconds and varies by platform.
    Milliseconds = ticks / TimeSpan.TicksPerMillisecond;
    Microseconds = ticks / (TimeSpan.TicksPerMillisecond / 1000);
    Ticks = ticks;
}
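A minimal sketch of a corrected conversion, scaling by Stopwatch.Frequency as suggested in the comments below. The extra frequency parameter is my own addition (not part of MSAL) so the conversion can be exercised with either platform's frequency:

```csharp
using System;
using System.Diagnostics;

// Hypothetical corrected version; property and type names mirror the
// snippet above but this is not the actual MSAL implementation.
public readonly struct MeasureDurationResult
{
    public long Milliseconds { get; }
    public long Microseconds { get; }
    public long Ticks { get; }

    public MeasureDurationResult(long ticks) : this(ticks, Stopwatch.Frequency) { }

    public MeasureDurationResult(long ticks, long frequency)
    {
        // Stopwatch ticks are frequency-dependent: convert via the tick rate
        // instead of assuming TimeSpan's fixed 10,000 ticks per millisecond.
        Milliseconds = ticks * 1000 / frequency;
        Microseconds = ticks * 1_000_000 / frequency;
        Ticks = ticks;
    }
}

public static class Demo
{
    public static void Main()
    {
        // 8,423,086 ticks at Windows' typical 10 MHz frequency -> 842 ms.
        var windows = new MeasureDurationResult(8_423_086, 10_000_000);
        Console.WriteLine(windows.Milliseconds);   // 842

        // The same wall-clock duration on Linux (1 GHz) is 842,308,600
        // ticks, and now converts to the same 842 ms.
        var linux = new MeasureDurationResult(842_308_600, 1_000_000_000);
        Console.WriteLine(linux.Milliseconds);     // 842
    }
}
```

Multiplying before dividing keeps integer precision for sub-millisecond durations; for very long durations a double-based factor such as 1000.0 / Stopwatch.Frequency avoids overflow.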
Expected behavior
Regardless of OS, DurationTotalInMs should be the same for the same elapsed time; on Linux, however, it is incorrect:
Windows: ElapsedTicks: 8423086, DurationTotalInMs: 842
Linux: ElapsedTicks: 8423086, DurationTotalInMs: 84230
Identity provider
Other
Regression
No response
Solution and workarounds
No response
@SingleCopy just to confirm, is this for Managed Identity? Because you're calling out ADFS as the Identity Provider later.
Sounds like we need to update the conversion factor (1000.0 / Stopwatch.Frequency).
@localden Correct, this is for Managed Identity, I have updated my description.