
[main] Update dependencies from dotnet/efcore, dotnet/runtime

Open · dotnet-maestro[bot] opened this pull request · 5 comments

This pull request updates the following dependencies:

From https://github.com/dotnet/runtime

  • Subscription: 32db3699-5666-45da-a1b7-08d8b804cd75
  • Build: 20240224.1
  • Date Produced: February 24, 2024 12:42:55 PM UTC
  • Commit: f32c428c86b4cc41e88e2e5a750c37dfb354e33a
  • Branch: refs/heads/main

From https://github.com/dotnet/efcore

  • Subscription: 8ce5251e-8269-419d-3b2a-08d8b8050dda
  • Build: 20240220.2
  • Date Produced: February 20, 2024 11:57:17 PM UTC
  • Commit: 2113714b2be780918efdb657df92f9a4cf9c06e3
  • Branch: refs/heads/main

dotnet-maestro[bot] avatar Feb 15 '24 13:02 dotnet-maestro[bot]

The test failures are both related to DI service scope validation no longer working. I've filed https://github.com/dotnet/runtime/issues/98551 on the runtime, since this new failure coincides with a change to the service scope validation logic since we last ingested a runtime update.

The other possibility is that our tests were invalid all along, but even if so, it would be a visible runtime behavior change which needs to be understood and confirmed as correct.
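
For context, a minimal sketch of what scope validation normally catches, assuming a hypothetical `MyScopedService`; this is an illustration, not the failing tests' code. With `ValidateScopes` enabled, resolving a scoped service straight from the root provider is expected to throw.

```csharp
// A minimal sketch of what DI scope validation normally catches; this is my
// illustration (MyScopedService is hypothetical), not the tests' code.
using Microsoft.Extensions.DependencyInjection;

var services = new ServiceCollection();
services.AddScoped<MyScopedService>();

var provider = services.BuildServiceProvider(new ServiceProviderOptions
{
    ValidateScopes = true // the validation the failing tests exercise
});

// With ValidateScopes = true, resolving a scoped service from the root
// provider is expected to throw InvalidOperationException; the report in
// dotnet/runtime#98551 is that validation like this stopped firing.
var service = provider.GetRequiredService<MyScopedService>();

class MyScopedService { }
```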

SteveSandersonMS avatar Feb 16 '24 11:02 SteveSandersonMS

Microsoft.AspNetCore.Components.E2ETest.Tests.ThreadingAppTest failure.

Should it be fixed by https://github.com/dotnet/aspnetcore/pull/54062? I merged main into this branch.

On my machine I see:

C:\Dev\aspnetcore\src\Shared\E2ETesting\WaitAssert.cs(129): error VSTEST1: (Microsoft.AspNetCore.Components.E2ETest.Tests.ThreadingAppTest.IsStarted) Microsoft.AspNetCore.E2ETesting.WaitAssert.WaitAssertCore[TResult](IWebDriver driver, Func`1 assertion, TimeSpan timeout)
System.AggregateException : One or more errors occurred. (Xunit.Sdk.NotEmptyException: Assert.NotEmpty() Failure
   at Xunit.Assert.NotEmpty(IEnumerable collection) in /_/src/xunit.assert/Asserts/CollectionAsserts.cs:line 502
   at Microsoft.AspNetCore.E2ETesting.WaitAssert.<>c__DisplayClass15_0.<Exists>b__0() in C:\Dev\aspnetcore\src\Shared\E2ETesting\WaitAssert.cs:line 70
   at Microsoft.AspNetCore.E2ETesting.WaitAssert.<>c__DisplayClass18_0`1.<WaitAssertCore>b__0(IWebDriver _) in C:\Dev\aspnetcore\src\Shared\E2ETesting\WaitAssert.cs:line 101
Screen shot captured at 'C:\Dev\aspnetcore\src\Components\test\E2ETest\bin\screenshots\4cd4367237ca46a496bd795ad496936c.png'
Page content: <head> <meta charset="utf-8"> <!-- Forcing the device width here so that our automated tests work consistently on mobile browsers. --> <meta name="viewport" content="width=1024"> <title>Blazor standalone</title> <base href="/"> <link href="bootstrap.min.css" rel="stylesheet

Maybe it means that the WASM runtime is now loaded in the worker and no longer blocks the first render?

I'm out of office for a week. @SteveSandersonMS could you please have a look?

My current theory is that the test is looking at the DOM too early. Now that the runtime no longer blocks the UI thread, the page can finish its first render before the runtime starts on the deputy thread. Maybe that's it?
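
If that theory holds, the fix is the usual one for this class of race: poll the assertion instead of checking the DOM once. A hedged sketch (not the actual WaitAssert.cs implementation):

```csharp
// A hedged sketch, not the actual WaitAssert.cs code: retry the assertion
// until it passes or a timeout elapses, instead of checking the DOM once.
using System;
using System.Threading;

var started = DateTime.UtcNow;

// Trivial demo condition; in the real test this would be a DOM check such as
// "the app's root element has rendered children" (hypothetical selector):
// () => driver.FindElements(By.CssSelector("#app > *")).Count > 0
WaitUntil(() => DateTime.UtcNow - started > TimeSpan.FromMilliseconds(300),
          TimeSpan.FromSeconds(5));
Console.WriteLine("Condition met.");

static void WaitUntil(Func<bool> condition, TimeSpan timeout)
{
    var deadline = DateTime.UtcNow + timeout;
    while (!condition())
    {
        if (DateTime.UtcNow > deadline)
            throw new TimeoutException("Condition not met before timeout.");
        Thread.Sleep(100); // give the deputy-thread runtime time to start
    }
}
```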

The app itself works just fine for me. [screenshot]

pavelsavara avatar Feb 22 '24 13:02 pavelsavara

Download failed: server returned code 500. URL: https://edgedl.me.gvt1.com/edgedl/chrome/chrome-for-testing/121.0.6167.85/linux64/chrome-headless-shell-linux64.zip

pavelsavara avatar Feb 22 '24 15:02 pavelsavara

/azp run

BrennanConroy avatar Feb 23 '24 01:02 BrennanConroy

Azure Pipelines successfully started running 3 pipeline(s).

azure-pipelines[bot] avatar Feb 23 '24 01:02 azure-pipelines[bot]

/azp run

mgravell avatar Feb 29 '24 16:02 mgravell

Azure Pipelines successfully started running 3 pipeline(s).

azure-pipelines[bot] avatar Feb 29 '24 16:02 azure-pipelines[bot]

@mangod9 can you take a look at the crash?

https://helix.dot.net/api/2019-06-17/jobs/c512be84-95cc-4395-8ca0-fafd95dfe3d3/workitems/Diagnostics.EFCore.FunctionalTests--net9.0/console

looks like a dump was not uploaded

jeffschwMSFT avatar Mar 01 '24 18:03 jeffschwMSFT

To try to isolate what is causing the crash, I split this PR apart into 2 separate updates:

  • https://github.com/dotnet/aspnetcore/pull/54294 - just update EF Core
  • https://github.com/dotnet/aspnetcore/pull/54297 - just update runtime

eerhardt avatar Mar 01 '24 18:03 eerhardt

> @mangod9 can you take a look at the crash?
>
> https://helix.dot.net/api/2019-06-17/jobs/c512be84-95cc-4395-8ca0-fafd95dfe3d3/workitems/Diagnostics.EFCore.FunctionalTests--net9.0/console
>
> looks like a dump was not uploaded

Yeah, doesn't look like a dump was uploaded. Wonder if this is with the new exceptions change enabled? Perhaps let's wait for the split jobs from @eerhardt.

mangod9 avatar Mar 01 '24 18:03 mangod9

> Wonder if this is with the new exceptions change enabled?

No, that was disabled a couple of days ago, and we know it's disabled because a test failure we were seeing with it enabled is no longer failing: https://github.com/dotnet/runtime/pull/99066

BrennanConroy avatar Mar 01 '24 18:03 BrennanConroy

The dumps seem to indicate an issue with exception handling on macOS x64 runs. Trying to determine whether https://github.com/dotnet/runtime/pull/99117 might be causing it since that seems like the closest change since the last working build from a couple of days ago.

mangod9 avatar Mar 01 '24 21:03 mangod9

The 9.0.0-preview.3.24128.10 runtime build works, so we need to find which change broke macOS since then. None of the changes look particularly interesting other than a macOS pool change (which shouldn't affect ASP.NET) and a couple of JIT changes.

mangod9 avatar Mar 02 '24 00:03 mangod9

These are the runtime builds that have happened:

"9.0.0-preview.3.24151.1",   // failed
"9.0.0-preview.3.24129.9",
"9.0.0-preview.3.24129.8",
"9.0.0-preview.3.24129.5",   // failed
"9.0.0-preview.3.24129.3",   // failed
"9.0.0-preview.3.24129.2",   // only changed a STJ test
"9.0.0-preview.3.24129.1",   // worked
"9.0.0-preview.3.24128.10"   // worked

I'm going to use this PR to try to isolate exactly which build introduced the failure, so we can get the smallest commit range.

eerhardt avatar Mar 02 '24 02:03 eerhardt

We've isolated the change between builds:

  • 9.0.0-preview.3.24129.3
  • 9.0.0-preview.3.24129.2

Which means the change is in the range: https://github.com/dotnet/runtime/compare/207f2bb27c188809339eeea6c8405dfc29a35859...1afd4bca1c57685ca26e533d3bfaf1777d237b13

A very good possibility is https://github.com/dotnet/runtime/pull/98117, which looks like it changed the OSX machines we do an official build on. We could be hitting a change/break in the build tools. cc @steveisok

eerhardt avatar Mar 02 '24 04:03 eerhardt