InitializeToolset flakiness in correctness builds
Runfo Tracking Issue: Correctness build InitializeToolset issues
| Definition | Build | Kind | Job Name |
|---|---|---|---|
| roslyn-CI | 1979966 | PR 63058 | Test_Windows_Desktop_Spanish_Release_64 |
| roslyn-CI | 1979918 | PR 63705 | Test_Windows_CoreClr_UsedAssemblies_Debug |
| roslyn-CI | 1979918 | PR 63705 | Test_Windows_CoreClr_UsedAssemblies_Debug |
| roslyn-CI | 1979918 | PR 63705 | Test_Windows_CoreClr_UsedAssemblies_Debug |
| roslyn-CI | 1979918 | PR 63705 | Test_Windows_CoreClr_UsedAssemblies_Debug |
| roslyn-CI | 1979918 | PR 63705 | Test_Windows_CoreClr_UsedAssemblies_Debug |
| roslyn-CI | 1979918 | PR 63705 | Test_Windows_CoreClr_UsedAssemblies_Debug |
| roslyn-CI | 1979918 | PR 63705 | Test_Windows_CoreClr_UsedAssemblies_Debug |
| roslyn-CI | 1979918 | PR 63705 | Test_Windows_Desktop_Debug_64 |
| roslyn-CI | 1979918 | PR 63705 | Test_Windows_Desktop_Debug_64 |
| roslyn-CI | 1979918 | PR 63705 | Test_Windows_Desktop_Debug_64 |
| roslyn-CI | 1974358 | Rolling | Test_Windows_Desktop_Debug_64 |
| roslyn-CI | 1974358 | Rolling | Test_Linux_Debug_Single_Machine |
| roslyn-CI | 1974180 | Rolling | Test_Linux_Debug_Single_Machine |
| roslyn-CI | 26178 | PR 64082 | Build_Windows_Debug |
| roslyn-CI | 25431 | Rolling | Correctness_Determinism |
| roslyn-CI | 24910 | Rolling | Correctness_Rebuild |
Build Result Summary
| Day Hit Count | Week Hit Count | Month Hit Count |
|---|---|---|
| 1 | 3 | 7 |
The actual error message I see here is the following:

```
##error The remote name could not be resolved: 'netcorenativeassets.blob.core.windows.net'
```
@MattGal is this a known issue?
There are at least three different network problems going on here:
- netcorenativeassets DNS resolution
- Timeouts talking to dotnetbuilds.azureedge.net (another storage account)
- Curl failures reaching out to https://dotnet.microsoft.com/download/dotnet/scripts/v1/dotnet-install.sh
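One common way to harden a build step against transient failures like these is to wrap the download in a retry loop. The sketch below is illustrative only, not the actual CI configuration; the `retry` helper, attempt count, and backoff are assumptions:

```shell
#!/usr/bin/env sh
# Hypothetical retry wrapper for transient network failures (DNS hiccups,
# timeouts, curl errors). Not the real pipeline code; attempt counts and
# backoff are illustrative.
retry() {
  # usage: retry <max_attempts> <command...>
  max=$1; shift
  attempt=1
  while ! "$@"; do
    if [ "$attempt" -ge "$max" ]; then
      echo "retry: giving up after $attempt attempts" >&2
      return 1
    fi
    attempt=$((attempt + 1))
    sleep 1  # brief backoff before the next attempt
  done
}

# Example (not executed here): retry the install-script download up to
# 5 times before failing the build step.
# retry 5 curl -fsSL -o dotnet-install.sh \
#   https://dotnet.microsoft.com/download/dotnet/scripts/v1/dotnet-install.sh
```

For the curl case specifically, curl's built-in `--retry` and `--retry-all-errors` flags provide similar hardening without a wrapper.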
@ilyas1974 and the rest of the FR crew have seen some network flakiness issues. FR standup is starting now, I'll stop by there and discuss.
@jaredpar @MattGal Looks like we have this very old (almost one year!) bug tracking what might have been temporary flakiness, but it looks like we're still seeing this every so often. Is there still something needing investigation here?
I don't think this issue is super useful to keep open. External dependencies are going to be occasionally flaky, and there are bogus results coming up in this query anyway (the 2nd one is just Helix test failures?). Since we use the publicly available installation scripts, any hardening would need to be handled by that team, not DncEng. Given the false positives and the infrequent hits, I don't think this is worth keeping open.
Alright, just wanted to confirm. Closing.