NuGetSdkResolver adds significant overhead to evaluation and should not be used
The NuGetSdkResolver is extremely slow at evaluation time.
Based on investigation in CoreFX, I found:
- On restored state, it added 180-400ms overhead to a single project evaluation
- On unrestored state, it added 1000-6000ms overhead to a single project evaluation
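For reference, per-project evaluation overhead like this can be measured directly with the Microsoft.Build API; a rough sketch (paths are placeholders) is:

```csharp
using System;
using System.Diagnostics;
using Microsoft.Build.Evaluation;

class EvalTimer
{
    static void Main()
    {
        var collection = new ProjectCollection();

        var sw = Stopwatch.StartNew();
        // Loading a project triggers a full evaluation, including SDK resolution.
        var project = collection.LoadProject(@"C:\src\App\App.csproj");
        sw.Stop();

        Console.WriteLine($"Evaluation took {sw.ElapsedMilliseconds} ms");
        // Note: the first evaluation in a process also pays JIT cost;
        // evaluate a second project to separate JIT from steady-state time.
    }
}
```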
I've looked at the original design, and it's built on an incorrect premise: that its performance impact will only affect the loading state of projects that use it. The image it shows, where projects are loaded in the background via Asynchronous Solution Load (ASL), depicts a feature that was removed from the product in 15.5.
If a project opts into this resolver, it has the following effects:
- On solution load, blocks interaction with VS until it has completed downloading & resolving the SDK.
- Increases the time it takes for the project-system to send design-time build results to Roslyn, resulting in delayed IntelliSense results
- Increases the time it takes for designers to open
- Increases the time it takes for tests to be discovered
- Increases the time it takes for VS to react to changing the active configuration
- Increases the time and blocks interaction with VS when making changes to a project, such as adding/removing a file
- Increases design-time build times for every project that references it
- Increases the time for all of the above when switching branches
When we designed SDK resolvers, it was explicitly called out that, because they run during evaluation, they must be extremely fast and must not hit the network. While this resolver only hits the network in the unrestored state, it still has a large negative impact on evaluation time even when the package is already downloaded.
This is the entire reason that NuGet restore does not run during the build while inside Visual Studio.
Rogue resolvers can cause Visual Studio and other IDEs to be blamed for performance and UI delay issues. Please remove this resolver, or change its design to play nicely with Visual Studio.
ACK on the negative performance impact. However, taking the feature out at this point is an impossibility.
Could we work together to figure out how to change the current design in a way to meet the perf requirements for VS?
@davkean Do you have the repro steps for how this measurement is taken?
It is very bizarre that the nuget sdk resolver would be any worse than the regular resolver in the restored state.
Putting this in 16.1 based on discussion just now. The resolver is actually now in the nuget repo, but first we should check that it is not something msbuild is doing in calling the resolver that is taking the time.
Looking at the code, https://github.com/NuGet/NuGet.Client/blob/dev/src/NuGet.Core/Microsoft.Build.NuGetSdkResolver/NuGetSdkResolver.cs
It really doesn't do much in the restored state.
It was doing work, loading files off disk and parsing them, if I remember correctly. I wish I could remember how I verified the overhead; I think I isolated it in a single project and measured evaluation time. It was the overwhelming cause of the CoreFX evaluation time.
I can repro this:
Evaluation time for dotnet new console: 178 ms
Evaluation time for dotnet new console + MSBuild.Sdk.Extras (unrestored): 3.8 seconds
Evaluation time for dotnet new console + MSBuild.Sdk.Extras (restored): 381 ms
I will profile this simple repro.

Some quick thoughts:
It is spending the most time parsing global.json, which is sad because my global.json is 74 bytes. We should look at switching to the new corefx json reader. There's a sources version for netfx IIRC. This file is trivial, we don't need a fancy deserializer.
That said, it will avoid doing this more than once per build across projects:
https://github.com/NuGet/NuGet.Client/blob/b6cd1677ae2bd3b07f4cc23c2e8d408f784e8b05/src/NuGet.Core/Microsoft.Build.NuGetSdkResolver/NuGetSdkResolver.cs#L89-L90
But I suspect VS evaluations aren't retaining this context object between projects. Fixing that, if possible, will help non-NuGet SDK evaluation considerably too.
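For concreteness, a sketch of what retaining that context between projects looks like with the MSBuild API: one shared `EvaluationContext` means SDK resolution results (NuGet-based or not) are cached across evaluations. Paths are placeholders.

```csharp
using Microsoft.Build.Definition;
using Microsoft.Build.Evaluation;
using Microsoft.Build.Evaluation.Context;

class SharedContextEvaluation
{
    static void Evaluate()
    {
        // One context shared across evaluations; its SharingPolicy controls
        // how much cached state (including SDK resolutions) is reused.
        var context = EvaluationContext.Create(EvaluationContext.SharingPolicy.Shared);
        var options = new ProjectOptions { EvaluationContext = context };

        // Both evaluations go through the same context, so a shared SDK is
        // resolved once instead of once per project.
        var a = Project.FromFile(@"C:\src\A\A.csproj", options);
        var b = Project.FromFile(@"C:\src\B\B.csproj", options);
    }
}
```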
Another major contributor is nuget loading its configuration files. I presume it needs to do this to answer the question of where a downloaded package is because values in the config can impact the answer to that. The parsing there can probably be improved as well.
For both of these we can possibly cache things in static state and check config timestamps. I did some trickery like that in the non-NuGet resolver.
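The static-state trickery described here is roughly the following pattern (a hypothetical helper, not code from either resolver): keep the parsed result keyed by path, and re-parse only when the file's timestamp changes.

```csharp
using System;
using System.Collections.Concurrent;
using System.IO;

static class TimestampedFileCache<T>
{
    // Static state survives across evaluations in the same process.
    private static readonly ConcurrentDictionary<string, (DateTime Stamp, T Value)> s_cache =
        new ConcurrentDictionary<string, (DateTime, T)>();

    public static T GetOrParse(string path, Func<string, T> parse)
    {
        DateTime stamp = File.GetLastWriteTimeUtc(path);

        if (s_cache.TryGetValue(path, out var entry) && entry.Stamp == stamp)
        {
            return entry.Value; // cache hit: file unchanged since last parse
        }

        T value = parse(path);
        s_cache[path] = (stamp, value);
        return value;
    }
}
```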
> It is spending the most time parsing global.json, which is sad because my global.json is 74 bytes. We should look at switching to the new corefx json reader. There's a sources version for netfx IIRC. This file is trivial, we don't need a fancy deserializer.
You don't have to switch if that's a big short term cost. Just use the JSONReader directly (since the global.json schema is fixed). The corefx reader should be lower allocation regardless though...
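A minimal sketch of that suggestion: since the global.json schema is fixed, pull the `msbuild-sdks` entries with System.Text.Json's low-level `Utf8JsonReader` instead of a full deserializer. Illustrative only, not the shipped `GlobalJsonReader`.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Text.Json;

static class GlobalJsonSdkReader
{
    public static Dictionary<string, string> ReadMSBuildSdks(string path)
    {
        var sdks = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
        var reader = new Utf8JsonReader(File.ReadAllBytes(path));

        while (reader.Read())
        {
            if (reader.TokenType == JsonTokenType.PropertyName &&
                reader.ValueTextEquals("msbuild-sdks"))
            {
                reader.Read(); // move to StartObject
                // The schema guarantees a flat object of "name": "version" pairs.
                while (reader.Read() && reader.TokenType == JsonTokenType.PropertyName)
                {
                    string name = reader.GetString();
                    reader.Read(); // move to the version string
                    sdks[name] = reader.GetString();
                }
                break;
            }
        }

        return sdks;
    }
}
```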
So it's mostly Jit time being measured there. 140ms of jitting newtonsoft.json.
😮 wow
Have to check if build at least reuses that jitting. I think it might if the version of nuget in the sdk matches the version used by nuget msbuild resolver. The location of the dll is different for full msbuild.
This time matches almost identically to what I found in https://github.com/dotnet/sdk/issues/1483.

Nope, strike that, JSON.NET was way less than above.
We could just NGEN the one that comes with VS.
@davkean Different API used? This is a huge hammer to read global.json: https://github.com/NuGet/NuGet.Client/blob/b6cd1677ae2bd3b07f4cc23c2e8d408f784e8b05/src/NuGet.Core/Microsoft.Build.NuGetSdkResolver/GlobalJsonReader.cs#L66
Yep, just to grab a couple of strings from that file, hell I betcha regex would beat this hands down.
Yeah, there's a ReadAllText + IndexOf before it to avoid the deserialization, and that's not even showing up. 😆
https://github.com/NuGet/NuGet.Client/blob/b6cd1677ae2bd3b07f4cc23c2e8d408f784e8b05/src/NuGet.Core/Microsoft.Build.NuGetSdkResolver/GlobalJsonReader.cs#L39-L46
I'll study some more tomorrow. This is interesting, but doesn't fully explain seeing 200ms - 400ms per project. I'm seeing 200ms, of which 140ms is JIT time that would not have happened N times when loading a full solution, right?
Mind you above you said: "I think I isolated in a single project and measured evaluation time". Possible your 200ms - 400ms included jitting?
When I was looking at traces for CoreFX, the resolver showed up as significant enough to make me isolate this case and measure it by itself. The best way to measure this would be to grab a real-world case, such as the CoreFX solution I shared.
Yep. Will do that.
The remaining 60ms is still quite bad. Just wondering if the isolation process bloated it to 200ms. I will dig.
Bear in mind that the cost in that solution load will be ballooned by the number of evaluations (we evaluate every single project and configuration because the solution is incomplete) but it should be accurate to the cost of the total evaluation.
The parsed SDK versions from global.json are cached so it should only be read once.
https://github.com/NuGet/NuGet.Client/blob/b6cd1677ae2bd3b07f4cc23c2e8d408f784e8b05/src/NuGet.Core/Microsoft.Build.NuGetSdkResolver/NuGetSdkResolver.cs#L81
MSBuild caches the resolved path for a particular SDK so that should only happen once. But for each unique SDK, the NuGet settings are read. That could be improved.
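One possible improvement there (a hypothetical caching wrapper, not NuGet's actual code): cache the loaded settings per project directory, so "for each unique SDK, the NuGet settings are read" becomes at most one read per directory.

```csharp
using System.Collections.Concurrent;
using NuGet.Configuration;

static class CachedNuGetSettings
{
    private static readonly ConcurrentDictionary<string, ISettings> s_byRoot =
        new ConcurrentDictionary<string, ISettings>();

    // Settings.LoadDefaultSettings walks the config hierarchy for a root
    // directory; caching by root avoids re-reading it for every unique SDK.
    public static ISettings Get(string projectDirectory) =>
        s_byRoot.GetOrAdd(projectDirectory, root => Settings.LoadDefaultSettings(root));
}
```

As noted above for global.json, any such static cache would also need invalidation when the config files change.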
@nguerrera can you double-check that if you have two or more projects referencing the same SDK-from-a-NuGet-package, the overhead is only incurred once? Also, we still need to figure out if VS is using the EvaluationContext that enables caching of SDK resolution results. If that hasn't been implemented yet, it would add a ton of overhead in VS.
I don't think VS is using the context (for non-build evaluation), which is probably the biggest issue here.
Adding you to a thread Dave started on that.
Confirming that NuGet settings reading is showing up. I mentioned that too with the trace.
Even happening once (or once per SDK), and even subtracting JIT, it shouldn't take so long to read a few tiny files. Probably not what needs to be fixed for the large sln case, though, which I will be studying next. The above were just thoughts as they came to me looking at a trace for the first time.
The question for evaluation retaining `context.State` is: what's the correct lifetime? It's not as simple as per-build, right? We have to react to global.json / project files changing at some point. For that reason I think we should attack both the cost on cache miss and on cache hit, and separately ensure a good yet correct hit rate.