Work around a JIT failure when running net472 MSBuild on .NET 8
Fixes #9869
Context
Running .NET Framework MSBuild in a .NET process is unsupported. Apparently, though, until #9446 it was working.
Changes Made
Worked around the regression by moving the call to the .NET Framework-only API into a separate non-inlinable method.
Testing
The test project provided by @MarkKharitonov.
Notes
NB: The scenario is still unsupported and there is no guarantee that it will keep working in the future.
@MarkKharitonov, please give this a try.
It works.
Could you please explain the essence of the fix for my education? I have looked at the PR change and saw the note about inlining, but I did not understand it.
Thank you very much.
The problematic code is behind an if statement, and although in your scenario it does not run, the call still fails to be resolved at JIT time. The fix moves the call into another method so that the JIT sees it only when it is actually called.
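The pattern looks roughly like this (a minimal sketch; the method names and the thrown stand-in are hypothetical illustrations, not the actual MSBuild change):

```csharp
using System;
using System.Runtime.CompilerServices;

class JitWorkaroundSketch
{
    public static void Main()
    {
        // On .NET (Core) this condition is false, so the Framework-only
        // call below is never reached at runtime.
        ConfigureAppDomain(useFrameworkApi: false);
        Console.WriteLine("configured");
    }

    static void ConfigureAppDomain(bool useFrameworkApi)
    {
        // If the Framework-only call sat directly inside this method, the
        // JIT would have to resolve it while compiling ConfigureAppDomain,
        // even though the branch never executes on .NET 8.
        if (useFrameworkApi)
        {
            CallFrameworkOnlyApi();
        }
    }

    // NoInlining keeps this a separate method, so the JIT only resolves
    // the Framework-only reference when the method is actually invoked.
    [MethodImpl(MethodImplOptions.NoInlining)]
    static void CallFrameworkOnlyApi()
    {
        // Stand-in for a .NET Framework-only API (the AppDomain-related
        // call in the real fix).
        throw new PlatformNotSupportedException();
    }
}
```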
Ah, of course. Similar to the MSBuildLocator pattern where we must locate MSBuild and invoke MSBuild in different functions, so that the JIT does not attempt to compile the MSBuild usage before we locate it. I get it.
Makes total sense now.
Hopefully your change is merged and deployed soon.
I would lean slightly towards keeping this working and adding a hard break post-17. But if you feel strongly about it, I have no issue with closing this.
Would you be able to submit a Pull Request for my test case repository at https://github.com/MarkKharitonov/test_load_msbuild_project that exemplifies the correct approach?
Guys, would it be possible to modify my test case at https://github.com/MarkKharitonov/test_load_msbuild_project to demonstrate how a single app can manipulate both SDK and legacy projects? Our enterprise application still uses .NET Framework and even though almost all the projects are SDK style, there are a few (typically using WCF) which are legacy.
We have important tooling processing all of the projects and this move from 17.9 to 17.10 breaks it. I do not mind changing our code to process it correctly, but a working example would make it much easier for us.
Now I realize everyone is busy, and therefore, as a compromise, maybe this PR should be merged after all? This would relieve the pressure, and you would be able to provide a correct working example at your leisure.
@MarkKharitonov apologies for the delay, busy times indeed. I'll leave it up to the team to decide if they want to take this fix. But even if it is taken, it will only appear in .NET 9 / VS 17.11.
As for the right approach, I believe the guidance is straightforward: When you want to manipulate legacy projects, you should use the MSBuild that comes with VS, not the one in .NET SDK. And because VS MSBuild is designed to run on .NET Framework, it must be hosted in a .NET Framework app.
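In a .NET Framework host, the registration pattern with the Microsoft.Build.Locator package looks roughly like this (a sketch; the instance-picking logic and LoadProject are illustrative, not prescribed usage):

```csharp
using System;
using System.Linq;
using Microsoft.Build.Locator;

class Program
{
    public static void Main(string[] args)
    {
        // In a net472 host, QueryVisualStudioInstances enumerates the
        // MSBuild copies installed with Visual Studio. Register one
        // before touching any Microsoft.Build type.
        var instance = MSBuildLocator.QueryVisualStudioInstances()
            .OrderByDescending(i => i.Version)
            .First();
        MSBuildLocator.RegisterInstance(instance);

        // MSBuild usage lives in a separate method so the JIT does not
        // resolve Microsoft.Build references before registration.
        LoadProject(args[0]);
    }

    static void LoadProject(string path)
    {
        var project = new Microsoft.Build.Evaluation.Project(path);
        Console.WriteLine($"{path}: {project.Items.Count} items");
    }
}
```

This works for both legacy and SDK-style projects because the VS copy of MSBuild carries the targets for both.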
Could you provide a more concrete example?
In my test case, I process 3 kinds of projects:
- Legacy (non-SDK-style) - works fine.
- SDK-style targeting .NET Framework - fails.
- SDK-style targeting .NET 8 - fails.
And the processor itself:
- targets .NET 8.
- Uses MSBuild libraries from the VS installation (by guiding the MSBuildLocator to the VS directory).
In reality I do not need to support all 3 kinds of projects - only legacy and SDK-style targeting the .NET Framework are required. I would really like to develop the processor utilizing the full power of modern .NET, i.e. target the latest .NET. It just does not make sense to develop unicorn processor tools for enterprise source code using the same outdated tech as the enterprise code itself. Modernizing the enterprise code takes time, a lot of time. This processor tool is part of the modernization effort to automate certain chores.
Given this context, what are my options? What is the correct approach to develop that processor?
I would really like to develop the processor utilizing the full power of the modern .NET, i.e. target the latest .NET.
This is not possible.
It just does not make sense to develop unicorn processor tools for enterprise source code using the same outdated tech as the enterprise code itself.
Sorry, this is what is required today.
It has been working so far. And the change in this PR has zero cost from all perspectives - maintenance, performance, code quality, etc.
This change alone is enough to let my code parse C# project files using the MSBuild API while enjoying all the latest .NET goodies. Maybe more sophisticated usages of the MSBuild API are forbidden in my circumstances, but I do not need them. Maybe you guys have further serious changes in store that will break it, but for now this AppDomain business is the only obstacle.
It would be very helpful if this change is merged.
If the tooling targets .NET Framework, then how can such a tool be packed as a dotnet tool?
I played with it a little bit and got the following error running dotnet pack:
C:\Program Files\dotnet\sdk\8.0.206\Sdks\Microsoft.NET.Sdk\targets\Microsoft.NET.PackTool.targets(117,5): error NETSDK1054: only supports .NET Core. [C:\xyz\CSTool\CSTool\CSTool.csproj]
This seems to be mentioned here - dotnet tools must target .NET Core. Am I missing anything? Does it mean that if we are forced to target the .NET Framework in the tooling, then we can no longer distribute it as a dotnet tool?
Does it mean that if we are forced to target the .NET Framework in the tooling, then we can no longer distribute it as a dotnet tool?
You could probably play some tricks, like having a dotnet tool that only invokes the .NET Framework app it carries as Content, if you really wanted to distribute it as a tool.
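For illustration, such a wrapper could be as small as this (a hypothetical sketch; CSTool.Net472.exe is an assumed name for the Framework app bundled next to the tool's assembly):

```csharp
using System.Diagnostics;
using System.IO;
using System.Reflection;

class ToolLauncher
{
    public static int Main(string[] args)
    {
        // Assumes the net472 app is packed alongside the tool's own
        // assembly (e.g. via a Content item in the tool project) and the
        // tool is not published as single-file, so Location is valid.
        string toolDir = Path.GetDirectoryName(
            Assembly.GetExecutingAssembly().Location)!;
        string exe = Path.Combine(toolDir, "CSTool.Net472.exe");

        var psi = new ProcessStartInfo(exe)
        {
            UseShellExecute = false,
        };
        // Forward the tool's arguments verbatim to the Framework app.
        foreach (string arg in args)
        {
            psi.ArgumentList.Add(arg);
        }

        using var process = Process.Start(psi)!;
        process.WaitForExit();
        return process.ExitCode;
    }
}
```

The dotnet tool itself stays net8.0 (so dotnet pack is happy), while all MSBuild API work happens in the spawned net472 process.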
PM take on this: we should not encourage in any way unsupported use of the MSBuild tool. This just increases the support matrix/load on the team and encourages users to use the tool in ways that may subtly break at any time. I am fully in support of @rainersigwald's desires to break more explicitly when this torn state is detected.
There is a ton of legacy .NET Framework code. By declaring this scenario unsupported, you doom all the tools that manipulate it and employ the MSBuild API to be written in .NET Framework as well. A pretty harsh constraint.
@MarkKharitonov it's one we have to live with, too. I wish we didn't! But this is what we have.