
Enable AUTHORING UX approval tests creation

DavidKarlas opened this issue 4 years ago • 22 comments

Background

Today there is no good story for template authors to test their templates and ensure they still work as intended after they make changes or the environment around them changes (.NET Framework, Template Engine, ...).

We have https://github.com/dotnet/templating/tree/main/tools/ProjectTestRunner, but it's pretty hard to understand and navigate for a novice template author, and the tooling is not shipped as a dotnet global tool that can just be used. We run tests in our repo using Process.Start("dotnet new console") and then check the output (see example here). Again, not a very good way for a template author to run and maintain tests.

Outcomes

This enables the template development inner loop. We want to support approval tests, which means a template author would do the following:

  1. Run dotnet new console once to create the initial content.
  2. On a test run, the Template Engine provides the ability to run tests that compare the content from step 1 with what it would generate now; if the change is intentional, the author re-runs step 1 and commits the changed content to git (see the sketch below).
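
A minimal sketch of that inner loop, assuming xUnit plus the VerifyXunit package and its VerifyDirectory helper (the library discussed later in this thread); the template, paths, and snapshot directory are illustrative and the eventual authoring toolset API may look different:

    using System.Diagnostics;
    using System.IO;
    using System.Threading.Tasks;
    using VerifyXunit;
    using Xunit;

    [UsesVerify]
    public class ConsoleTemplateTests
    {
        [Fact]
        public async Task ConsoleTemplate_MatchesApprovedContent()
        {
            // Step 1 equivalent: instantiate the template into a scratch folder.
            string workingDir = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());
            Directory.CreateDirectory(workingDir);
            using var process = Process.Start(new ProcessStartInfo("dotnet", "new console")
            {
                WorkingDirectory = workingDir,
            })!;
            process.WaitForExit();

            // Step 2: compare the generated content with the committed snapshot.
            // A diff means either a regression or an intentional change; in the
            // latter case the author re-approves the snapshot and commits it.
            await Verifier.VerifyDirectory(workingDir).UseDirectory("Approvals");
        }
    }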

Justification

  • Customer impact - 1st party customers have easy tools to test their templates (popular request)
  • Engineering impact - automated testing contributes to fewer bugs, reduces the amount of manual testing, and means teams don't need to invent and maintain their own tooling for testing

Prerequisite

What needs to be solved: how to handle random values like port numbers or GUIDs...

Subtasks

Investigations:

  • [x] Get stats on usage of nondeterministic generators (Guid, Now, Port, Random) - @vlada-shubina
  • [x] Investigate ways of using the XUnit Verifier so that multiple verifications can be performed and reported (even if multiple are failing)
    • Verify.Net doesn't support verification of multiple files at the moment.
    • Simon is considering implementing it in the near future.
    • We stick to 1-by-1 file comparison at the moment.
  • [x] Go through CommonTemplatesTests to assess what functionality we'll need from the test framework in order to transform and adopt these tests.
    • stdout and stderr content comparison
    • content regex matching
    • content substring matching, absence of patterns
    • newlines normalization
    • custom content checking (xml parsing)
  • [x] Investigate options to programmatically change the dotnet SDK version used to run an SDK tool (as a fallback we can programmatically create and then discard a global.json). We will leverage global.json for this - simplified approach (a programmatic sketch follows the commands below):
    ren global.json global.json.bak
    dotnet new globaljson --sdk-version <version>
    ren global.json.bak global.json
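
A rough C# sketch of the global.json fallback mentioned above - programmatically pinning the SDK for a run and then restoring the previous state; the helper name and JSON shape are illustrative:

    using System;
    using System.IO;
    using System.Threading.Tasks;

    static class SdkPinning
    {
        // Programmatic equivalent of the ren / dotnet new globaljson / ren sequence above.
        public static async Task RunWithSdkAsync(string workingDir, string sdkVersion, Func<Task> action)
        {
            string globalJsonPath = Path.Combine(workingDir, "global.json");
            string backupPath = globalJsonPath + ".bak";
            bool hadGlobalJson = File.Exists(globalJsonPath);

            if (hadGlobalJson)
                File.Move(globalJsonPath, backupPath);

            // Pin the SDK version for anything executed in workingDir.
            await File.WriteAllTextAsync(
                globalJsonPath,
                $"{{ \"sdk\": {{ \"version\": \"{sdkVersion}\" }} }}");

            try
            {
                await action();
            }
            finally
            {
                // Discard the temporary global.json and restore the original, if any.
                File.Delete(globalJsonPath);
                if (hadGlobalJson)
                    File.Move(backupPath, globalJsonPath);
            }
        }
    }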

Subtasks for MVP (not to be exposed to customers):

  • [x] Generalize and repurpose Microsoft.TemplateEngine.TemplateLocalizer as a templates authoring toolset. Packaged as a NuGet package - @vlada-shubina
  • [x] Define configuration model for a single test case ({template to be tested; dotnet sdk version; parameter values; approvals location}). Create a System.CommandLine parser transforming CLI arguments into this configuration model
  • [x] Verification logic module (the API and actual logic don't have to be polished for the first version) - @JanKrivanek
  • [x] Add a programmatic way of simple scrubbing and/or replacing, keyed by files (see the sketch after this list).
  • [ ] Transform and onboard CommonTemplatesTests to the new framework
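
One possible shape for the per-file scrubbing item above: replacement functions keyed by file name or pattern, applied to the generated content before it is compared with the approved snapshot. The keys and regex patterns below are illustrative and the real verifier API may end up looking different:

    using System;
    using System.Collections.Generic;
    using System.Text.RegularExpressions;

    // Hypothetical registration: file name (or glob) -> scrubber applied before comparison.
    var scrubbers = new Dictionary<string, Func<string, string>>(StringComparer.OrdinalIgnoreCase)
    {
        // Normalize generated HTTP ports in launch settings.
        ["Properties/launchSettings.json"] = content =>
            Regex.Replace(content, @"localhost:\d{4,5}", "localhost:<port>"),

        // Normalize GUIDs wherever the guid generator was used.
        ["*.csproj"] = content =>
            Regex.Replace(content, @"\b[0-9a-fA-F]{8}(-[0-9a-fA-F]{4}){3}-[0-9a-fA-F]{12}\b", "<guid>"),
    };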

V2 (preparation for the customer-exposed toolset):

  • [x] Define Verification module API. CLI and MSBuild interfaces should call the logic through the API
  • [ ] Extract external process wrapping logic (Command) from Microsoft.DotNet.Cli.Utils or find another utility for wrapping CLI processes - and get rid of copied code within the Microsoft.TemplateEngine.Authoring.TemplateVerifier. (this task might be joined with https://github.com/dotnet/templating/issues/5296)
  • [ ] Support batch execution logic for multiple test cases (probably configured by files)
  • [x] Support filter/ignore lists (with default behavior that should suffice in most common cases) - e.g. to be able to ignore images, bin/* outputs etc.
  • [x] Support for not installed templates (arbitrary location on disk)
  • [ ] Support for switching sdk versions
  • [ ] Document the API and CLI in docs/wiki

Next iterations (ideally part of the first customer facing version):

  • [ ] Review (and adjust if needed) signing of tooling imposed by source build - is it required for shipping?
  • [ ] Add telemetry
  • [ ] Implement context detection and extraction for nondeterministic generators handling (so e.g. for Port generator, the logic should be able to detect the resulting value in the generated output and then process the output by replacing all instances of the generator being used).
  • [ ] Add Template Validator as another tool in the authoring toolset. Implement just a sample of the most important validations (more comprehensive list: https://github.com/dotnet/templating/issues/2623)
    • [ ] Create MSBuild Task version of the Template Validator
    • [ ] Design and use continuable errors during validation - so that as many errors as possible can be reported during a single run (while not reporting nonsense issues caused by inconsistent data detected in previous steps).
  • [ ] Investigate, Design and implement deterministic mode for Macros (and hence generators)

DavidKarlas avatar Sep 06 '21 12:09 DavidKarlas

@vlada-shubina @DavidKarlas a number of repos are starting to create .NET 7 templates (e.g., https://github.com/dotnet/winforms/pull/6206), and having a test infra for templates could be very helpful.

RussKie avatar Nov 22 '21 01:11 RussKie

This feature will be useful for testing: https://github.com/dotnet/templating/issues/3418 to avoid the need to use a custom settings location and install the template.

vlada-shubina avatar Aug 01 '22 09:08 vlada-shubina

Based on a brainstorming session with @vlada-shubina, these are the tasks we came up with:

Investigations:

  • [ ] Get stats on usage of nondeterministic generators (Guid, Now, Port, Random) - @vlada-shubina
  • [x] Investigate ways of using the XUnit Verifier so that multiple verifications can be performed and reported (even if multiple are failing)
    • Verify.Net doesn't support verification of multiple files at the moment.
    • Simon is considering implementing it in the near future.
    • We stick to 1-by-1 file comparison at the moment.
  • [x] Go through CommonTemplatesTests to assess what functionality we'll need from the test framework in order to transform and adopt these tests.
    • stdout and stderr content comparison
    • content regex matching
    • content substring matching, absence of patterns
    • newlines normalization
    • custom content checking (xml parsing)
  • [x] Investigate options to programmatically change the dotnet SDK version used to run an SDK tool (as a fallback we can programmatically create and then discard a global.json). We will leverage global.json for this - simplified approach:
    ren global.json global.json.bak
    dotnet new globaljson --sdk-version <version>
    ren global.json.bak global.json

Subtasks for MVP (not to be exposed to customers):

  • [ ] Generalize and repurpose Microsoft.TemplateEngine.TemplateLocalizer as a templates authoring toolset. Packaged as a NuGet package - @vlada-shubina
  • [ ] Define configuration model for a single test case ({template to be tested; dotnet sdk version; parameter values; approvals location}). Create a System.CommandLine parser transforming CLI arguments into this configuration model
  • [ ] Verification logic module (the API and actual logic don't have to be polished for the first version) - @JanKrivanek
  • [ ] Add a programmatic way of simple scrubbing and/or replacing, keyed by files.
  • [ ] Transform and onboard CommonTemplatesTests to the new framework

V2 (preparation for the customer-exposed toolset):

  • [ ] Define Verification module API. CLI and MSBuild interfaces should call the logic through the API
  • [ ] (In Progress) Implement context detection and extraction for nondeterministic generators handling (so e.g. for Port generator, the logic should be able to detect the resulting value in the generated output and then process the output by replacing all instances of the generator being used).
  • [ ] Support batch execution logic for multiple test cases (probably configured by files)
  • [ ] Support filter/ignore lists (with default behavior that should suffice in most common cases) - e.g. to be able to ignore images, bin/* outputs etc.

Next iterations (ideally part of the first customer facing version):

  • [ ] Add telemetry
  • [ ] Add Template Validator as another tool in the authoring toolset. Implement just a sample of the most important validations (more comprehensive list: https://github.com/dotnet/templating/issues/2623)
    • [ ] Create MSBuild Task version of the Template Validator
    • [ ] Design and use continuable errors during validation - so that as many errors as possible can be reported during a single run (while not reporting nonsense issues caused by inconsistent data detected in previous steps).
  • [ ] Investigate, Design and implement deterministic mode for Macros (and hence generators)

JanKrivanek avatar Aug 19 '22 13:08 JanKrivanek

Great plan!

  • Investigate ways of using the XUnit Verifier so that multiple verifications can be performed and reported (even if multiple are failing)

This point sounds strange... Perhaps I don't quite understand the intent here, could you elaborate on this please?

RussKie avatar Aug 23 '22 05:08 RussKie

We plan to use Verify.NET to build the framework. We need to investigate whether it can do verification of multiple files out of the box. So far we have been using it only for single file/object validation.

vlada-shubina avatar Aug 23 '22 07:08 vlada-shubina

We use Verify in Windows Forms repos quite a bit, and I don't think it's possible to verify multiple files simultaneously; such verifications need to be serialized. That is, how do you present multiple failures in a diff tool? This is how we verify multiple files in a single test:

    protected async Task VerifyAsync(
        IVisualStudioDocument mainFile,
        IVisualStudioDocument codeBehindFile,
        [CallerMemberName] string testMethodName = "")
    {
        var (TestMethodName, AdditionalPath) = GetTestContext(testMethodName);

        await VerifyAsync(mainFile.GetTextBuffer().CurrentSnapshot, $"{TestMethodName}_{MainFileSuffix}", AdditionalPath);
        await VerifyAsync(codeBehindFile.GetTextBuffer().CurrentSnapshot, $"{TestMethodName}_{CodeBehindFileSuffix}", AdditionalPath);
    }

    private static Task VerifyAsync(ITextSnapshot textSnapshot, string methodName, string additionalPath)
        => Verifier.Verify(textSnapshot.GetText())
            .UseDirectory($@"TestData\{additionalPath}")
            .UseFileName(methodName);

Pinging @SimonCropp to share his thoughts on this.

RussKie avatar Aug 23 '22 07:08 RussKie

@RussKie yeah that's not ideal. where do i find that code so i can have a go at making it better?

SimonCropp avatar Aug 23 '22 12:08 SimonCropp

Just to explain our use case better: we would like to verify the template output, which can be N files with an arbitrary folder structure. Example:

  • src
    • Project1
      • project 1 content goes here (multiple files)
    • Project2
      • project 2 content goes here (multiple files)
  • test
    • test files go here

The template output is pretty static (with some exceptions like guids, dates etc.), so approval tests seem to be a good match for testing them.

Ideally we do the following:

await VerifyFiles(pathToTemplateOutput); 

and this produces a snapshot identical to the folders/files in the pathToTemplateOutput folder. Ideally, in case of failure, all the failures will be shown in one window. Certain settings decorators should still be applicable. The initial plan was similar to what @RussKie mentioned above, but we would have many files to check and don't want to fail on the first incorrect file.

Imho, this use case (verifying all the files in a folder) is pretty generic and implementing it in Verify.NET would be of benefit.

vlada-shubina avatar Aug 23 '22 13:08 vlada-shubina

Thank you for the context, now I think I get it - executing a template will produce several files, and each template may have a different number of those files. So essentially we need to verify a folder's content. I don't think it'd be difficult to create a helper method that verifies the content of a folder, though it'd make our lives easier if that helper was provided out of the box. :)
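
A rough sketch of such a helper, built only on the single-target Verifier.Verify used above and a hypothetical VerifyDirectoryContentsAsync name; note that it still reports failures file by file rather than as one combined diff:

    using System.IO;
    using System.Threading.Tasks;
    using VerifyXunit;

    static class FolderVerification
    {
        // Verify every file under a directory one by one, using the file's relative
        // path (with separators flattened) as the snapshot name so the approved
        // files mirror the generated layout.
        public static async Task VerifyDirectoryContentsAsync(string root, string snapshotDirectory)
        {
            foreach (string file in Directory.EnumerateFiles(root, "*", SearchOption.AllDirectories))
            {
                string relative = Path.GetRelativePath(root, file);
                await Verifier.Verify(await File.ReadAllTextAsync(file))
                    .UseDirectory(snapshotDirectory)
                    .UseFileName(relative.Replace(Path.DirectorySeparatorChar, '_'));
            }
        }
    }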

In Windows Forms scenarios we generally compare one or two known files (more like entities that represent files, and those may not even exist on disk).

Ideally, in case of failure, all the failures will be shown in one window.

I can interpret this wish in a few different ways, though not sure if my interpretation aligns with yours.

  • Tooling: some diff/merge tools provide folder comparison functionality (e.g., BeyondCompare 4 IIRC) but a lot of tools don't.
  • Test result presentation: collect results from all failed file verifications and concatenate them together. It's possible to catch every "verification failed" exception (there's a specific exception type, I don't remember its name), and then provide a custom report to the user. Depending on how much data is presented to a developer, this may degrade the developer experience.

RussKie avatar Aug 24 '22 00:08 RussKie

Thank you @SimonCropp for jumping in. In essence, here we're talking about verifying the result of the dotnet new command. E.g., dotnet new winforms, dotnet new winforms -n MyApp, dotnet new winforms -f net5.0, etc. will produce folders with their own sets of files, and we'll need to verify the content of each file. (@vlada-shubina please correct me if I misinterpret it.) We will also need to be able to exclude some folders (e.g., obj).

D:\Development\throwaway\foo>dotnet new winforms -f net5.0
The template "Windows Forms App" was created successfully.

Processing post-creation actions...
Restoring D:\Development\throwaway\foo\foo.csproj:
  Determining projects to restore...
  Restored D:\Development\throwaway\foo\foo.csproj (in 68 ms).
Restore succeeded.


D:\Development\throwaway\foo>dir /b
foo.csproj
foo.csproj.user
Form1.cs
Form1.Designer.cs
obj
Program.cs

Do you think it's a worthy addition to your already awesome library?

The sample I mentioned earlier is in a closed-source repo, but I'm sure @vlada-shubina or @JanKrivanek may be able to explain their test procedures in more detail.

RussKie avatar Aug 24 '22 00:08 RussKie

I don't think it'd be difficult to create a helper method that verifies the content of a folder, though it'd make our lives easier if that helper was provided out of the box. :)

I can make this happen

SimonCropp avatar Aug 24 '22 02:08 SimonCropp

Thank you :)

P.S. Didn't mean to nerdsnipe you

RussKie avatar Aug 24 '22 02:08 RussKie

@RussKie no worries.

So given a method VerifyDirectory(dirPath), for a TheTestClass, and a TheTestMethod... give me some rules as to how we define the target directory where the snapshot files go.

SimonCropp avatar Aug 24 '22 02:08 SimonCropp

also, it would be helpful to have a target repository where u want this approach applied. so i can smoke test the new feature in a PR for u

SimonCropp avatar Aug 24 '22 02:08 SimonCropp

Added a couple of test drafts in https://github.com/dotnet/templating/pull/5163. Note that this syntax is not set in stone; it may be different.

vlada-shubina avatar Aug 26 '22 08:08 vlada-shubina

Usage stats of the non-deterministic macros attached (now.txt, port.txt, random.txt, guid.txt):

  • port - 78 templates
  • random - 11 templates
  • now - 51 templates
  • guid - 298 templates

vlada-shubina avatar Sep 06 '22 14:09 vlada-shubina

@vlada-shubina i should have a beta "verifydirectory" for you in a couple of days

SimonCropp avatar Sep 07 '22 01:09 SimonCropp

@vlada-shubina if you update to the current beta of Verify, you can try this https://github.com/VerifyTests/Verify#verifydirectory

SimonCropp avatar Sep 08 '22 09:09 SimonCropp

@vlada-shubina if you update to the current beta of Verify, you can try this https://github.com/VerifyTests/Verify#verifydirectory

Awesome work! Thanks. First very basic test passed.
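
For reference, a minimal sketch of what such a test can look like with the new helper, assuming VerifyXunit and a placeholder path for the folder produced by dotnet new:

    using System.IO;
    using System.Threading.Tasks;
    using VerifyXunit;
    using Xunit;

    [UsesVerify]
    public class WinFormsTemplateTests
    {
        [Fact]
        public async Task WinFormsTemplate_OutputMatchesSnapshot()
        {
            // Placeholder: the folder produced by `dotnet new winforms`; restore
            // artifacts such as obj/ should be excluded before (or while) verifying.
            string templateOutput = Path.Combine(Path.GetTempPath(), "winforms-template-output");
            await Verifier.VerifyDirectory(templateOutput);
        }
    }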

The only obstacle I'm stuck with is here: https://github.com/VerifyTests/Verify/blob/33f951c84a2e08d5aed9743158abb88795f3067f/src/Verify/Splitters/Target.cs#L83-L86 - the filenames can easily contain . in the name. Also, subfolders do not work, for the same reason.

update: if possible I would prefer different naming

- <testname folder>.received
  - content received goes here with unchanged names
- <testname folder>.verified
  - verified content goes here with unchanged names

Having verified and received mixed into the file name is a bit confusing (see the attached image).

vlada-shubina avatar Sep 08 '22 12:09 vlada-shubina

The filenames can easily contain .

i have deployed a new version with that constraint removed. It no longer makes sense anyway given the current file naming logic

if possible I would prefer different naming

hmm. let me think on that and get back to you

SimonCropp avatar Sep 08 '22 21:09 SimonCropp

@vlada-shubina how does this look https://github.com/VerifyTests/Verify/blob/main/docs/naming.md#usesplitmodeforuniquedirectory it is in the 18.0.0-beta.17 nuget

SimonCropp avatar Sep 11 '22 12:09 SimonCropp

@vlada-shubina how does this look https://github.com/VerifyTests/Verify/blob/main/docs/naming.md#usesplitmodeforuniquedirectory it is in the 18.0.0-beta.17 nuget

Thanks, it works great. The only improvement I can think of is to do a folder diff in the diff tool, if it's available. For example, DiffMerge has a folder mode, and it's much more convenient to use that instead of running multiple DiffMerge processes for individual files. CodeCompare supports it as well.

vlada-shubina avatar Sep 12 '22 15:09 vlada-shubina

Closing as the work in the context of 7.0 was done. Tracking the next-iteration ideas in a new item: https://github.com/dotnet/templating/issues/5705

JanKrivanek avatar Nov 30 '22 07:11 JanKrivanek