
New Database Driven Unit Testing Support Code

Open cabal95 opened this issue 6 years ago • 7 comments

Feature Request

Describe the Feature Request

Running unit tests that require a real Rock database is currently extremely difficult. It requires the developer to maintain a test database, keep it up to date, and, worse, requires that every test return the database to its original state. This last part can at times be impossible - for example, when testing a Person Merge you may simply not be able to return the database to its original state without performing a full backup and restore. This means that if you try to run such a test twice in a row it will fail the second time (and, worse, other unrelated tests may fail).

Describe Preferred Solution

A support framework will be (has been) implemented that automates much of this process. The developer simply needs to decide whether a new database will be set up for each and every test, or once for the class (group) of tests. If the tests are able to clean up after themselves, then a single database can be set up for the entire class. If they are not, then a fresh database can be set up for each test in the class.

This is all nearly completely automatic for the developer. They simply need to decide which method to use and implement two single-line methods that indicate which method they are using (per-test cleanup or per-class cleanup).

An example test-class that requires the database to be reset before each test would look like this:

    [TestClass]
    public class AttendanceCodeTests
    {
        #region Setup

        /// <summary>
        /// Runs before each test in this class is executed.
        /// </summary>
        [TestInitialize]
        public void TestInitialize()
        {
            DatabaseTests.ResetDatabase();
        }

        /// <summary>
        /// Runs after each test in this class is executed.
        /// </summary>
        [TestCleanup]
        public void TestCleanup()
        {
            DatabaseTests.DeleteDatabase();
        }

        #endregion

        #region Tests

        /* Individual tests go here */

        #endregion
    }

The alternative method, where you are able to perform your own cleanup after each test, would look like this:

    [TestClass]
    public class AttendanceCodeTests
    {
        #region Setup

        /// <summary>
        /// Runs before any tests in this class are executed.
        /// </summary>
        [ClassInitialize]
        public static void ClassInitialize( TestContext testContext )
        {
            DatabaseTests.ResetDatabase();
        }

        /// <summary>
        /// Runs after all tests in this class are executed.
        /// </summary>
        [ClassCleanup]
        public static void ClassCleanup()
        {
            DatabaseTests.DeleteDatabase();
        }

        /// <summary>
        /// Runs after each test in this class is executed.
        /// Deletes the test data added to the database by each test.
        /// </summary>
        [TestCleanup]
        public void Cleanup()
        {
            /* Code to restore the database back to its original state. */
        }

        #endregion

        #region Tests

        /* Individual tests go here */

        #endregion
    }

Database Data

The database data is stored in a ZIP file containing the MDF and LDF files from a "clean Rock install" database. The path to the file is indicated in the app.config and can either be a local path (e.g. C:\...) or, by default, a URL to a file on the web (e.g. http://...).
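A setting along these lines in the test project's app.config would select the source (the `RockUnitTestSource` key name and default URL match those discussed later in this thread; the local path shown is purely illustrative):

```xml
<configuration>
  <appSettings>
    <!-- Remote source (default): the ZIP archive is downloaded and cached. -->
    <add key="RockUnitTestSource" value="https://storage.rockrms.com/sampledata/integration-testing/database-develop.zip" />
    <!-- Local alternative (comment out the line above to use this):
    <add key="RockUnitTestSource" value="C:\RockTestData\database-develop.zip" />
    -->
  </appSettings>
</configuration>
```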

The default setting would point to a file intended for the develop branch, which would need to be updated any time a migration is committed. Technically it depends on the migration: one that simply fixes a typo in a default block's HTML content can probably be skipped, but one that modifies the structure of the database definitely requires an updated database to be uploaded to the central storage location.

A simple and quick way to generate this data would be to use the RockDevBooster application to deploy a new Rock instance from the specific migration commit. Once the instance is spun up, you would log in, use the Power Tools > Sample Data page to load the sample data, then shut down the instance and zip up the MDF and LDF files (stored in the ~/App_Data folder). We could probably automate this further if we need to.
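The final zip-up step could itself be scripted; a minimal sketch using System.IO.Compression (the folder path and the `Rock.mdf` / `Rock_log.ldf` file names are assumptions, not confirmed names from the change set):

```csharp
using System.IO;
using System.IO.Compression;

// Zip up just the detached MDF and LDF files from the instance's
// App_Data folder into a single template archive. Paths are illustrative.
var dataFolder = @"C:\RockInstances\develop\RockWeb\App_Data";
var archivePath = @"C:\RockTestData\database-develop.zip";

using ( var zip = ZipFile.Open( archivePath, ZipArchiveMode.Create ) )
{
    zip.CreateEntryFromFile( Path.Combine( dataFolder, "Rock.mdf" ), "Rock.mdf" );
    zip.CreateEntryFromFile( Path.Combine( dataFolder, "Rock_log.ldf" ), "Rock_log.ldf" );
}
```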

The code uses caching, so it will only download the ZIP archive from a remote server if it has actually changed. We shouldn't need to worry about it downloading a 40MB ZIP file every single time we click Run Tests.
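One way such caching can work (a sketch of the general technique, not necessarily how the committed code does it) is an HTTP conditional GET, where a previously saved ETag is sent back to the server and the download is skipped on a 304 Not Modified response:

```csharp
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

// Sketch: only download the archive when the server reports it changed.
// The cached ETag would be persisted alongside the cached ZIP file.
static async Task<bool> DownloadIfChangedAsync( HttpClient client, string url,
    string cachedEtag, string localZipPath )
{
    var request = new HttpRequestMessage( HttpMethod.Get, url );
    if ( !string.IsNullOrEmpty( cachedEtag ) )
    {
        request.Headers.IfNoneMatch.Add( new EntityTagHeaderValue( cachedEtag ) );
    }

    using ( var response = await client.SendAsync( request ) )
    {
        if ( response.StatusCode == HttpStatusCode.NotModified )
        {
            return false; // Cached copy is still current; nothing downloaded.
        }

        response.EnsureSuccessStatusCode();
        using ( var file = System.IO.File.Create( localZipPath ) )
        {
            await response.Content.CopyToAsync( file );
        }
        return true;
    }
}
```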

Needs

We need a central spot to store this "template database". The current database is just shy of 40MB. I'm not sure what options Spark has for storage, or whether we need to find a way to support these downloads as a community, but it would be ideal to have one file for each released Rock version, plus the develop branch.

This would allow us to quickly check out a specific Rock version and re-run the tests by just changing the URL in the app.config file.

Providing Code

I will provide code if feature request is approved: Yes

@nairdo The commit that implements all this can be seen here for review and discussion: https://github.com/cabal95/Rock/commit/ead90dc9732e00ca6d6557e35444148beb6ce337

Attachments: GroupTests.cs

cabal95 avatar Aug 29 '18 00:08 cabal95

Looks good @cabal95. I'm looking forward to running this later tonight. I'm pretty sure we've got the storage resource you described above, although I'd like to change the folder/file name to match our current naming convention https://storage.rockrms.com/sampledata/integration-testing/database-1.8.0.zip.

Is the idea that we would then replace the default URL currently in the app.config with the storage.rockrms.com one:

<add key="RockUnitTestSource" value="https://storage.rockrms.com/sampledata/integration-testing/database-develop.zip" />

nairdo avatar Aug 29 '18 13:08 nairdo

Yep! So by default all tests in that DLL would use https://storage.rockrms.com/sampledata/integration-testing/database-develop.zip as the source data (again cached so it doesn't need to download it every single time). Though I get a cert error when trying to hit that domain on SSL.

The samples show just calling ResetDatabase(), but individual test classes can also call ResetDatabase("http://...") or ResetDatabase("C:\...") to use different source data for those tests, if they are testing something specific and/or the cloud ZIP archive hasn't been updated yet.
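A per-class override would look something like this (the URL and path are illustrative, not real locations):

```csharp
[TestInitialize]
public void TestInitialize()
{
    // Default: uses the source configured in app.config.
    DatabaseTests.ResetDatabase();

    // Or override the source data for this class's tests:
    // DatabaseTests.ResetDatabase( "https://example.com/database-custom.zip" );
    // DatabaseTests.ResetDatabase( @"C:\RockTestData\database-custom.zip" );
}
```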

cabal95 avatar Aug 29 '18 15:08 cabal95

@nairdo It looks like this code has been implemented, is that indeed the case?

cabal95 avatar Feb 06 '20 21:02 cabal95

@cabal95 No, not quite -- but perhaps partially. We are about to refactor all the existing Rock.Tests from xUnit to MSTest, remove/move any db-context-dependent tests into Rock.Tests.Integration, and then rename it to Rock.Tests.UnitTest.

nairdo avatar Feb 07 '20 16:02 nairdo

Okay, sounds good. I'm playing around with a modification that allows the Integration test project to automatically build the "source zip file" rather than having to download it, as that solves a couple of issues - such as different versions of SQL Server being installed.

cabal95 avatar Feb 07 '20 16:02 cabal95

Okay, so here is a summary of the changes I made on this branch: https://github.com/cabal95/Rock/commit/a64c9d4213a82b8f88b50f4672130b73bb461689

  1. The RockUnitTestSource setting is no longer used and can be removed.
  2. On the first call to ResetDatabase(), it will create a new database and run all the migrations on it. That database is then detached, zipped up, and stored in the user's Temp directory.
  3. The normal RestoreDatabase(source) method is then called to restore the database from that ZIP file.
  4. Additional calls to ResetDatabase() will first check for that ZIP file and, if found, use it instead of creating a new archive.

So basically, ResetDatabase() will only generate a new ZIP archive if it doesn't exist, or if a new migration is added to Rock. This means we don't have to keep a bunch of test database archives stored in the cloud anywhere for the test runner to download.
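The steps above could be summarized in a sketch like this (the helper names other than ResetDatabase and RestoreDatabase are hypothetical; only the overall flow is taken from the change description):

```csharp
public static void ResetDatabase()
{
    // Hypothetical helpers; only ResetDatabase/RestoreDatabase are
    // named in the actual change set.
    string archivePath = GetTempArchivePath();  // e.g. a name in %TEMP% keyed to the latest migration

    // Reuse the cached archive unless it is missing (i.e. a new
    // migration changed the expected archive name).
    if ( !System.IO.File.Exists( archivePath ) )
    {
        CreateDatabaseAndRunMigrations();
        DetachAndZipDatabase( archivePath );
    }

    RestoreDatabase( archivePath );
}
```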

One thing to be aware of: if you add a migration, run the integration tests, and then change the migration, a new archive will not be created, and you would need to manually delete the old one from your AppData\Temp folder. This should be a freakishly rare occurrence, as normally I think we verify the migration is all good before we bother to run the integration tests.

Let me know if you want to talk about these changes or want me to turn it into a full PR or anything. I'm not sure what direction you guys want to go with the integration tests so this may not line up.

Timings

The very first run, where it has to generate the archive, takes some time as expected. On my computer it takes around 4 minutes.

Once that archive is created, the very first call to ResetDatabase() takes about 30 seconds (this seems to be because it has to load a bunch of additional DLLs into memory).

Subsequent calls to ResetDatabase() take about 6 seconds.

SSD Wear

These changes also reduced the archive footprint. The code calls DBCC SHRINKFILE to shrink the data and log files down to 105MB and 4MB respectively, for a savings of about 160MB. This means a faster database reset and less wear if you have an SSD. The compressed archive is now around 35MB.
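The shrink step itself is plain T-SQL; executed from C# it might look like this (the `Rock` and `Rock_log` logical file names, and the connection string variable, are assumptions for illustration):

```csharp
using System.Data.SqlClient;

// Shrink the data and log files (target sizes in MB) before
// detaching and zipping. Logical file names are assumed.
using ( var connection = new SqlConnection( connectionString ) )
{
    connection.Open();
    using ( var command = connection.CreateCommand() )
    {
        command.CommandText =
            "DBCC SHRINKFILE (Rock, 105); " +
            "DBCC SHRINKFILE (Rock_log, 4);";
        command.ExecuteNonQuery();
    }
}
```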

Which is fantastic, because I've already written 8.68TB to my SSD so it only has about 434TB left until it dies. Which, BTW, equates to 3.8 million database restores. :)

LocalDb

Note that this change "requires" LocalDB, so the concept of an app.ConnectionStrings.config goes out the window. Technically you could probably do it with a full SQL Server, but you would have to configure security properly so it has access to the generated MDF and LDF files. LocalDB is installed automatically with SQL Server, so that isn't an issue, and since the MDF and LDF files are stored in the bin/Debug/Data folder, there shouldn't be any need to reconfigure the connection string on different computers.
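For reference, a LocalDB connection string attaching an MDF from the build output generally looks like this (the connection name and `Rock.mdf` file name are assumptions; `(LocalDB)\MSSQLLocalDB` is the default instance name for SQL Server 2014 and later):

```xml
<connectionStrings>
  <add name="RockContext"
       connectionString="Data Source=(LocalDB)\MSSQLLocalDB;AttachDbFileName=|DataDirectory|\Data\Rock.mdf;Integrated Security=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```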

cabal95 avatar Feb 07 '20 19:02 cabal95

@cabal95 Beautiful. Your changes are also working in my environment and I'm integrating them into our refactored feature-dl-update-test-projects branch via e282b1cc50054c1a3f92441557f260fc3b6b60cc.

nairdo avatar Mar 28 '20 04:03 nairdo

Over the years this has been mostly implemented and then improved upon, so I am closing.

cabal95 avatar Sep 07 '23 15:09 cabal95