swift-snapshot-testing
Precision checking algorithm is very slow
**Describe the bug**
When snapshot testing images with a precision lower than 1, if the snapshots don't exactly match the reference images, the tests take much longer to run.
We currently have 66 snapshot tests in a framework:
| Precision | Passing Tests | Failing Tests | Pixel-Imperfect Images | Total Time |
|---|---|---|---|---|
| 1 | 57 | 9 | 36 | 38s |
| 0.97 | 66 | 0 | 36 | 483s |
Each of those 9 failing tests records 4 images to account for different device sizes, so the 36 images that are not pixel-perfect are responsible for a 12.7x increase in the total duration of running 66 unit tests. The recorded images are not very large, ranging from 81KB to 255KB.
**To Reproduce**
Set the precision to any value lower than 1 and modify the tested view so its snapshots are no longer pixel-perfect matches of the reference images.
**Expected behavior**
Checking whether the snapshots are within the accepted precision should be faster. A 12.7x increase for just 36 images is a whole different order of magnitude.
**Environment**
- swift-snapshot-testing version 1.9.0
- Xcode 13.2.1
- Swift 5.5
- iOS 15.2
I believe the issue is just this for-loop, but I don't have a concrete suggestion for how to fix it. Maybe it could be parallelized with a concurrent queue?
https://github.com/pointfreeco/swift-snapshot-testing/blob/88f6e2c0afe04221fcfb1601a2ecaad83115a05f/Sources/SnapshotTesting/Snapshotting/UIImage.swift#L105-L108
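To illustrate the concurrent-queue idea, the per-byte comparison could be split into chunks with `DispatchQueue.concurrentPerform`. This is only a sketch under assumed names: `oldBytes`/`newBytes` stand in for the two images' raw pixel buffers, and the actual loop in `UIImage.swift` is shaped differently.

```swift
import Dispatch
import Foundation

/// Sketch: count mismatched bytes across several chunks in parallel,
/// then check the ratio of matching bytes against the requested precision.
/// `oldBytes`/`newBytes` are hypothetical stand-ins for the images' pixel buffers.
func matches(oldBytes: [UInt8], newBytes: [UInt8], precision: Float) -> Bool {
  precondition(oldBytes.count == newBytes.count, "buffers must be the same size")
  let byteCount = oldBytes.count
  guard byteCount > 0 else { return true }

  let chunkCount = 8  // e.g. roughly one chunk per core
  let chunkSize = (byteCount + chunkCount - 1) / chunkCount
  var mismatches = [Int](repeating: 0, count: chunkCount)

  mismatches.withUnsafeMutableBufferPointer { counts in
    DispatchQueue.concurrentPerform(iterations: chunkCount) { chunk in
      let start = chunk * chunkSize
      guard start < byteCount else { return }
      let end = min(start + chunkSize, byteCount)
      var count = 0
      for i in start..<end where oldBytes[i] != newBytes[i] {
        count += 1
      }
      counts[chunk] = count  // each chunk writes only its own slot: no data race
    }
  }

  let mismatched = mismatches.reduce(0, +)
  return Float(byteCount - mismatched) / Float(byteCount) >= precision
}
```

Because each iteration writes to its own slot of the result buffer, no locking is needed; the counts are summed once all iterations have finished.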
FYI: there's a promising open PR (#571) that aims to address three issues involving that loop, including its surprisingly long execution time when compiled without optimizations (the default for "Debug" builds).
Another option would be to move the image diff analysis from the CPU to the GPU. The package already includes functions for creating diff images (which are part of the failure output), so applying something like the CIAreaAverage Core Image filter to them would allow for very quick evaluation.
The problem solved by https://github.com/pointfreeco/swift-snapshot-testing/pull/571 could also be tackled by additionally running CIAreaMaximum to see whether any of the pixels is above a certain threshold.
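Conceptually, those two filters reduce a diff image to two numbers: CIAreaAverage yields the mean difference, and CIAreaMaximum the single largest one. A CPU sketch of the same aggregates over per-channel difference values (all names here are hypothetical, not part of the package):

```swift
/// Sketch of the aggregates that CIAreaAverage / CIAreaMaximum would
/// compute on a diff image, done here on the CPU over per-channel
/// differences in 0...255. Names are hypothetical.
struct DiffSummary {
  let average: Double  // mean difference across all values (CIAreaAverage-like)
  let maximum: UInt8   // largest single difference (CIAreaMaximum-like)
}

func summarize(oldBytes: [UInt8], newBytes: [UInt8]) -> DiffSummary {
  precondition(oldBytes.count == newBytes.count && !oldBytes.isEmpty)
  var total = 0
  var maximum: UInt8 = 0
  for (old, new) in zip(oldBytes, newBytes) {
    let diff = old > new ? old - new : new - old  // absolute difference, no overflow
    total += Int(diff)
    maximum = max(maximum, diff)
  }
  return DiffSummary(
    average: Double(total) / Double(oldBytes.count),
    maximum: maximum
  )
}
```

A test could then pass when `average` stays within the precision threshold, while a check on `maximum` covers the concern above: that no single pixel drifts beyond an acceptable limit.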
The use of Core Image filters in https://github.com/pointfreeco/swift-snapshot-testing/pull/628 improves the speed by 90-97%, going from ~1s to ~2ms per snapshot. This is accomplished using the CILabDeltaE filter, but could also be done with CIColorAbsoluteDifference to calculate the Euclidean color distance instead of a perceptual color difference.
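For reference, the Euclidean color distance mentioned above is just the straight-line distance between two colors treated as points in RGB space, whereas CILabDeltaE measures a perceptual difference in Lab space. A minimal sketch (hypothetical helper, not package API):

```swift
/// Euclidean distance between two RGB colors (components in 0...255).
/// This is an absolute color difference; a perceptual metric like
/// CILabDeltaE would instead compare the colors in Lab color space.
func euclideanDistance(
  _ a: (r: Double, g: Double, b: Double),
  _ c: (r: Double, g: Double, b: Double)
) -> Double {
  let dr = a.r - c.r
  let dg = a.g - c.g
  let db = a.b - c.b
  return (dr * dr + dg * dg + db * db).squareRoot()
}
```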
@ejensen I'm working on a few tests and the precision makes the tests super slow, e.g. a test takes 0.5s to complete, but with a precision of 0.98 it takes 6s on the latest version.
Edit: It's faster if I also add a perceptualPrecision of 0.98 along with precision.
The speedup added in #628 was only enabled when `perceptualPrecision < 1`. I have opened another PR (https://github.com/pointfreeco/swift-snapshot-testing/pull/664) that introduces a similar speedup when `perceptualPrecision == 1`.