Snapshot taken on CI has slight color differences compared to local

edulpn opened this issue 4 years ago • 15 comments

Hi everybody, I'm having an issue here with color discrepancies in an image view. I'm using the KIF framework for UI tests and SnapshotTesting to take screenshots between interactions (asserts made on the app's key window). Everything works fine, except that I get color differences between snapshots taken on my local machine and snapshots taken on the CI machine (both running the same stack, Mojave + Xcode 11.3).

Some examples below:

This one is taken by the CI machine (screenshot attached).

This one is taken by my local computer (screenshot attached).

This is the comparison of both images made by ImageMagick (screenshot attached).

And these are the color values of the same pixel in each snapshot, measured by the macOS Digital Color Meter (screenshots attached).

I know there are lots of variables at play here (including the use of KIF), but do you have any idea what might be causing this difference?

Thanks!

edulpn avatar Mar 17 '20 13:03 edulpn

Just to chime in that we're seeing the same issue. When we run our snapshot tests from fastlane, assets (PNGs etc.) seem to be rendered slightly differently compared to when we launch the same tests on the same machine, same simulator, from Xcode. Very strange.

iandundas avatar Apr 05 '20 08:04 iandundas

I'm not seeing any difference between fastlane and Xcode on my machine so far, but I do also experience tiny differences between my machine's and CI's snapshots, specifically on iOS 13. Having a non-zero corner radius in a native view is enough to produce a difference; you don't really need to involve assets.

edit: including screenshots now. What scares me is that this tiny difference is consistently reproduced on both.

(Screenshots attached: mine vs. CI.)

If there's no workaround for that other than lowering the precision, I wonder if it'd make sense to have a tolerance specifically for cases like this, where the R, G, and/or B channels differ by only ±1...

edit 2: on iOS 12, tests pass on both 2x and 3x screens; on iOS 13, both fail.

gobetti avatar Apr 23 '20 03:04 gobetti
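A minimal sketch of the per-channel tolerance idea above, assuming RGBA8 pixel data; the helper is hypothetical and not part of the library:

```swift
// Hypothetical helper: treat two RGBA8 pixels as matching when every
// channel differs by at most `channelTolerance`, absorbing the ±1
// rendering noise described above without lowering overall precision.
func pixelsNearlyEqual(_ a: [UInt8], _ b: [UInt8], channelTolerance: UInt8 = 1) -> Bool {
    guard a.count == b.count else { return false }
    return zip(a, b).allSatisfy { x, y in
        (x > y ? x - y : y - x) <= channelTolerance
    }
}
```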

We're seeing similar results between different machines: as soon as we add "real life" images we get small, imperceptible differences causing the tests to fail.

aegzorz avatar Apr 23 '20 20:04 aegzorz

@aegzorz could you please check if your differences are only on iOS 13? And how big is the difference? (i.e. what are the RGB values for the differing pixel in the two runs?)

gobetti avatar Apr 23 '20 20:04 gobetti

We're only testing on iOS 13 currently. Seems using shadows on views throws it off sometimes as well. Haven't done any RGB measurements yet.

aegzorz avatar Apr 23 '20 20:04 aegzorz

Sadly, lowering the precision, or treating pixels with an invisible difference as "equal", isn't a performant solution. Tests that don't pass the fast memcmp comparison fall back to a byte-by-byte comparison loop that is orders of magnitude slower, which is probably very undesirable.

gobetti avatar Apr 23 '20 21:04 gobetti
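For illustration, a simplified sketch of the two-tier comparison described above: a byte-exact memcmp fast path with a much slower per-byte fallback gated by precision. This is a simplification, not the library's actual implementation:

```swift
import Foundation

func buffersMatch(_ old: Data, _ new: Data, precision: Float) -> Bool {
    guard old.count == new.count else { return false }

    // Fast path: a single memcmp over the whole pixel buffer.
    let identical = old.withUnsafeBytes { oldBytes in
        new.withUnsafeBytes { newBytes in
            memcmp(oldBytes.baseAddress, newBytes.baseAddress, old.count) == 0
        }
    }
    if identical { return true }
    if precision >= 1 { return false }

    // Slow path: walk every byte, tolerating a bounded number of mismatches.
    // This loop is what makes near-miss snapshots orders of magnitude slower.
    var differingBytes = 0
    let allowedDifferingBytes = Int(Float(old.count) * (1 - precision))
    for index in 0..<old.count where old[index] != new[index] {
        differingBytes += 1
        if differingBytes > allowedDifferingBytes { return false }
    }
    return true
}
```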

We have this in our README:

⚠️ Warning: Snapshots must be compared using a simulator with the same OS, device gamut, and scale as the simulator that originally took the reference to avoid discrepancies between images.

Is it possible to ensure that CI and developer machines test in a consistent environment?

stephencelis avatar May 06 '20 19:05 stephencelis

Could be something there, but we were seeing it on the exact same simulator (model, OS, etc.) on the same machine, with the difference arising only between running the tests from Xcode vs. running them (locally) from fastlane.

iandundas avatar May 06 '20 20:05 iandundas

Huh. Perhaps someone with fastlane experience can explain? Maybe it's worth opening an issue with them to ask?

stephencelis avatar May 06 '20 20:05 stephencelis

Good idea, will do 👍🏻

iandundas avatar May 06 '20 20:05 iandundas

I was just about to, when I realised from the comments above that some people are saying it is not fastlane causing this issue for them (it happens simply by running the tests on a different machine). So I'm a bit hesitant now to open that fastlane issue 🤔.

iandundas avatar May 15 '20 11:05 iandundas

@iandundas it could be related to https://github.com/pointfreeco/swift-snapshot-testing/pull/288

gscalzo avatar May 16 '20 14:05 gscalzo

For future readers: I faced this issue with different colours on different machines. It happened when I tried to solve another problem with masked corner radii. To solve that, I had to add a host application to the test module and enable the drawHierarchyInKeyWindow parameter of the .image strategy. I removed them, and then I recorded the snapshot tests on an iPhone 8 (iOS 14.4) simulator. Then I forced fastlane to run the tests on that same simulator on CI, and everything works great with 100% precision!

To me, the problem was that, starting with Xcode 11, simulators use the GPU to render things rather than the CPU.

Nikoloutsos avatar Jun 28 '21 21:06 Nikoloutsos
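For context, a hedged sketch of the parameter described above, rendering through the host app's key window via the .image strategy. The view controller name is hypothetical, and a host application must be configured for the test target:

```swift
import SnapshotTesting
import XCTest

final class CardViewControllerTests: XCTestCase {
    func testCardCorners() {
        let vc = CardViewController() // hypothetical screen under test
        // Draws the view hierarchy through the key window instead of
        // rendering the layer directly, which changes how corner masks
        // (and, apparently, colors) rasterize.
        assertSnapshot(
            matching: vc,
            as: .image(drawHierarchyInKeyWindow: true)
        )
    }
}
```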

Hi guys,

I'm curious whether there are any solutions to this problem other than hosting the test bundle inside the app executable? Unfortunately, that doesn't work in my case.

I've been looking into this problem for some time and here are my thoughts:

My understanding might not be completely correct, but I think this problem might occur because some CI (mostly cloud) runners don't have any GPU at all, so the entire rendering has to happen on the CPU, versus the GPU that is used when we run tests via Xcode (>= v11.0). I also know that there is a difference in how rendering works for simulators running in headless mode vs. the ones we see on screen. So even if the simulated device (and the machine) is the same, there is still a high chance of running into this issue simply by launching the tests in different ways.

I've confirmed this on my local machine by recording snapshots via xcodebuild and then verifying them by running the tests from Xcode, and vice versa. In both cases there were insignificant differences between the reference and result images, which caused the tests to fail, given that precision was left at its default of 1.

I have a suspicion that reference images recorded via fastlane (i.e. the same xcodebuild) would have a better chance of matching the results received on CI. However, the developer experience of having to use xcodebuild to record reference images would be worse than just using Xcode, and I reckon there would still be quite a high chance of those tests being flaky.

Another observation I've made is that in most cases the difference was in how assets (icons, images, etc.) are rendered; the rest of the components were virtually identical. I'm curious whether anyone has explored how the use of different UIImage.RenderingMode and UIGraphicsRendererFormat values could affect the results of snapshot tests.

I would be keen to have a discussion on this matter here.

AlexApriamashvili avatar Jan 04 '22 01:01 AlexApriamashvili
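As one starting point for that discussion, a hedged sketch of pinning UIGraphicsImageRendererFormat so that scale and color gamut don't float with the host machine; purely illustrative, not part of the library:

```swift
import UIKit

// Renders a view into an image with an explicitly pinned format, so the
// output doesn't inherit the host screen's scale or extended color range.
func deterministicImage(of view: UIView) -> UIImage {
    let format = UIGraphicsImageRendererFormat()
    format.scale = 2                  // pin scale instead of inheriting the screen's
    format.preferredRange = .standard // force sRGB rather than extended/P3
    let renderer = UIGraphicsImageRenderer(bounds: view.bounds, format: format)
    return renderer.image { context in
        view.layer.render(in: context.cgContext)
    }
}
```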

It is still an issue for Xcode 13.3.1 (iOS 15.4 simulator). In my case the reference image is created on an M1 Mac, and a GitHub Actions run generated slightly different images.

bill-florio avatar May 11 '22 14:05 bill-florio

https://github.com/pointfreeco/swift-snapshot-testing/pull/628 should resolve this issue with a new perceptual difference calculation. The tiny differences in rendering are generally under a 2% DeltaE value, which is nearly imperceptible to the human eye. Using a perceptualPrecision value of >= 98% will prevent imperceptible differences from failing assertions while noticeable differences are still caught.

ejensen avatar Sep 12 '22 14:09 ejensen
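A minimal usage sketch, assuming the post-#628 API where the image strategies accept a perceptualPrecision parameter (the view controller name is hypothetical):

```swift
import SnapshotTesting
import XCTest

final class ProfileScreenTests: XCTestCase {
    func testProfileScreen() {
        let vc = ProfileViewController() // hypothetical screen under test
        // precision: 1 still requires every pixel to be accounted for, while
        // perceptualPrecision: 0.98 tolerates per-pixel color drift under
        // ~2% DeltaE, i.e. differences the eye can't distinguish.
        assertSnapshot(
            matching: vc,
            as: .image(precision: 1, perceptualPrecision: 0.98)
        )
    }
}
```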

We have this same issue, where one of our 6 CI machines fails 1-6 of our ~500 snapshots. We just changed everything to pass perceptualPrecision: 0.98 and we're still seeing the issue.

Here's the difference that shows up in Xcode (screenshot attached).

The wild thing is that all these machines are set up via script, and I don't understand how they could be different. The color profile angle sounded plausible, but we don't have any ~/Library/ColorSync directory. Looking at the color profiles via ImageMagick shows 8-bit sRGB for everything, so it seems like it's the same.

KingOfBrian avatar Sep 26 '22 19:09 KingOfBrian

I found some cases, particularly in virtualized environments, where the snapshot images are taken in a different color space than when the same test is run on a different machine with the exact same OS/simulator. I put together https://github.com/pointfreeco/swift-snapshot-testing/pull/665, which attempts to normalize both the reference and new snapshot images' color spaces before comparison.

ejensen avatar Oct 22 '22 01:10 ejensen
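For intuition, a hedged sketch of that normalization idea: redrawing an image into a fixed sRGB bitmap context before comparing bytes. This is illustrative and not the PR's actual code:

```swift
import CoreGraphics

// Redraws a CGImage into an 8-bit sRGB context so that two images captured
// in different color spaces become directly comparable byte-for-byte.
func normalizedToSRGB(_ image: CGImage) -> CGImage? {
    guard let colorSpace = CGColorSpace(name: CGColorSpace.sRGB),
          let context = CGContext(
              data: nil,
              width: image.width,
              height: image.height,
              bitsPerComponent: 8,
              bytesPerRow: 0,
              space: colorSpace,
              bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue
          )
    else { return nil }
    context.draw(image, in: CGRect(x: 0, y: 0, width: image.width, height: image.height))
    return context.makeImage()
}
```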