How to snapshot a view on macOS with provided screen scale?
When snapshot testing views on macOS, the recorded image size depends on the main display of the machine on which the tests are run.
Example:
- I want to snapshot an NSView of size 640x480 points
- When running a test on a MacBook with a retina display, the snapshot image will have a size of 1280x960 pixels
- When running the same test on the same MacBook, but closed and connected to an external non-retina screen, the snapshot image will have a size of 640x480 pixels
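The factor in play here is the main screen's backingScaleFactor, which you can inspect directly (a minimal sketch, assuming the tests render on the main display):

import AppKit

// 2.0 on a retina display, 1.0 on a typical non-retina external monitor.
// The snapshot's pixel size is the view's point size multiplied by this factor.
let scale = NSScreen.main?.backingScaleFactor ?? 1
print("main display scale:", scale)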
When snapshot testing views on iOS, I can provide a display-scale trait collection to force the output image scale (as described in #427). Unfortunately, I haven't found a way to snapshot test macOS views so that the results do not depend on the display I am using (MacBook-embedded or external monitor).
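For reference, pinning the scale on iOS looks roughly like this (a sketch; MyViewController is a placeholder):

import SnapshotTesting
import UIKit
import XCTest

final class MyViewControllerTests: XCTestCase {
  func testMyViewController() {
    let vc = MyViewController()
    // Forces 2x rendering regardless of the actual screen scale.
    assertSnapshot(
      matching: vc,
      as: .image(traits: .init(displayScale: 2))
    )
  }
}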
I tried to work around this issue with the code below:
import AppKit
import SnapshotTesting

extension Snapshotting where Value == NSViewController, Format == NSImage {
  static func unscaledImage(precision: Float = 1) -> Snapshotting {
    Snapshotting<NSView, NSImage>
      .unscaledImage(precision: precision)
      .pullback { $0.view }
  }
}

extension Snapshotting where Value == NSView, Format == NSImage {
  static func unscaledImage(precision: Float = 1) -> Snapshotting {
    SimplySnapshotting<NSImage>
      .image(precision: precision)
      .pullback { $0.toImage().unscaled() }
  }
}

private extension NSView {
  // Renders the view into an NSImage via its cached display.
  func toImage() -> NSImage {
    let cacheRep = bitmapImageRepForCachingDisplay(in: bounds)!
    cacheDisplay(in: bounds, to: cacheRep)
    let image = NSImage(size: bounds.size)
    image.addRepresentation(cacheRep)
    return image
  }
}

private extension NSImage {
  // Re-draws the image into a bitmap whose pixel size equals its point size,
  // discarding the display's backing scale factor.
  func unscaled() -> NSImage {
    let image = NSImage(size: size)
    image.addRepresentation(unscaledBitmapImageRep())
    return image
  }

  func unscaledBitmapImageRep() -> NSBitmapImageRep {
    let imageRep = NSBitmapImageRep(
      bitmapDataPlanes: nil,
      pixelsWide: Int(size.width),
      pixelsHigh: Int(size.height),
      bitsPerSample: 8,
      samplesPerPixel: 4,
      hasAlpha: true,
      isPlanar: false,
      colorSpaceName: .deviceRGB,
      bytesPerRow: 0,
      bitsPerPixel: 0
    )!
    imageRep.size = size
    NSGraphicsContext.saveGraphicsState()
    NSGraphicsContext.current = NSGraphicsContext(bitmapImageRep: imageRep)
    draw(at: .zero, from: .zero, operation: .sourceOver, fraction: 1)
    NSGraphicsContext.restoreGraphicsState()
    return imageRep
  }
}
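For context, the strategy above is invoked from a test like this (MyView is a placeholder):

import AppKit
import SnapshotTesting
import XCTest

final class MyViewTests: XCTestCase {
  func testMyView() {
    let view = MyView(frame: NSRect(x: 0, y: 0, width: 640, height: 480))
    // Expected output: a 640x480-pixel image, regardless of display scale.
    assertSnapshot(matching: view, as: .unscaledImage())
  }
}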
And while it fixes the mismatch of the output image size (with the code above it always equals the point size of the view), it does not work as expected. It looks like scaling the image introduces minor distortion, which causes test failures (for example, text in the snapshot image is shifted slightly, by about 1 px).
Is there a way of forcing the scale of a snapshot image on macOS?
Hi @darrarski, did you get any further with this? I'm struggling with the same problem, but for me it's my laptop versus GitHub Actions.
Unfortunately, I haven't had time to work on this issue or research a solution.
Scaling the images down seems to be a dead end: anti-aliasing and font smoothing produce inconsistent results.
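For anyone experimenting further, here is a minimal sketch of one thing to try: disabling anti-aliasing and font smoothing on the bitmap context before drawing. This is untested here and may well not be enough, since text rendering can still differ between backing scales:

import AppKit

// Sketch only: draw an image into an unscaled bitmap with anti-aliasing
// and font smoothing turned off. Whether this removes the ~1 px shifts
// reported above is unverified.
func unsmoothedBitmapImageRep(of image: NSImage) -> NSBitmapImageRep? {
  guard let rep = NSBitmapImageRep(
    bitmapDataPlanes: nil,
    pixelsWide: Int(image.size.width),
    pixelsHigh: Int(image.size.height),
    bitsPerSample: 8,
    samplesPerPixel: 4,
    hasAlpha: true,
    isPlanar: false,
    colorSpaceName: .deviceRGB,
    bytesPerRow: 0,
    bitsPerPixel: 0
  ) else { return nil }
  rep.size = image.size
  NSGraphicsContext.saveGraphicsState()
  defer { NSGraphicsContext.restoreGraphicsState() }
  let context = NSGraphicsContext(bitmapImageRep: rep)
  NSGraphicsContext.current = context
  context?.cgContext.setShouldAntialias(false)
  context?.cgContext.setShouldSmoothFonts(false)
  image.draw(at: .zero, from: .zero, operation: .sourceOver, fraction: 1)
  return rep
}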
As a workaround, I unscale the MacBook retina display with https://github.com/th507/screen-resolution-switcher before taking the snapshots, and revert it afterwards. For my 16" MacBook, it looks like this:
scres -s 3072
xcodebuild ... test
scres -r 1536
This works for my GitHub Actions use case.
I wonder if anyone has had any luck making the output of macOS snapshots sharper?