Retina mode does not report the native display resolution to a virtual machine
Describe the issue
I'm running my M3 Air (native res: 2560x1664) at a display resolution of 1710x1112.
When I install `spice-vdagent`, fullscreen the VM, and run `xrandr --output Virtual-1 --auto`, xrandr reports a resolution of 1710x1069.
If I check the "retina" checkbox and perform the same actions, I get a resolution of exactly 2x, or 3420x2138.
Neither of these is the native resolution of my M3 Air.
This impacts integer scaling: UTM appears to operate on the scaled 1710x1112 desktop resolution rather than the native one, which results in uneven pixelation at a guest resolution of 1280x768, a resolution that should be nearest-neighbor upscaled cleanly (2x) to the panel's 2560-pixel width.
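For a quick back-of-the-envelope check using the numbers above (illustrative arithmetic only, not UTM code): the 1280-wide guest divides evenly only into the panel's native 2560 width; against the 1710-point desktop or its 3420-pixel backing store the factor is fractional, so scaling done at that level cannot be pixel-perfect.

```swift
// Illustrative arithmetic only, using the resolutions mentioned above.
let guestWidth = 1280.0
let nativePanelWidth = 2560.0      // M3 Air native panel width
let scaledDesktopWidth = 1710.0    // "looks like" desktop width
let backingStoreWidth = 3420.0     // 2x backing store of that desktop

print(nativePanelWidth / guestWidth)    // 2.0       -> clean 2x nearest-neighbor
print(scaledDesktopWidth / guestWidth)  // 1.3359375 -> not an integer
print(backingStoreWidth / guestWidth)   // 2.671875  -> not an integer
```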
Configuration
- UTM Version: 4.5.3 (99)
- macOS Version: 14.6.1
- Mac Chip (Intel, M1, ...): M3 Air
I am still learning about macOS rendering, but my understanding is the following:
- `backingScaleFactor` returns the scale of the framebuffer, NOT the physical scale factor (see https://stackoverflow.com/a/54487166). This seems to always be 1.0 or 2.0 (see the sketch after this list).
- The way macOS rendering works is that in any Retina mode, the backing store is always 2.0x, and the OS then re-scales it to the correct physical dimensions.
- It seems like it is impossible to avoid macOS's scaler: https://stackoverflow.com/questions/53551692/how-to-render-each-pixel-of-a-bitmap-texture-to-each-native-physical-pixel-of-th#53563363. Even if you force the drawable to be the right resolution, it ends up getting re-scaled to 2x and then back down to ~1.5x.
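To illustrate the first two points, here is a minimal standalone sketch (not UTM code; it assumes the main screen is the built-in panel running the scaled "looks like 1710x1112" mode). Neither `backingScaleFactor` nor the current `CGDisplayMode` is expected to expose the panel's native 2560x1664 resolution:

```swift
import AppKit
import CoreGraphics

// Assumes the main screen is the built-in panel in a scaled HiDPI mode.
guard let screen = NSScreen.main else { fatalError("no main screen") }

// Screen size in points, e.g. 1710x1112 for the scaled mode above.
let points = screen.frame.size

// backingScaleFactor is the framebuffer scale (1.0 or 2.0),
// not the panel-relative physical scale (~1.5 in this case).
let scale = screen.backingScaleFactor
print("points: \(points.width)x\(points.height), backingScaleFactor: \(scale)")
print("backing store: \(points.width * scale)x\(points.height * scale)")

// The current CGDisplayMode should likewise report the 2x backing store
// (3420x2224 here), not the native 2560x1664 panel resolution.
if let mode = CGDisplayCopyDisplayMode(CGMainDisplayID()) {
    print("CGDisplayMode pixels: \(mode.pixelWidth)x\(mode.pixelHeight)")
}
```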
So unless someone has a better way, I don't think it is possible to fix this from UTM...