Incorrect scale factor from Screen API on Retina Mac with non-default scaling
I'm using the Screen API to detect screen dimensions and the scale factor in my app. While testing on a Retina MacBook, I noticed that the reported scale factor is wrong whenever I switch to a non-default resolution.
Here's my code:
<!DOCTYPE html>
<html>
<head>
<script type="text/javascript">
var gui = global.window.nwDispatcher.requireNwGui();
gui.Screen.Init();
var screens = gui.Screen.screens;
for (var i = 0, len = screens.length; i < len; i++) {
  var s = screens[i];
  console.log("screen " + i + ": " + s.bounds.width + " x " + s.bounds.height + " @ " + s.scaleFactor);
}
</script>
</head>
<body></body>
</html>
The MacBook Pro 13-inch Retina offers four scaled resolutions: 1024 x 640, 1280 x 800 (default), 1440 x 900, and 1680 x 1050.
If I select the default resolution, then I get the correct results logged to the console:
screen 0: 1280 x 800 @ 2
But the other three resolutions all yield incorrect results:
screen 0: 1024 x 640 @ 2
screen 0: 1440 x 900 @ 2
screen 0: 1680 x 1050 @ 2
The reported resolutions are correct, but the scale factor is wrong. The panel's native resolution is 2560 x 1600, so the correct results should be:
screen 0: 1024 x 640 @ 2.5
screen 0: 1440 x 900 @ 1.778
screen 0: 1680 x 1050 @ 1.524
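A possible workaround for now: if the panel's native resolution is known ahead of time, the effective scale factor can be derived from the logical width that the API does report correctly. A minimal sketch against the same node-webkit Screen API, with the 2560 x 1600 native resolution hard-coded as an assumption (the API does not expose it):
var NATIVE_WIDTH = 2560; // assumed native panel width of the 13-inch Retina MacBook Pro
var gui = global.window.nwDispatcher.requireNwGui();
gui.Screen.Init();
gui.Screen.screens.forEach(function (s, i) {
  // effective scale = physical pixels / logical (scaled) pixels
  var effectiveScale = NATIVE_WIDTH / s.bounds.width;
  console.log("screen " + i + ": " + s.bounds.width + " x " + s.bounds.height +
    " @ " + effectiveScale.toFixed(3) + " (API reports " + s.scaleFactor + ")");
});
In the default 1280 x 800 mode this still prints 2.000, and in the 1440 x 900 mode it prints 1.778, matching the expected values above.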
What does window.devicePixelRatio give you?
Unfortunately, window.devicePixelRatio gives me the same result as screen.scaleFactor: no matter the resolution, it's always 2.0.
I can reproduce this issue on nwjs-sdk-v0.24.0.
Still not solved? :/ I ran into this bug with Electron as well, so it seems to come from the shared Chromium base rather than from either library's own code... does anybody have a workaround?
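The same derivation should work in Electron through its screen module, which (per the comments above) also keeps reporting a scaleFactor of 2 in the scaled modes. A rough main-process sketch, again with the 2560-pixel native width hard-coded as an assumption:
const { app } = require('electron');
const NATIVE_WIDTH = 2560; // assumed native panel width, not reported by the screen module

app.whenReady().then(() => {
  // the screen module can only be used once the app is ready
  const { screen } = require('electron');
  const display = screen.getPrimaryDisplay();
  // display.size is the logical (scaled) resolution
  const effectiveScale = NATIVE_WIDTH / display.size.width;
  console.log('display: ' + display.size.width + ' x ' + display.size.height +
    ' @ ' + effectiveScale.toFixed(3) +
    ' (scaleFactor reports ' + display.scaleFactor + ')');
});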
This is still a bug in 2025.