High DPI scaling on Mac OS and glfw
Version/Branch of Dear ImGui:
Version: 1.87, 1.88 WIP Branch: docking & viewport
Back-end/Renderer/Compiler/OS
Back-ends: imgui_impl_glfw.cpp + imgui_impl_opengl3.cpp Compiler: clang Operating System: Mac OS Monterey, Mac Mini M1
My Issue/Question:
Hello,
I know the questions about high DPI scaling have been asked a lot around here, however I do not think that I found a solution in the github issues.
My goal is to have an application that handles DPI scaling correctly, so I am following the instructions shown here. In the main loop, I check whether the content scale of the main window has changed and rebuild the fonts accordingly:
```cpp
[...] // In main.cpp of example_glfw_opengl3
float prev_scale = 0.f;
while (!glfwWindowShouldClose(window))
{
    [...] // Event processing
    float xscale, yscale;
    glfwGetWindowContentScale(window, &xscale, &yscale);
    if (xscale != prev_scale)
    {
        prev_scale = xscale;
        io.Fonts->Clear();
        io.Fonts->AddFontFromFileTTF("Roboto-Regular.ttf", xscale * 16.0f);
        io.Fonts->Build();
        ImGui_ImplOpenGL3_DestroyFontsTexture();
        ImGui_ImplOpenGL3_CreateFontsTexture();
        // ImGui::GetStyle().ScaleAllSizes(xscale);
    }

    // Start the Dear ImGui frame
    ImGui_ImplOpenGL3_NewFrame();
    ImGui_ImplGlfw_NewFrame();
    [...] // Rest of the code of main.cpp of example_glfw_opengl3
}
```
This solution works fine on Windows: my fonts are not blurry and are correctly scaled.
However, on macOS the GLFW window is already scaled by default, so if I also scale the fonts, I get this:

One quick solution would be not to scale the font at all, but then I obtain text that is at the correct scale, yet blurry:

My first guess was to look at the GLFW window hints, such as GLFW_SCALE_TO_MONITOR and GLFW_COCOA_RETINA_FRAMEBUFFER. The first hint does not seem to change anything, and the second does not help either, because it is GLFW_TRUE by default.
With GLFW_COCOA_RETINA_FRAMEBUFFER set to GLFW_TRUE, the framebuffer size of the main window
```cpp
int display_w, display_h;
glfwGetFramebufferSize(window, &display_w, &display_h);
```
returns the actual number of pixels the window takes up, not virtual pixels. This means that on my 4K screen, if I fullscreen the main window, display_w will be 3840 pixels.
Searching further, I found that #3757 exposes a similar problem to mine, but with SDL. The author of that issue provided a temporary fix, which I tried to implement in imgui_impl_glfw.cpp like this:
```cpp
void ImGui_ImplGlfw_NewFrame()
{
    ImGuiIO& io = ImGui::GetIO();
    [...]
    if (bd->WantUpdateMonitors)
        ImGui_ImplGlfw_UpdateMonitors();

    // Fix
#if defined(__APPLE__)
    // On Apple, the window size is reported in low DPI, even when running in high DPI mode
    ImGuiPlatformIO& platform_io = ImGui::GetPlatformIO();
    if (!platform_io.Monitors.empty() && platform_io.Monitors[0].DpiScale > 1.0f && display_h != h)
    {
        io.DisplayFramebufferScale = ImVec2(1.0f, 1.0f);
        io.DisplaySize = ImVec2((float)display_w, (float)display_h);
    }
#endif
    [...] // rest of the function
}
```
This, along with the scaled font, produces the result I want:

However, mouse interaction does not work, because ImGui thinks the mouse coordinates are in virtual pixels (to use GLFW terms), while the UI is rendered in true pixels. I looked at
```cpp
void ImGui_ImplGlfw_CursorPosCallback(GLFWwindow* window, double x, double y)
{
    ImGui_ImplGlfw_Data* bd = ImGui_ImplGlfw_GetBackendData();
    if (bd->PrevUserCallbackCursorPos != NULL && window == bd->Window)
        bd->PrevUserCallbackCursorPos(window, x, y);

    ImGuiIO& io = ImGui::GetIO();
    if (io.ConfigFlags & ImGuiConfigFlags_ViewportsEnable)
    {
        int window_x, window_y;
        glfwGetWindowPos(window, &window_x, &window_y);
        x += window_x;
        y += window_y;
    }
    io.AddMousePosEvent((float)x, (float)y);
    bd->LastValidMousePos = ImVec2((float)x, (float)y);
}
```
but I failed to come up with a valid fix, because GLFW provides positions in virtual pixels (for both the callback and glfwGetWindowPos). This means that on a 3840x2160 px screen with a scaling of 2, a mouse position of 400x200 will be reported by the callback as 200x100.
Multiplying the mouse position by the scale as in #3757 does not work either, because GLFW always provides virtual pixel coordinates.
Do you know how I could resolve this? I found a way to make the mouse coordinates match the window like this (with the viewport flags):
```cpp
int window_x, window_y;
glfwGetWindowPos(window, &window_x, &window_y);
x += window_x;
y += window_y;

float xscale, yscale;
glfwGetWindowContentScale(window, &xscale, &yscale);
x *= xscale;
y *= yscale;
x -= window_x;
y -= window_y;
```
but this is obviously wrong: even though I can interact correctly with the widgets inside the main window, I can also interact with widgets when the mouse is outside the window...
Thank you in advance
My solution is to change draw_data->FramebufferScale in the function SetupViewportDrawData. The original code is:

```cpp
draw_data->FramebufferScale = io.DisplayFramebufferScale;
```

and the modified code is:

```cpp
draw_data->FramebufferScale = ImVec2(floor(viewport->DpiScale), floor(viewport->DpiScale));
```

This is only a workaround; I tested it with SDL/GLFW on Mac and Windows.
Some related discussion: https://bitbucket.org/wolfpld/tracy/issues/42/cannot-scroll-down-capture-on-macos
#5301 also a relevant issue with a workaround.
I experienced this issue with GLFW and Wayland. I worked around it with this patch: https://gist.github.com/TheBrokenRail/9ed21b810a4f33a5b1bb062024573128
It:
- Sets `DisplaySize` to `glfwGetFramebufferSize` instead of `glfwGetWindowSize`
- Leaves `DisplayFramebufferScale` at the default of `[1.0f, 1.0f]`
- Converts mouse coordinates from window pixels to framebuffer pixels (see `ImGui_ImplGlfw_ScaleMousePos`)
- Updates the GLFW+OpenGL2 example to scale the font size by `glfwGetWindowContentScale`. In a proper fix, this should be checked every frame in case the DPI changes; my patch only does it on startup. It should also call `ImGui::GetStyle().ScaleAllSizes`.
This works on Wayland, where just like macOS, "screen coordinates can be scaled relative to pixel coordinates" (to quote the GLFW documentation). It also works on X11, where just like Windows, "screen coordinates and pixels always map 1:1" (also quoting GLFW).
> - Sets `DisplaySize` to `glfwGetFramebufferSize` instead of `glfwGetWindowSize`
> - Converts mouse coordinates from window pixels to framebuffer pixels (see `ImGui_ImplGlfw_ScaleMousePos`)
That sounds about right, and is more or less what I did in a (non-imgui) work project to get hidpi on macos.
I am hoping to tackle this in Q1 2025, but yes, that's basically it above. For "(3) Convert mouse coordinates" I am hoping to add general input transform support, perhaps in the backend, perhaps in core imgui. There are other valid reasons to do that transform (e.g. trying to keep all imgui coordinates positive).
The trickier aspect is to understand how coordinates and transforms would work in multi-viewport, multi-monitor mode. If you have access to a Mac setup where one monitor is hi-dpi and the other isn't, it would be the ideal test bed and I'd be happy if you could report on this. Not having investigated this is the primary blocker for me.
EDIT: The transform would also generally apply when interacting with the windowing system.
I thought about this a bit (with the background of having a working solution at least for the non-viewport or non-mixed-DPI case: https://github.com/ocornut/imgui/issues/6967#issuecomment-2833882081).
As this issue is a bit older and some of the problems mentioned here have already been fixed, here is a recap of the current state:
As @TheBrokenRail mentioned before, there are two approaches for display-scaling/High-DPI, implemented by different operating/windowing systems:
- Having a high-res framebuffer in physical pixels, but window sizes/coordinates in logical points that would correspond to pixels on a low-DPI screen. This is what macOS and Wayland do.
- Both framebuffer and window coordinates are in high-resolution physical pixels, but a scaling factor is provided so you know how much to scale your UI manually. This is what Windows and X11 do (at least with KDE?).
At least in current ImGui, the macOS / Wayland case just works - except that the fonts are blurry. ImGui does everything in logical points and scales that up for rendering by `io.DisplayFramebufferScale = framebuffer_size_in_pixels / window_size_in_points`.
The blurry fonts can be mostly prevented by setting `font_cfg.RasterizerDensity = framebuffer_size_in_pixels.y / window_size_in_points.y;` (or `.x`, or the max of both values; usually the ratio is the same in both directions anyway).
("Mostly" because if that ratio is not an integer, the actually loaded font size (`font_size * RasterizerDensity`) might not be an integer either. Either way, it is much better than not setting the `RasterizerDensity` at all.)
Windows / X11 look too small as they are not automatically scaled (because there `framebuffer_size_in_pixels == window_size_in_points`), but they can be scaled manually by increasing the font size accordingly and calling `style.ScaleAllSizes(scaleFactor);`.
This has the advantage that you have total control over the font size and can ensure it really is (loaded as) an integer size.
It has the disadvantage that `style.ScaleAllSizes(scaleFactor);` is kind of a PITA to work with, because:
- you need to reset the style before calling `ScaleAllSizes()` again
- editing such a scaled-up style is useless (as you edit the upscaled values, but you need to store/set the original unscaled values when loading a style, especially if it should also work with other scale factors)
I generally prefer the Windows / X11 model, for example because it doesn't break older games.
But thinking about it from the multi-viewport with mixed DPI point of view, the model of working with logical points that are scaled (according to the current display's DPI) when rendering seems much better - and it also circumvents the issues with ScaleAllSizes().
I think being able to use ImGui without having to think about the current display's DPI (and thus the scale factor you should apply manually) is the only sane way to support having one application with multiple windows that can be on different screens - especially when you don't explicitly create an ImGui window in a different platform window, but have ImGui automatically create additional platform windows when dragging ImGui windows out of an existing platform window (like the docking branch does).
I think for that to work ImGui would have to:
1. Change the way fonts are handled, because it means that when the user loads one font in one size, it would automatically get loaded in multiple sizes, with an appropriate `RasterizerDensity` applied automatically. ImGui would need to select the correct version depending on which platform window / `ImGuiViewport` it currently renders in (disclaimer: I haven't looked at the `dynamic_fonts` branch yet, maybe it already supports something like this?). By the way, ImGui automatically applying the DPI scale as rasterizer density would have the additional advantage that it could round the resulting size to the next integer instead of loading the font with non-integer sizes, as can currently happen (this could be done independently of all the other suggestions ;)).
2. Use the macOS/Wayland model of logical coordinates everywhere. Probably not too far from the general input transforms @ocornut mentioned, or it would at least build on that. So on Windows or X11-like platforms everything should be done in logical points that have a lower resolution, which means that some input values (like mouse coordinates) need to be scaled down accordingly in the input backends (this doesn't require any `#ifdef`s, just some calculating of factors based on `framebuffer_size_in_pixels`, `window_size_in_points` and `suggested_scale_factor_from_system`).
3. Remove `io.DisplayFramebufferScale`, use `viewport->DpiScale` instead. `io.DisplayFramebufferScale` would become a per-platform-window property, but `ImGuiViewport` already has `DpiScale`, and thanks to 2., `DisplayFramebufferScale` and `DpiScale` are now the same on all platforms (before, it was only the same on macOS/Wayland-like platforms), at least if `DpiScale` is the scale communicated by the system and not configurable. I haven't looked at it closely, just noticed it's there and it looks like the right thing. If it does something else, just move `DisplayFramebufferScale` to `ImGuiViewport`, I guess.
> - Having a high-res framebuffer in physical pixels, but window sizes/coordinates in logical points that would correspond to pixels on a low-DPI screen. This is what macOS and Wayland do.
> - Both framebuffer and window coordinates are in high-resolution physical pixels, but a scaling factor is provided so you know how much to scale your UI manually. This is what Windows and X11 do (at least with KDE?).
In fact, how the pixel size (1:1) vs logical size (1.0 DPI) is handled depends on the rendering backend. The distinction you have outlined above is true for the basic OS layer, but not necessarily for how the applications need to operate. In Tracy I have Windows (glfw) and Wayland (custom one) backends that render everything at 1:1, and this is the correct way for me. I need my 1 px lines to be 1 px. I need to be able to show images that are not garbled.
X11 is obsolete and doesn't have the notion of DPI scaling. The macOS implementation in ImGui, as I understand from my investigation some time ago, has some makeshift hacks in it that prevent it from working correctly. Getting 1:1 (-ish, as Apple only supports integer scales) rendering working in my custom SDL-based engine was relatively easy. There's a hi-dpi flag to set, and you have real and logical window sizes that you need to handle, and that's basically it.
macOS backend in Tracy (glfw) is broken and people have been complaining about things like missing scroll bars, but what can you do. The way I see fixing it is by removing the currently existing macOS specific code paths in ImGui to get the same behavior everywhere, and then you can look at what backends are doing, in order to get it right.
> At least in current ImGui, the macOS / Wayland case just works - except that the fonts are blurry.
I'd say it doesn't work then ;p
> Windows / X11 look too small as they are not automatically scaled
Windows applications need to explicitly state that they handle DPI scaling. If they don't, they render at the base DPI and the OS scales the content to the correct DPI (of course resulting in a blurry window). If the program states it handles DPI, but doesn't, well, the window contents being too small are programmer's error.
> But thinking about it from the multi-viewport with mixed DPI point of view, the model of working with logical points that are scaled (according to the current display's DPI) when rendering seems much better
This basically replicates the Wayland headache of "how do I display an image where image pixels are monitor pixels and not some blurred mess" and "oh no, how it's going to work with fractional scaling".
> I think being able to use ImGui without having to think about the current display's DPI (and thus the scale factor you should apply manually) is the only sane way to support having one application with multiple windows that can be on different screens
You don't need multiple windows to have multiple DPIs. Your window can already be on two (or more) monitors that have different DPI. In Windows this is handled in a quite horrible way, with a part of the window being always too big or too small on the other monitor, and the window changing its size when its active DPI changes (determined by which monitor displays the largest part of the window). KDE handles this much better, as the size of window on monitors stays the same, with just the contents becoming blurry or aliased on the other monitor.
> In fact, how the pixel size (1:1) vs logical size (1.0 DPI) is handled depends on the rendering backend. The distinction you have outlined above is true for the basic OS layer, but not necessarily for how the applications need to operate. In Tracy I have Windows (glfw) and Wayland (custom one) backends that render everything at 1:1
Sure, custom backends can do whatever, but the default behavior with SDL or GLFW is that on macOS and Wayland you get the window size and mouse coordinates in logical points, but a framebuffer to render into in physical pixels. Maybe you can somehow (with window flags) get a framebuffer where pixel == point, but then you render at a lower resolution and it gets scaled up, and that's ugly.
> > At least in current ImGui, the macOS / Wayland case just works - except that the fonts are blurry.
>
> I'd say it doesn't work then ;p
Well, compared to the state of the original post above, where "the mouse interaction does not work, because ImGui thinks that the mouse coordinates are in virtual pixels", it works well - the sizes are right and mouse input works as expected. The blurry fonts can be easily fixed, as I described.
> Windows applications need to explicitly state that they handle DPI scaling. If they don't, they render at the base DPI and the OS scales the content to the correct DPI (of course resulting in a blurry window). If the program states it handles DPI, but doesn't, well, the window contents being too small are programmer's error.
Could be, but the default blurry behavior is definitely not what you want (and not what you get with glfw or SDL3), so in the end, as I said (and as you described with "the window contents being too small are programmer's error"), you need to scale manually.
> > But thinking about it from the multi-viewport with mixed DPI point of view, the model of working with logical points that are scaled (according to the current display's DPI) when rendering seems much better
>
> This basically replicates the Wayland headache of "how do I display an image where image pixels are monitor pixels and not some blurred mess" and "oh no, how it's going to work with fractional scaling".
Yes, rendering images pixel-exact requires you to do the scaling manually (rendering with `pixel_size / dpi_scale`). Better than having to scale everything else manually?
> Your window can already be on two (or more) monitors that have different DPI. In Windows this is handled in a quite horrible way, with a part of the window being always too big or too small (...) KDE handles this much better, as the size of window on monitors stays the same, with just the contents becoming blurry or aliased on the other monitor
Well, both behaviors suck in a way, but if the user places a window across two screens with different DPI, that's their problem. If the application has multiple platform windows on different screens, however, that can be made to work well, so IMHO that's what ImGui should aim for.
Also:
> macOS backend in Tracy (glfw) is broken and people have been complaining about things like missing scroll bars, but what can you do. The way I see fixing it is by removing the currently existing macOS specific code paths in ImGui to get the same behavior everywhere, and then you can look at what backends are doing, in order to get it right.
In https://github.com/DanielGibson/texview/ I use glfw and OpenGL on all platforms (Linux and similar with X11 or Wayland, Windows, macOS) and it works (and looks pretty good) on all platforms, with and without desktop scaling (even fractional). I have no platform-specific code for rendering or ImGui.
I don't use the approach I suggested here (I hadn't thought of it when I implemented that, and I don't use the docking branch there, so it's not needed anyway), but I still have a unified codepath for everything. The important part is this function: https://github.com/DanielGibson/texview/blob/762de2b76200515b6ed785f492003c4b9728a41b/src/main.cpp#L1345
I guess what I'm doing there should also work just as well with the docking branch, as long as no displays with different DPIs are involved - but as far as I understand that case is unsolved anyway (and requires changes to ImGui itself)