VSync does not work on Linux with NVidia drivers
Follow-up of https://discourse.panda3d.org/t/vr-with-panda3d-panda3d-openvr/26197/2
It seems that when using the proprietary NVIDIA driver on Linux, the sync-video flag in the PRC file has no effect on the actual frame synchronisation with the monitor's refresh rate. The application instead follows the "Sync to VBlank" setting of the NVIDIA control panel.
Two possible workarounds:
- Create an application profile in the NVIDIA control panel, but that is not end-user friendly.
- Set the __GL_SYNC_TO_VBLANK environment variable to 0 or 1 when launching the app, e.g. os.environ['__GL_SYNC_TO_VBLANK'] = "0" before instantiating ShowBase (see the sketch below).
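For reference, a minimal runnable sketch of the second workaround. It assumes the driver only reads __GL_SYNC_TO_VBLANK when the GL context is created, so setting it from Python before ShowBase opens its window is enough:

```python
import os

# The proprietary NVIDIA driver ignores the swap-interval hint derived from
# sync-video, but honours this environment variable: "0" forces vsync off,
# "1" forces it on. Set it before the GL context is created.
os.environ['__GL_SYNC_TO_VBLANK'] = '0'

from panda3d.core import loadPrcFileData
from direct.showbase.ShowBase import ShowBase

# Keep the PRC setting consistent so the rest of the engine sees the same intent.
loadPrcFileData('', 'sync-video false')

app = ShowBase()
app.run()
```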
This is unfortunate, but unless there is some other API, I don't know that it's possible to fix this. We set a hint and it's up to the driver to honour it or not.
Does VSync work in other OpenGL applications, such as SDL-based ones?
I have to force vsync off with Mesa OpenGL (Intel and AMD) using the vblank_mode environment variable. However, I never tried to hunt down why sync-video didn't work for me. It could very well be an environment or compositor issue overriding application preferences and forcing vsync.
Does nvidia-settings on Linux have a "let application decide" type setting for vsync?
> Does VSync work in other OpenGL applications, such as SDL-based ones?
I don't have many OpenGL apps that let me check the FPS, but it seems they simply live with whatever the driver does.
> Does nvidia-settings on Linux have a "let application decide" type setting for vsync?
It did in the past, but it seems to have changed at some point to either v-sync or 'full speed'. I never noticed or cared until now, to be honest.
Maybe we should be setting __GL_SYNC_TO_VBLANK in Panda? I'm a little unsure because SDL doesn't do it either (it evidently suffers from the same problem).
Is this problem mostly for X11? Do we know if it affects Wayland too (mostly for SDL, since Panda goes through XWayland)? I know eliminating screen tearing on X11 can be a real pain, so it does not surprise me that drivers have shifted to "all on" or "all off" with regard to vsync. Taking a look at Mesa/DRI, the vblank_mode env var appears to still allow application control: https://dri.freedesktop.org/wiki/ConfigurationOptions/. However, as I noted earlier, I have to set this to 0 regardless of what sync-video is set to.
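For illustration, here is a hedged sketch of what an application could do today to cover both driver families. It assumes Mesa honours vblank_mode and the proprietary NVIDIA driver honours __GL_SYNC_TO_VBLANK at context-creation time, and it uses setdefault so an explicit user or drirc setting still wins:

```python
import os

def request_vsync(enabled: bool) -> None:
    """Hint the common Linux OpenGL drivers about vsync.

    Must be called before the GL context (i.e. ShowBase) is created.
    setdefault() leaves any value the user already exported untouched.
    """
    # Mesa (Intel/AMD): 0 = never sync to vblank, 3 = always sync.
    os.environ.setdefault('vblank_mode', '3' if enabled else '0')
    # Proprietary NVIDIA driver: 1 = sync to vblank, 0 = don't.
    os.environ.setdefault('__GL_SYNC_TO_VBLANK', '1' if enabled else '0')

request_vsync(False)

from direct.showbase.ShowBase import ShowBase
ShowBase().run()
```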
Since we have no control over the drivers or over how they may change in the future, maybe it's better to close this as wontfix and document in the manual that an app needs to take extra steps if it wants to configure the v-sync mode itself?
I am not opposed to that approach, but I think I would like to better understand the problem. In particular, how vsync is handled by various drivers (specifically proprietary Nvidia, open source AMD, and Intel) across Wayland and X11. My general fallback for windowing is to follow SDL, so better understanding what they do would be helpful too.
This gives some more info about this behaviour of the NVIDIA driver: https://download.nvidia.com/XFree86/Linux-x86_64/352.79/README/openglenvvariables.html
Maybe deploy-ng should set this based on the value of sync-video in the embedded configuration, if it detects that the environment variable isn't already set by the user?
Does it actually work to set this environment variable in the process itself or does it need to be set before launching the process?
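If setting the variable from within the process does work (the workaround at the top suggests it does), the proposed check could look roughly like this at the Python level. The helper name is only illustrative; this merely expresses the proposed logic at the application level rather than inside deploy-stub:

```python
import os
from panda3d.core import ConfigVariableBool

def propagate_sync_video() -> None:
    """Mirror the sync-video PRC setting into the NVIDIA driver's environment
    variable, but only if the user has not already set it explicitly."""
    if '__GL_SYNC_TO_VBLANK' not in os.environ:
        sync = ConfigVariableBool('sync-video', True).get_value()
        os.environ['__GL_SYNC_TO_VBLANK'] = '1' if sync else '0'

# Run before ShowBase() is instantiated, i.e. before the GL context exists.
propagate_sync_video()
```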
(As an aside, that page also mentions a __GL_THREADED_OPTIMIZATIONS variable, which looks like it could offer interesting performance benefits to Panda3D, which does not rely on synchronous OpenGL calls and uses X11 in thread-safe mode. We could consider setting it in deploy-stub or x11GraphicsPipe as well.)
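For what it's worth, opting into that variable from Python would be a one-liner of the same shape as the vsync workaround; this is untested here and only a sketch of the idea:

```python
import os

# Let the NVIDIA driver use its threaded optimizations unless the user has
# already expressed a preference; must happen before the GL context is created.
os.environ.setdefault('__GL_THREADED_OPTIMIZATIONS', '1')
```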
Here's another thought: there are a number of additional swap-related GLX extensions that NVIDIA drivers do support, such as GL_NV_swap_group. I wonder if the driver does respect sync-video when we use those extensions?