VRAM memory leak when using unredir-if-possible on certain software
Platform
Arch Linux
GPU, drivers, and screen setup
NVIDIA GTX 1650S & 1060, NVIDIA driver 525.60.11, Xorg running on the 1650S, three monitors configured side-by-side with xrandr
name of display: :0
display: :0 screen: 0
direct rendering: Yes
Memory info (GL_NVX_gpu_memory_info):
Dedicated video memory: 3072 MB
Total available memory: 3072 MB
Currently available dedicated video memory: 3003 MB
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: NVIDIA GeForce GTX 1060 3GB/PCIe/SSE2
OpenGL core profile version string: 4.6.0 NVIDIA 525.60.11
OpenGL core profile shading language version string: 4.60 NVIDIA
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL version string: 4.6.0 NVIDIA 525.60.11
OpenGL shading language version string: 4.60 NVIDIA
OpenGL context flags: (none)
OpenGL profile mask: (none)
OpenGL ES profile version string: OpenGL ES 3.2 NVIDIA 525.60.11
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.20
Environment
i3-gaps version 4.21.1
picom version
vgit-98a5c
Diagnostics
Extensions:
- Shape: Yes
- XRandR: Yes
- Present: Present
Misc:
- Use Overlay: No (Another compositor is already running)
- Config file used: /home/salt/.config/picom.conf
Drivers (inaccurate):
NVIDIA
Backend: glx
- Driver vendors:
- GLX: NVIDIA Corporation
- GL: NVIDIA Corporation
- GL renderer: NVIDIA GeForce GTX 1060 3GB/PCIe/SSE2
Backend: egl
- Driver vendors:
- EGL: NVIDIA
- GL: NVIDIA Corporation
- GL renderer: NVIDIA GeForce GTX 1060 3GB/PCIe/SSE2
Configuration:
Configuration file
shadow = true;
shadow-radius = 18;
shadow-offset-x = -7;
shadow-offset-y = -7;
shadow-exclude = [
"name = 'Notification'",
"class_g = 'Conky'",
"class_g ?= 'Notify-osd'",
"class_g = 'Cairo-clock'",
"_GTK_FRAME_EXTENTS@:c"
];
fading = false;
fade-in-step = 0.01;
fade-out-step = 0.01;
inactive-opacity = 1;
frame-opacity = 1;
inactive-opacity-override = false;
focus-exclude = [ "class_g = 'Cairo-clock'" ];
blur:
{
method = "box";
size = 3;
}
blur-background = false;
blur-background-exclude = [
"window_type = 'desktop'",
"name = 'win0'",
"_GTK_FRAME_EXTENTS@:c"
];
vsync = true;
mark-wmwin-focused = true;
mark-ovredir-focused = true;
detect-rounded-corners = true;
detect-client-opacity = true;
detect-transient = true;
detect-client-leader = true;
use-damage = true;
xrender-sync-fence = true;
#unredir-if-possible = true
log-level = "warn";
wintypes:
{
tooltip = { fade = false; shadow = false; opacity = 0.75; focus = true; full-shadow = false; };
dock = { shadow = false; full-shadow = false; };
dnd = { shadow = false; };
popup_menu = { opacity = 0.8; };
}
Steps of reproduction
- Set `unredir-if-possible = true` in picom.conf.
- Put certain programs in fullscreen. (Tested: Chromium, Discord (an Electron app), and Audacious do not cause this issue; Firefox, GIMP, OBS Studio, and LibreOffice Writer do.)
- Disable fullscreen for said program.
- Check nvidia-smi and observe xrestop.
- Repeat steps 2 and 3 a few times.
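A minimal sketch of the VRAM check in the last steps, assuming nvidia-smi is on PATH and GPU 0 is the display GPU (the helper names are mine):

```shell
#!/bin/sh
# Sample the display GPU's used VRAM before and after toggling
# fullscreen, then print the difference in MiB.

vram_used() {
    # csv,noheader,nounits makes nvidia-smi print a bare number.
    nvidia-smi --query-gpu=memory.used --format=csv,noheader,nounits -i 0
}

vram_delta() {
    # $1: MiB before, $2: MiB after
    echo $(( $2 - $1 ))
}

if command -v nvidia-smi >/dev/null 2>&1; then
    before=$(vram_used)
    echo "Toggle fullscreen a few times; sampling again in 15 s..."
    sleep 15
    after=$(vram_used)
    echo "VRAM delta: $(vram_delta "$before" "$after") MiB"
fi
```

A steadily growing delta across repeated toggles, while xrestop shows picom's pixmap count climbing, would point at the leak described here.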
Expected behavior
VRAM on display GPU remains consistent.
Current behavior
VRAM on the display GPU is continuously consumed; after a restart, VRAM use returns to normal.
Footage of the issue
https://files.catbox.moe/u6s20m.mp4
Can reproduce with GIMP. This affects only the xrender backend, and only when unredir-if-possible is enabled. Holding F11 in GIMP easily consumes about a gigabyte of VRAM in a few seconds.
Edit: it doesn't happen with --legacy-backends.
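For anyone else hitting this before the fix, a sketch of restarting picom with the legacy backend code as a workaround (`--backend xrender` and `--legacy-backends` are real picom flags; the config path is just an example):

```shell
#!/bin/sh
# Restart picom on the xrender backend using the pre-v9 legacy
# backends, which reportedly do not leak here.

restart_picom() {
    pkill -x picom 2>/dev/null
    picom --backend xrender --legacy-backends "$@" &
}

if command -v picom >/dev/null 2>&1; then
    restart_picom --config "$HOME/.config/picom.conf"
fi
```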
@bazettfraga, should be fixed starting from 5457d76de61f58ce03f5ebc70f4c0d0b6ec4a337. consider closing the issue if it doesn't happen to you anymore.
Not sure if I am experiencing the same issue, but as soon as I launch picom the VRAM usage is 900 MB. That isn't too bad considering I have 8 GB on my 3070, but is 900 MB normal, or does that seem a little high?
@Fxzzi, what backend are you using, and what's your screen resolution?
GLX backend, one monitor 2560x1440@170Hz, the other monitor 1920x1080@75Hz.
@Fxzzi, what's in your configuration file?
Hi! Sorry for the silence, I wasn't notified by Github last time for some reason. After building the latest version of picom I am glad to say that this bug no longer occurs at all. Thank you very much!
https://gitlab.com/fazzi/dotfiles/-/blob/main/.config/picom/picom.conf
Here is my picom config
@Fxzzi, interesting. I agree that 900 MB of VRAM is a little too much for picom. The only idea I currently have is that you have two monitors, one of them at 1440p, which means bigger windows, bigger shadow and blur textures, and higher VRAM usage. Let's see if it's actually picom: could you provide the output of nvidia-smi and xrestop while picom is running?
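The requested snapshots could be gathered along these lines; a sketch assuming xrestop supports batch output via `-b`/`-m` (the grep context and file names are arbitrary):

```shell
#!/bin/sh
# Capture one snapshot each from nvidia-smi and xrestop into a
# directory, so picom's pixmap usage can be compared against the
# GPU's total VRAM use across repeated runs.

snapshot() {
    # $1: output directory; echoes the directory path when done.
    mkdir -p "$1"
    if command -v nvidia-smi >/dev/null 2>&1; then
        nvidia-smi > "$1/nvidia-smi.txt"
    fi
    if command -v xrestop >/dev/null 2>&1; then
        # -b: batch (non-interactive) mode, -m 1: a single sample.
        xrestop -b -m 1 | grep -i -B 2 -A 8 picom > "$1/xrestop-picom.txt"
    fi
    echo "$1"
}

snapshot "picom-diag-$(date +%s)"
```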
@Fxzzi, yeah, picom indeed ate your VRAM, and it's GL resources. Is VRAM usage the same for both the glx and egl backends? And, just out of curiosity, what's the VRAM usage of the xrender backend?
Let's see what @yshui or @tryone144 have to say; I'm not the GL expert.
Yeah, testing on GLX and EGL yields similar VRAM numbers. Here's a screenshot of the VRAM usage on xrender: 6 MiB!
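A rough script for that backend comparison; a sketch assuming picom and nvidia-smi are on PATH, with an arbitrary 5-second settle time per backend:

```shell
#!/bin/sh
# Start picom with each backend in turn, let it settle, then record
# the display GPU's used VRAM so the backends can be compared.

fmt_usage() {
    # $1: backend name, $2: used MiB
    printf '%s: %s MiB used\n' "$1" "$2"
}

if command -v picom >/dev/null 2>&1 && command -v nvidia-smi >/dev/null 2>&1; then
    for backend in glx egl xrender; do
        pkill -x picom 2>/dev/null
        picom --backend "$backend" >/dev/null 2>&1 &
        sleep 5
        used=$(nvidia-smi --query-gpu=memory.used --format=csv,noheader,nounits -i 0)
        fmt_usage "$backend" "$used"
    done
    pkill -x picom 2>/dev/null
fi
```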