Sunshine
Can't use second GPU to encode the stream
Is there an existing issue for this?
- [X] I have searched the existing issues
Is your issue described in the documentation?
- [X] I have read the documentation
Is your issue present in the nightly release?
- [X] This issue is present in the nightly release
Describe the Bug
I installed a second GPU (a GTX 1050 Ti) in my computer that I want to use to encode the stream, so my primary GPU (an RTX 4080) stays fully available for gaming performance. I tried playing with the adapter_name option, but it didn't work: my primary GPU was used for encoding no matter what.
Expected Behavior
Being able to use the second GPU to encode the stream.
Additional Context
No response
Host Operating System
Linux
Operating System Version
Ubuntu 23.10
Architecture
64 bit
Sunshine commit or version
0.21.0
Package
Linux - AppImage
GPU Type
Nvidia
GPU Model
RTX 4080 & GTX 1050 Ti
GPU Driver/Mesa Version
545.29.06
Capture Method (Linux Only)
X11
Config
adapter_name = /dev/dri/renderD129
min_threads = 5
fps = [10,30,60,90,120,144]
sw_preset = fast
Apps
No response
Relevant log output
[2023:12:04:10:11:18]: Info: Sunshine version: v0.21.0
[2023:12:04:10:11:18]: Info: System tray created
[2023:12:04:10:11:18]: Error: Failed to create session: This hardware does not support NvFBC
[2023:12:04:10:11:18]: Error: Couldn't expose some/all drm planes for card: /dev/dri/card0
[2023:12:04:10:11:18]: Info: Detecting monitors
[2023:12:04:10:11:18]: Info: Detected monitor 0: HDMI-0, connected: false
[2023:12:04:10:11:18]: Info: Detected monitor 1: DP-0, connected: false
[2023:12:04:10:11:18]: Info: Detected monitor 2: DP-1, connected: false
[2023:12:04:10:11:18]: Info: Detected monitor 3: DP-2, connected: false
[2023:12:04:10:11:18]: Info: Detected monitor 4: DP-3, connected: false
[2023:12:04:10:11:18]: Info: Detected monitor 5: DP-4, connected: true
[2023:12:04:10:11:18]: Info: Detected monitor 6: DP-5, connected: false
[2023:12:04:10:11:18]: Info: Detected monitor 7: DVI-D-1-0, connected: false
[2023:12:04:10:11:18]: Info: Detected monitor 8: HDMI-1-0, connected: false
[2023:12:04:10:11:18]: Info: Detected monitor 9: DP-1-0, connected: false
[2023:12:04:10:11:18]: Info: Detected monitor 10: DP-1-1, connected: false
[2023:12:04:10:11:18]: Info: // Testing for available encoders, this may generate errors. You can safely ignore those errors. //
[2023:12:04:10:11:18]: Info: Trying encoder [nvenc]
[2023:12:04:10:11:18]: Info: Screencasting with X11
[2023:12:04:10:11:18]: Info: Screencasting with X11
[2023:12:04:10:11:18]: Info: SDR color coding [Rec. 601]
[2023:12:04:10:11:18]: Info: Color depth: 8-bit
[2023:12:04:10:11:18]: Info: Color range: [JPEG]
[2023:12:04:10:11:18]: Info: Screencasting with X11
[2023:12:04:10:11:18]: Info: SDR color coding [Rec. 601]
[2023:12:04:10:11:18]: Info: Color depth: 8-bit
[2023:12:04:10:11:18]: Info: Color range: [JPEG]
[2023:12:04:10:11:18]: Info: Screencasting with X11
[2023:12:04:10:11:18]: Info: SDR color coding [Rec. 601]
[2023:12:04:10:11:18]: Info: Color depth: 8-bit
[2023:12:04:10:11:18]: Info: Color range: [JPEG]
[2023:12:04:10:11:18]: Info: Screencasting with X11
[2023:12:04:10:11:18]: Info: SDR color coding [Rec. 709]
[2023:12:04:10:11:18]: Info: Color depth: 10-bit
[2023:12:04:10:11:18]: Info: Color range: [JPEG]
[2023:12:04:10:11:18]: Error: cuda::cuda_t doesn't support any format other than AV_PIX_FMT_NV12
[2023:12:04:10:11:18]: Info: Screencasting with X11
[2023:12:04:10:11:18]: Info: SDR color coding [Rec. 709]
[2023:12:04:10:11:18]: Info: Color depth: 10-bit
[2023:12:04:10:11:18]: Info: Color range: [JPEG]
[2023:12:04:10:11:18]: Error: cuda::cuda_t doesn't support any format other than AV_PIX_FMT_NV12
[2023:12:04:10:11:18]: Info:
[2023:12:04:10:11:18]: Info: // Ignore any errors mentioned above, they are not relevant. //
[2023:12:04:10:11:18]: Info:
[2023:12:04:10:11:18]: Info: Found H.264 encoder: h264_nvenc [nvenc]
[2023:12:04:10:11:18]: Info: Found HEVC encoder: hevc_nvenc [nvenc]
[2023:12:04:10:11:18]: Info: Found AV1 encoder: av1_nvenc [nvenc]
[2023:12:04:10:11:18]: Info: Adding avahi service Sunshine
[2023:12:04:10:11:18]: Info: Configuration UI available at [https://localhost:47990]
[2023:12:04:10:11:19]: Info: Avahi service Sunshine successfully established.
[2023:12:04:10:11:40]: Info: // Testing for available encoders, this may generate errors. You can safely ignore those errors. //
[2023:12:04:10:11:40]: Info: Trying encoder [nvenc]
[2023:12:04:10:11:40]: Info: Screencasting with X11
[2023:12:04:10:11:40]: Info: Screencasting with X11
[2023:12:04:10:11:40]: Info: SDR color coding [Rec. 601]
[2023:12:04:10:11:40]: Info: Color depth: 8-bit
[2023:12:04:10:11:40]: Info: Color range: [JPEG]
[2023:12:04:10:11:40]: Info: Screencasting with X11
[2023:12:04:10:11:40]: Info: SDR color coding [Rec. 601]
[2023:12:04:10:11:40]: Info: Color depth: 8-bit
[2023:12:04:10:11:40]: Info: Color range: [JPEG]
[2023:12:04:10:11:41]: Info: Screencasting with X11
[2023:12:04:10:11:41]: Info: SDR color coding [Rec. 601]
[2023:12:04:10:11:41]: Info: Color depth: 8-bit
[2023:12:04:10:11:41]: Info: Color range: [JPEG]
[2023:12:04:10:11:41]: Info: Screencasting with X11
[2023:12:04:10:11:41]: Info: SDR color coding [Rec. 709]
[2023:12:04:10:11:41]: Info: Color depth: 10-bit
[2023:12:04:10:11:41]: Info: Color range: [JPEG]
[2023:12:04:10:11:41]: Error: cuda::cuda_t doesn't support any format other than AV_PIX_FMT_NV12
[2023:12:04:10:11:41]: Info: Screencasting with X11
[2023:12:04:10:11:41]: Info: SDR color coding [Rec. 709]
[2023:12:04:10:11:41]: Info: Color depth: 10-bit
[2023:12:04:10:11:41]: Info: Color range: [JPEG]
[2023:12:04:10:11:41]: Error: cuda::cuda_t doesn't support any format other than AV_PIX_FMT_NV12
[2023:12:04:10:11:41]: Info:
[2023:12:04:10:11:41]: Info: // Ignore any errors mentioned above, they are not relevant. //
[2023:12:04:10:11:41]: Info:
[2023:12:04:10:11:41]: Info: Found H.264 encoder: h264_nvenc [nvenc]
[2023:12:04:10:11:41]: Info: Found HEVC encoder: hevc_nvenc [nvenc]
[2023:12:04:10:11:41]: Info: Found AV1 encoder: av1_nvenc [nvenc]
[2023:12:04:10:11:41]: Info: Executing [Desktop]
[2023:12:04:10:11:41]: Info: 192.168.0.16: Ping Timeout
[2023:12:04:10:11:41]: Info: CLIENT CONNECTED
[2023:12:04:10:11:41]: Info: Detecting monitors
[2023:12:04:10:11:41]: Info: Detected monitor 0: HDMI-0, connected: false
[2023:12:04:10:11:41]: Info: Detected monitor 1: DP-0, connected: false
[2023:12:04:10:11:41]: Info: Detected monitor 2: DP-1, connected: false
[2023:12:04:10:11:41]: Info: Detected monitor 3: DP-2, connected: false
[2023:12:04:10:11:41]: Info: Detected monitor 4: DP-3, connected: false
[2023:12:04:10:11:41]: Info: Detected monitor 5: DP-4, connected: true
[2023:12:04:10:11:41]: Info: Detected monitor 6: DP-5, connected: false
[2023:12:04:10:11:41]: Info: Detected monitor 7: DVI-D-1-0, connected: false
[2023:12:04:10:11:41]: Info: Detected monitor 8: HDMI-1-0, connected: false
[2023:12:04:10:11:41]: Info: Detected monitor 9: DP-1-0, connected: false
[2023:12:04:10:11:41]: Info: Detected monitor 10: DP-1-1, connected: false
[2023:12:04:10:11:41]: Info: Screencasting with X11
[2023:12:04:10:11:41]: Info: Configuring selected monitor (0) to stream
[2023:12:04:10:11:41]: Warning: Couldn't get requested display info, defaulting to recording entire virtual desktop
[2023:12:04:10:11:41]: Info: SDR color coding [Rec. 601]
[2023:12:04:10:11:41]: Info: Color depth: 8-bit
[2023:12:04:10:11:41]: Info: Color range: [MPEG]
[2023:12:04:10:11:41]: Info: Setting default sink to: [sink-sunshine-stereo]
[2023:12:04:10:11:41]: Info: Found default monitor by name: sink-sunshine-stereo.monitor
[2023:12:04:10:15:04]: Info: Quitting from system tray
[2023:12:04:10:15:04]: Info: Interrupt handler called
[2023:12:04:10:15:04]: Info: Setting default sink to: [alsa_output.pci-0000_01_00.1.hdmi-stereo]
Using the env variables __NV_PRIME_RENDER_OFFLOAD=1 and CUDA_VISIBLE_DEVICES=1, it seems Sunshine runs on the second GPU. I'm getting another error now: when I start streaming I just get a black screen on the client for a few seconds, then it stops with the error "No video received from host". If I don't use the env variables, Sunshine starts on my primary GPU and everything works fine.
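For reference, this is roughly how I launched it (the AppImage filename is illustrative, not the exact path on my machine), plus a quick sanity check that per-command environment variables actually reach the child process:

```shell
# How I launched it (AppImage filename is illustrative):
#   __NV_PRIME_RENDER_OFFLOAD=1 CUDA_VISIBLE_DEVICES=1 ./sunshine.AppImage
#
# Sanity check that the per-command variables are inherited by a child process:
__NV_PRIME_RENDER_OFFLOAD=1 CUDA_VISIBLE_DEVICES=1 \
  sh -c 'echo "offload=$__NV_PRIME_RENDER_OFFLOAD cuda=$CUDA_VISIBLE_DEVICES"'
# prints: offload=1 cuda=1
```

CUDA_VISIBLE_DEVICES=1 hides the first GPU (index 0) from CUDA entirely, which is why Sunshine ends up on the 1050 Ti.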
Does the GPU I want to use for encoding have to be the one the display is attached to?
This is pointless. The encoder hardware is separate from the rest of the card's functions. Plus you'd lose performance pushing frames across the PCIe bus.
Not only do you not want to do this for performance reasons (it will actually degrade performance), the encoder wasn't reducing the available GPU performance in the first place.
Also, the reason it kind of works is that by enabling PRIME you're semi-setting up a situation where the 4080 renders to the framebuffer on the 1050... which would also hurt performance.
You really don't want to do this.
The only time you want this is when environmental factors force you to.
That's true in Windows-based multi-session environments (where DXGI mirroring isn't available), and even then it's not a question of gaining performance, but of gaining access to additional hardware encoding contexts (these are limited and can become a fought-over resource in an environment that makes heavy use of them).
I actually maintain a specialized fork of Sunshine for exactly that purpose, but it is just that: a very special use case that pretty much only applies to this one scenario.
@Black-Seraph Oh sure, that makes sense, and you can't avoid the performance hit. But I don't think that's what the person creating this ticket was going for, as they appear to be on Linux.
It seems this issue hasn't had any activity in the past 90 days. If it's still something you'd like addressed, please let us know by leaving a comment. Otherwise, to help keep our backlog tidy, we'll be closing this issue in 10 days. Thanks!
This issue was closed because it has been stalled for 10 days with no activity.
I actually tested this out of curiosity. Setup: RTX 4090 as the rendering card, Titan X as the video encode and display card.
The performance uplift from using the Titan X as the encoding card was marginal, as you can see in the attached images: around ~3.5 fps on an average above 90 fps. But this is not realistic.
Firstly, the 4090 was artificially limited to only 33% of its power limit, so the rendering parts of the chip had to fight with NVENC for power budget. Secondly, that was 90 fps at 1440p transferred via PCIe 3.0 x8 from one GPU to the other, as mentioned by @insanemal. In a high-FPS, high-resolution scenario that would become a bottleneck. Maybe not with both GPUs running at PCIe Gen 4 x16 on some Threadripper platform, but let's be real, that is not your average gaming rig.
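A quick back-of-envelope (my own illustrative numbers, assuming uncompressed 4-bytes-per-pixel BGRA capture frames; the actual transfer format may differ) shows when the bus starts to matter:

```python
# Rough estimate of raw capture bandwidth crossing the bus each second,
# assuming 4 bytes/pixel (BGRA, common for X11 capture before conversion).
def capture_gbps(width, height, fps, bytes_per_pixel=4):
    return width * height * bytes_per_pixel * fps / 1e9  # GB/s

PCIE3_X8 = 7.88  # approximate usable GB/s for PCIe 3.0 x8

for label, (w, h, fps) in {
    "1440p @ 90 fps": (2560, 1440, 90),    # my test case: ~1.33 GB/s
    "4K @ 240 fps": (3840, 2160, 240),     # high-res/high-fps: saturates the link
}.items():
    gbps = capture_gbps(w, h, fps)
    print(f"{label}: {gbps:.2f} GB/s ({gbps / PCIE3_X8:.0%} of PCIe 3.0 x8)")
```

So my 1440p/90 fps test only used a fraction of the link, but a high-resolution, high-refresh scenario would eat the whole thing.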
The Maxwell card was eating over 100 watts to encode up to 160 fps max at 1440p with the lowest quality settings available in Sunshine (P1 preset, no adaptive quantization, no two-pass encoding). That is a lot of additional power/noise. With max quality settings, it could push only ~60-70 fps.
For comparison, my 4090 could push up to ~155 fps with the lowest quality settings and up to ~110 fps with the max quality settings WHILE being power limited to 33% (~150 watts) and running Cyberpunk.
If you have a modern GPU from the 4000 series, it makes no sense to use a second GPU for encoding, as you get ~3% more fps in an artificially created scenario.
RTX 4090 encode:
Maxwell encode:
IMO the only time it would make sense is if you have two old, inefficient GPUs, like two Maxwell-era cards. Then offloading encoding to the second GPU could be significant, as it could free up to 100 watts of power budget for the chip die.
My testing method was totally unscientific, so the results must be taken with a big grain of salt.