
DLSS frame generation versions 310.1 and 310.2 only streaming native frame rate (1/2) from host

Open nebuchadnezza343 opened this issue 10 months ago • 29 comments

Is there an existing issue for this?

  • [x] I have searched the existing issues

Is your issue described in the documentation?

  • [x] I have read the documentation

Is your issue present in the latest beta/pre-release?

This issue is present in the latest pre-release

Describe the Bug

With NVIDIA’s new DLSS frame generation versions 310.1 and 310.2, the incoming frame rate on the client includes only the "real" frames, not the generated frames. As a result, the frame rate boost from DLSS FG is lost when streaming with any DLSS FG DLL newer than 3.8.1.

When forcing DLSS FG version 3.8.1 using DLSS Swapper, the expected behavior is restored and the incoming frame rate matches the game's rendered frame rate.

Expected Behavior

All frames displayed on the host's display should also be transmitted to the client's display.

Additional Context

I’m using a very specific setup:

Host: NVIDIA GeForce RTX 4090 running Sunshine on Windows 11 2023H2 with default capture settings. The host PC is connected to a G-Sync LG TV via HDMI and uses Ethernet.

Client is an M4 Mac mini using HDMI with VRR (!) and Ethernet. Before DLSS FG 310.1, the stream worked with VRR up to 120 Hz without tearing on the client Mac. Since 310.1, the client's incoming frame rate is only ~1/2 of the actual game fps (comparing host vs. client, or the Afterburner fps overlay vs. Moonlight's incoming frame rate overlay).
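The halving symptom described above can be sanity-checked by comparing the two overlay readings. A minimal sketch (the function name and tolerance are my own illustration, not part of any tool mentioned in this thread):

```python
def looks_halved(host_fps: float, client_fps: float, tol: float = 0.15) -> bool:
    """Return True if the client frame rate is roughly half the host's,
    which is the symptom reported here (generated frames not captured)."""
    if host_fps <= 0:
        return False
    return abs(client_fps / host_fps - 0.5) <= tol

# Example readings: host overlay shows 116 fps, Moonlight reports 60 fps.
print(looks_halved(116, 60))   # True  -> generated frames likely dropped
print(looks_halved(116, 114))  # False -> full capture
```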

Probable cause:

NVIDIA changed two things, as far as the press reports: 50-series GPUs now perform flip metering on the GPU when using frame generation, and the newest drivers/DLLs reduce the VRAM and performance impact of DLSS FG.

Host Operating System

Windows

Operating System Version

2023H2

Architecture

amd64/x86_64

Sunshine commit or version

v2025.130.210222

Package

Windows - installer (recommended)

GPU Type

NVIDIA

GPU Model

NVIDIA RTX 4090

GPU Driver/Mesa Version

572.16 WHQL and DLSS FG 310.1/310.2

Capture Method

Desktop Duplication API (Windows)

Config


Apps


Relevant log output

not relevant?

nebuchadnezza343 avatar Feb 01 '25 08:02 nebuchadnezza343

Further Testing:

566.36 captures generated frames with G-Sync + V-Sync and a frame rate limit set on the host via NVCP.

572.16 captures generated frames only with the same settings (G-Sync + V-Sync on), but the new driver lowers host and client frame rates below the cap (116 fps down to 80-110) with V-Sync and G-Sync on. Disabling V-Sync restores host performance to 116 fps but does not capture generated frames (= client only 60-70 fps).

nebuchadnezza343 avatar Feb 03 '25 08:02 nebuchadnezza343

I'm glad to see you got your account restored, and thanks for the additional info.

ReenigneArcher avatar Feb 03 '25 13:02 ReenigneArcher

I have the same issue, except my client and host are both 23H2 Windows PCs. Host is running a 5080, client is a 3080.

EDIT: After some tweaking, it's resolved. I had V-Sync forced on globally in the NVIDIA app on the host, which was causing the issue; setting it to Fast resolved it.

Pirateguybrush avatar Feb 07 '25 09:02 Pirateguybrush

@nebuchadnezza343 I can confirm that with the latest drivers (572.16) and both the 310.1.0 and 310.2.0 FG DLLs, my 4090 can no longer sustain a solid 116 fps (120 - 4, capped by ULLM or Reflex). I could easily reproduce this in Jedi Survivor and Hogwarts Legacy by overriding the FG profile to "Latest" in the new NVIDIA App (sadly, in Hogwarts Legacy the new FG preset is already installed from the last game update), with average fps sitting between ~80 and ~105.

Interestingly, in Jedi Survivor, when you alt-tab (whether in exclusive fullscreen or borderless) a solid 116 fps is recovered, but once you resume playing for a few seconds/minutes, or pause the game, the frame rate starts dropping again to somewhere between ~105 and ~85 and stays there until you alt-tab again. The big issue here is not just losing some frame smoothness but the penalty on input latency, especially when fps goes under ~96 (which incidentally is the exact frame rate at which my LG C9 G-Sync compatible TV can show a tear-free image with V-Sync disabled!). Input latency with DLSS 3 FG + Reflex/ULLM is 30-40 ms most of the time, whereas with DLSS 4 FG, when this issue kicks in, it stays above 80 ms and in some situations well over 100 ms! Also, when this happens my Intel 13700K sits at ~40 watts in Jedi Survivor, and in Hogwarts I can even see ~25 watts, meaning the CPU is almost completely idle.

I think this issue might be related to the 40 series not supporting frame metering in hardware; NVIDIA might be doing some sort of software trick to achieve a similar result, though it seems buggy as of now... It looks like at some point something falls out of sync and the whole metering system goes to hell. Sadly, no one else seems to be complaining about this on the internet just yet.

mvillarg avatar Feb 08 '25 21:02 mvillarg

@mvillarg Thank you. This is exactly my thoughts put into nice writing.

There are 2 issues now as far as I can tell:

  1. Sunshine not capturing all frames with Frame Generation ON and V-Sync OFF while G-Sync is ON. -> Sunshine fix needed?

  2. Drivers 572.16 and 572.24 not working properly with V-Sync on and FG on while using G-Sync. Maybe this is due to frame metering, or to the following fix in the newest drivers: "Certain G-SYNC Compatible monitors may display flickering when game FPS drops below 60FPS [5003305]". I really hope this issue is not limited to LG OLED TVs, so that it isn't overlooked by the public/NVIDIA. -> NVIDIA fix needed?

I tried raising the issue as well in the German hardwareluxx.de forum but so far no useful response was given there. https://www.hardwareluxx.de/community/threads/572-16-v-sync-g-sync-problem.1363693/

I’ll write another bug report to NVIDIA.

@mvillarg what Windows version are you running? 2023H2 by any chance? Maybe 2024H2 fixes the issue, but I'm unable to update right now due to needing Windows Mixed Reality, which was removed after 2023H2.

nebuchadnezza343 avatar Feb 08 '25 23:02 nebuchadnezza343

@nebuchadnezza343 I have encountered these exact issues on 24H2 also.

Prior to that, these were the symptoms I was experiencing with the WGC capture method after upgrading the host to 24H2. This was well before the 572.x driver release.

Snarfster avatar Feb 09 '25 10:02 Snarfster

@nebuchadnezza343 thanks for mentioning 2023H2. It turns out I was totally convinced that I was on 24H2 already but I was not. So I manually updated to 24H2 and the problems I commented are now solved 💯 . I cannot comment on your specific issues with Sunshine as I'm not currently using it (I will again once the new XREAL ONE Pros go live next month).

mvillarg avatar Feb 09 '25 20:02 mvillarg

+1 for this issue. When using the latest DLSS preset K FG, the FPS on my monitor in Cyberpunk 2077 is 90+, but I only receive 50+ on my tablet. I tried FSR 3 and there was no issue at all.

CHNtentes avatar Feb 12 '25 17:02 CHNtentes

I can confirm that I'm having the exact same issue: Silent Hill 2 + Frame Generation running at 150 FPS, but Moonlight only reports half of those frames. Without frame generation enabled, the FPS on host and client match. Using the latest DLSS DLL (310.2.1) and the latest Frame Gen DLL (310.2).

[screenshot attached]

@nebuchadnezza343 did you manage to find a solution?

Windows 11 24H2 Nvidia Driver 572.42

elisoftli avatar Feb 13 '25 19:02 elisoftli

Hello @elisoftli !

Not really.

  • Either downgrade to 566.xx (and probably lose the transformer-model DLSS 4).
  • Force V-Sync on in NVCP (with G-Sync for my setup) and get horrendous ~100 ms latency and only 95 fps instead of the 116 fps cap.

I’m really wondering why so few people report the G-Sync + V-Sync problem. Is it LG TV specific? OS specific? Anyway, this still leaves the change/bug with 572 and Moonlight not capturing all frames with FG on and V-Sync off.

nebuchadnezza343 avatar Feb 13 '25 20:02 nebuchadnezza343

> Hello @elisoftli !
>
> Not really.
>
>   • Either downgrade to 566.xx (and probably lose the transformer-model DLSS 4).
>   • Force V-Sync on in NVCP (with G-Sync for my setup) and get horrendous ~100 ms latency and only 95 fps instead of the 116 fps cap.
>
> I’m really wondering why so few people report the G-Sync + V-Sync problem. Is it LG TV specific? OS specific? Anyway, this still leaves the change/bug with 572 and Moonlight not capturing all frames with FG on and V-Sync off.

Probably not an LG-specific issue; I ran into it on various devices (mobile & TV). So far, the easiest interim solution seems to be downgrading the Frame Gen DLLs to 3.8.1 (easily done via DLSS Swapper).

elisoftli avatar Feb 13 '25 21:02 elisoftli

@nebuchadnezza343 for me, the way to work around the massive input latency with G-Sync + V-Sync + latest drivers + DLSS FG 4 + W11 23H2 on my LG OLED TV was to turn V-Sync off and cap the frame rate at 108 fps (both in NVCP). This way FG works with the minimum latency possible (~32 ms) while avoiding screen tearing for the most part (barely visible in the bottom 5% of the screen). Lowering fps even more, say to 90-100 fps, would almost entirely fix the tearing at the cost of slightly worse input latency. Believe it or not, when using any sort of frame interpolator, every extra fps counts toward the overall responsiveness of the game. Not sure whether disabling V-Sync while using DLSS FG 4 still causes Sunshine to drop half of the frames, though!

mvillarg avatar Feb 15 '25 11:02 mvillarg

Thanks; yeah, I'm just pi**ed about any tearing because it was perfect before (566) with G-Sync + V-Sync and good fps with little latency (40-45 ms, path tracing/ultra, 4K, 116 fps).

I really hope NVIDIA fixes the issue or Moonlight finds a way to capture all FG frames with V-Sync off.

nebuchadnezza343 avatar Feb 15 '25 11:02 nebuchadnezza343

I truly think a kind of "dark age" is coming for any video-capture mechanism that runs at the software level (hardware-based screen capture will always be able to feed directly from the GPU's signal output, whether that is DisplayPort or HDMI) now that these new frame metering and multi-frame-gen systems are here. I expect future Intel and AMD GPUs to follow the same path, given that pure raster/ray-traced horsepower will stagnate in the coming years: it's going to be a long way before beefy GPUs see ~2 nm silicon nodes that improve transistor efficiency significantly, or a mix of photonic hardware and chiplet design that cuts power consumption enough for compute horsepower to increase notably year over year like in the good old days.

NVIDIA is putting most of the new transistors it can fit on its monolithic chip into the tensor cores (a few also go to the ray-tracing cores), at the cost of even higher power consumption. So we're at a point where the only way to achieve higher frame rates is to insert AI-generated frames between the real frames the GPU can actually render. The nasty part is that for this to be done efficiently (no OS-CPU-driver-GPU bottlenecks) and perfectly synced with the G-Sync-ready TV or monitor, everything must happen in the GPU driver-to-hardware realm, leaving software such as RTSS, Sunshine, or Moonlight in the dust...

And that's not even mentioning input latency with these new AI rendering solutions... My hope is that NVIDIA keeps iterating on and improving Reflex (2.0 already brings quite a lot of improvement over the current version), but the holy grail will be when everybody (the next major releases of both DirectX and Vulkan, plus NVIDIA/AMD/Intel drivers) supports rendering frames with AI (either full frames or parts/objects of them) in a standardized way. In such a system the CPU would submit real draw calls to the GPU driver only 1 out of every 10 frames; for the other 9 it would submit only transformation info (position, scale, rotation, and probably other metadata tied to user input), and the GPU would draw those frames at a fraction of the cost of a fully rasterized frame. In this "neural rendering" paradigm, AI upscalers such as DLSS or FSR would still operate, in exchange for more noticeable image artifacts.

Interesting times indeed...

mvillarg avatar Feb 15 '25 13:02 mvillarg

I’ve also narrowed it down to DLSSG (Frame Generation DLL) version 3.10 or higher. Forcing 3.8.1 (only the FG DLL) restores the expected latency and fps.

So, two solutions:

  1. Use the latest NVIDIA driver but force DLSS FG version 3.8.1, or

  2. Use NVIDIA 566.36, force DLSS+FG+RR version 3.10+, and use NVPI to select the transformer model.

nebuchadnezza343 avatar Feb 20 '25 14:02 nebuchadnezza343

Thanks for this workaround! I was playing AC Shadows, which ships with DLSSG 3.10 by default, and found it weird that the game felt laggy below the 80 fps mark on the host machine's fps counter. I thought the game had some latency issue (apparently my eyes see at 40 fps xD); luckily I found this thread. After I downgraded to DLSSG 3.8 it finally felt smooth, even when I increased the preset to 4K DLSS Quality (everything maxed on a 4090).

charnet3d avatar Mar 29 '25 06:03 charnet3d

This issue still affects the 576.02 driver release, so NVIDIA might not consider it a bug. It will be a real shame if the generated frames become impossible to capture; that would totally kill 4K streaming for the vast majority of people.

I do not believe it is related to G-Sync either; same results whether my dummy plug or my primary monitor is active.

Tried capturing via OBS; I am not well versed in it, but the captured video did not look as smooth as the source. Will need to test Steam Remote Play and NVIDIA's own capture solution.

I’ll probably try creating a ticket with NVIDIA. Who knows if they’ll do anything about it but the last one I made did get fixed so I have some hope.

ody avatar Apr 17 '25 07:04 ody

This issue continues to affect today’s 576.02 driver release

It might not be the perfect workaround, but I did manage to keep using the latest FG version and have the full amount of frames streamed to the client by using a Virtual Display instead of my monitor or a dummy HDMI plug. I used Apollo since it automates the whole process, but there are plenty of other implementations that should also work.

elisoftli avatar Apr 17 '25 07:04 elisoftli

> This issue continues to affect today's 576.02 driver release
>
> It might not be the perfect workaround, but I did manage to keep using the latest FG version and have the full amount of frames streamed to the client by using a Virtual Display instead of my monitor or a dummy HDMI plug.

I thought of trying a VDD last night too but ran out of time and played sitting at my computer instead. It would be strange if a VDD could capture frame-generated output, but I believe they actually copy each frame, so maybe that step happens late enough in the pipeline.

Thank you for trying it and mentioning that it worked, but I won't be using Apollo; I think its existence is pointless. Running https://github.com/VirtualDrivers/Virtual-Display-Driver is not complicated.

ody avatar Apr 17 '25 15:04 ody

I have confirmed that replacing my dummy with the upstream VDD results in Moonlight and RTSS reporting 90 fps streaming to my Steam Deck.

Even with all the improvements in VDD in the last year, I still find it to behave strangely at times, so I have preferred using a dummy plug. The VDD seems to make Sunshine more sensitive to a lack of movement, and I'll see an fps drop in Moonlight if there is a lot of static content on the screen, e.g. in Minecraft. Even testing now in CP2077, I had to run around a bit after first loading the game to "warm up" the VDD before Moonlight reported 90 fps; it started out fluctuating between 70 and 85.

This is annoying, but I guess I am rolling with a VDD going forward unless something changes in NVIDIA's driver or someone finds the right hook into their new frame generation method to extract all the frames.

ody avatar Apr 17 '25 16:04 ody

I am partially retracting my previous response: the VDD only makes the situation a little better, and frames start dropping at random. I'll be running at a smooth 90 fps, then dip down to 55, then flutter around sub-80 for a while before returning to 90.

Quite annoying. I guess some additional time will be needed to test other things. Hopefully NVIDIA's own game capture software isn't missing the generated frames too. Otherwise, is there any way to interact with them through an open-source library or developer program?

ody avatar Apr 19 '25 07:04 ody

https://forums.developer.nvidia.com/

ReenigneArcher avatar Apr 19 '25 11:04 ReenigneArcher

NVIDIA happily ignores their forums for the most part. The best chance to actually hear from them is to report a bug, since they actually track those.

ns6089 avatar Apr 28 '25 13:04 ns6089

> I have confirmed that replacing my dummy with the upstream VDD results in Moonlight and RTSS reporting 90 fps streaming to my Steam Deck.
>
> Even with all the improvements in VDD in the last year, I still find it to behave strangely at times so have preferred using a dummy plug. The VDD seems to make Sunshine more sensitive to lack of movement and I'll see a fps drop in Moonlight if there is a lot of static content on the screen, e.g. Minecraft. Even testing now in CP2077 I had to run around a bit after first loading the game to get the VDD to "warm-up" for Moonlight to report 90 fps, it started out fluctuating between 70 and 85.
>
> This is annoying but guess I am rolling with a VDD going forward unless something changes within NVIDIA's driver or someone finds the right hook into their new frame generation method to extract all the frames.

I honestly haven’t had issues with the VDD since I discovered that the FPS issues only happen in games that still use exclusive fullscreen mode.

Also, I can’t reproduce this bug at all on my machine, although I use a specific setup. Like the original poster, I also force V-Sync + Ultra Low Latency Mode.

But I use the VDD with my scripts, which force the display to 244 Hz and apply a frame cap, which essentially disables V-Sync and removes the -3 limit NVIDIA applies with low latency mode: https://github.com/Nonary/documentation/wiki/DummyPlugs#tips-for-gsync-users

Nonary avatar May 08 '25 05:05 Nonary

> But I use VDD with my scripts, which force the display to 244 Hz and apply a frame cap, which essentially disables V-Sync and removes the -3 limit NVIDIA applies with low latency mode: https://github.com/Nonary/documentation/wiki/DummyPlugs#tips-for-gsync-users

Since I posted my last reply, I discovered that the frame rate dips only occur if you run the VDD at the same refresh rate as the requested stream fps. You won't experience the issue if you always configure the VDD for 244 Hz and cap below it. I use native Sunshine display management, so I chose to add an override for all the resolutions I stream at that doubles the VDD's refresh rate, setting it to twice the requested stream fps.
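The rule described above (virtual display refresh rate = twice the requested stream fps) is simple but easy to get wrong when maintaining per-resolution overrides. A small sketch of the idea (the function name and the override-table shape are my own illustration, not Sunshine's configuration syntax):

```python
def vdd_refresh_hz(stream_fps: int) -> int:
    """Per the workaround above: run the virtual display at double the
    requested stream fps so capture never runs at the stream's own rate."""
    return 2 * stream_fps

# Hypothetical override table for the stream rates mentioned in this thread.
overrides = {fps: vdd_refresh_hz(fps) for fps in (60, 90, 120)}
print(overrides)  # {60: 120, 90: 180, 120: 240}
```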

ody avatar May 08 '25 07:05 ody

It is a bug with NVIDIA, but the recent changes to Sunshine sort of makes this not much of an issue anymore.

Streams no longer dynamically adjust frame rate on drops as they did in the past, due to a new default, which indirectly means DLSS streams at the full capture rate now.

Additionally, when #4066 gets merged, and if you decide to swap to WGC, this is also fixed, because WGC always captures at a constant rate.

Nonary avatar Aug 01 '25 01:08 Nonary

Sorry to bump this thread, but I just want to let people who care about it know that the PR was closed without merging :(

Sorry to see the disagreement

HicirTech avatar Aug 20 '25 02:08 HicirTech

I know this is filed as a bug/issue, but it could also be an interesting "feature" for certain edge cases. For example: you have Frame Generation or Smooth Motion configured on your high-refresh-rate host, but you don't always need the generated frames, e.g. when streaming to 60 Hz clients like TVs or Steam Decks.

In those cases, sending only the native frames is fine, and this bug could actually be an alternative to switching FG/Smooth Motion off and on every time you stream.

So once this issue is fixed, it might be nice to turn this behavior into an optional toggle in Sunshine for those specific scenarios.

This is related to my previous suggestion in the Sunshine discussion forums: https://github.com/orgs/LizardByte/discussions/722

lyndonguitar avatar Aug 23 '25 07:08 lyndonguitar

Hello everyone, sorry for bringing this thread back, but I’m facing the same issue with the latest NVIDIA drivers, 580.95.05, and an RTX 5080, using Sunshine + Moonlight.

Basically, the same thing you’ve already mentioned: Sunshine doesn’t capture the frames generated by frame generation (DLL 310.4), whether using DX or WGC as the capture API, on a real monitor.

It does work correctly when using the Virtual Display Driver, but I don't think that should be the solution to this.

Any ideas? Thanks!!

fernandoenzo avatar Nov 03 '25 16:11 fernandoenzo