Kingdom Come Deliverance 2 XeSS 10 FPS Limit
Game name and version:
Kingdom Come Deliverance 2, 2025 May 15 version (also the February release version).
Mods and mod versions used
v0.7.7-pre9; the newest version available at game release was also tested.
GPU
GTX 1660 Super
OS
Win 11
Used automated or manual install?
- [x] Automated
- [ ] Manual
If on AMD/Intel and Automated, used DLSS inputs?
- [ ] Yes
- [ ] No
Also, please mention in the description which upscaler inputs are used (which upscaler you selected in game settings).
Did you check the Wiki?
- [x] Yes
- [ ] No
Please describe the issue and steps to reproduce it
After a while of using XeSS (with the in-game DLSS option as input), the game gets limited to 10 FPS. Nothing removes the limit except restarting the game or switching to a different upscaler (FSR); if I switch back to XeSS, it gets limited to 10 FPS again.
I have attached
- [x] OptiScaler.log (set
LogLevel=0andLogToFile=truein Optiscaler.ini, zip it if too big) - [x] Screenshot of game folder (where you placed Opti) I inject via Special K, but the problem is same in game folder (tested both and tested with only optiscaler and no other modification)
- [x] Screenshot of Opti overlay in-game (opens with shortcut, default Insert)
Hi,
The log file you provided has no info about the initial loading of Opti or the creation and frames of the XeSS context. It just has some FSR 3.1 upscaled-frame info, which will not help in the current situation.
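If you can grab a new log, here is a minimal sketch of the relevant OptiScaler.ini logging settings; the [Log] section name is an assumption, while LogLevel=0 and LogToFile=true are the values the checklist above asks for:
; [Log] section name is an assumption; LogLevel=0 and LogToFile=true are the values from the issue template
[Log]
LogLevel=0
LogToFile=true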
Hi, it seems to be a VRAM issue. When VRAM fills up, the game gets limited to 10 FPS with XeSS. I tried with FSR inputs as well, with the same result.
According to Special K, the game has a 5 GB VRAM budget at start on my card (6 GB physical VRAM), and when it goes over the budget the game gets limited. It only happens with XeSS. The budget increases on demand, but OptiScaler's XeSS can't handle it in this game for some reason.
This is with in-game FSR (OptiScaler's FSR also works without problems; it's just that XeSS looks much better):
If you want to check whether a newer build improves this, you can try Pre12 linked here: https://github.com/cdozdil/OptiScaler/issues/484#issuecomment-2927240516
I tried it to no avail. The problem still persists.
I'm kinda curious, is the issue still there if you select DLSS inside OptiScaler? Opti should let you do it on this card.
Also, could you set these ini values and try again?
; Building pipeline for XeSS before init
; true or false - Default (auto) is true
BuildPipelines=false
; Creating heap objects for XeSS before init
; true or false - Default (auto) is false
CreateHeaps=false
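For placement, a rough sketch of how those two overrides would sit in OptiScaler.ini, assuming they belong under the [XeSS] section (the section name here is an assumption):
; assumed [XeSS] section of OptiScaler.ini; only these two keys are changed from their auto defaults
[XeSS]
BuildPipelines=false
CreateHeaps=false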
> I'm kinda curious, is the issue still there if you select DLSS inside OptiScaler? Opti should let you do it on this card.
With DLSS there is no issue, but the performance is very poor. Is it even real DLSS? How does that work? Is it emulated in software, similar to how RTX ray tracing is emulated on 16-series GPUs?
> Also, could you set these ini values and try again?
>
> ; Building pipeline for XeSS before init
> ; true or false - Default (auto) is true
> BuildPipelines=false
> ; Creating heap objects for XeSS before init
> ; true or false - Default (auto) is false
> CreateHeaps=false
With these edits, it seems to be fixed now. I will continue testing.
The GTX 16 series can be modded to run DLSS (since it's Turing), but with no Tensor cores it's very heavy; the Transformer model works too, but is a lot heavier still.
Also, nice, interested to hear whether it stays fixed.