[Linux] Bizarre performance; it's a gamble whether it runs well or not.
Validation
- [X] I have checked the Issues page to see if my problem has already been reported
- [X] I have confirmed that this bug does not occur in the original game running on original Xbox 360 hardware
If you have DLC installed, please specify which ones you have.
- [X] Apotos & Shamar Adventure Pack
- [X] Chun-nan Adventure Pack
- [X] Empire City & Adabat Adventure Pack
- [X] Holoska Adventure Pack
- [X] Mazuri Adventure Pack
- [X] Spagonia Adventure Pack
If you have mods enabled, please specify which ones you have.
I disabled all mods just to see if this issue persists, and it does.
If you have codes enabled, please specify which ones you have.
The issue also persists with all codes disabled.
Describe the Bug
There's no clear way to describe this: sometimes the game just runs abysmally in intense areas. An example is Empire City's village: when looking towards the level gate, the game runs at around 30 FPS, but turning the camera around brings it back to 60 FPS. However, whether this happens seems to be random per launch. Sometimes the game runs at 60 FPS just fine in all areas. But then I close it, do some things, come back to the game, and it consistently runs poorly. If I keep closing and re-opening the game, I might eventually see it run at 60 FPS again.
Screenshots
Here's an image of the performance graph as I rotate the camera in Empire City.
The same graph, after looking at the scene long enough to fill the whole graph.
And here's the performance graph when everything is running correctly; as I said, it's random when the game decides to work or not. This is the exact same scene in Empire City.
Provided is also a screenshot of LACT showing my GPU's performance. The dips in power usage line up with the game lagging, meaning the GPU is deciding to do less work for unknown reasons. My CPU isn't being hammered by the game.
And here's LACT when the GPU is actually rendering the game at 60 FPS.
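For anyone who wants to reproduce these readings without LACT, here's a minimal sketch that polls the amdgpu sysfs interface directly (card0 and the hwmon index are assumptions; adjust for your system):

```sh
# Poll GPU busy %, power draw, and the active core-clock DPM level
# (marked with a *) once per second. card0 is an assumption.
DEV=/sys/class/drm/card0/device
watch -n 1 "cat $DEV/gpu_busy_percent $DEV/hwmon/hwmon*/power1_average; \
  grep '\*' $DEV/pp_dpm_sclk"
```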
Specifications
Fill out the following details:
- CPU: AMD Ryzen 5 5600G (iGPU is disabled in BIOS, game does not report it being used)
- GPU: AMD Radeon RX 6600 XT
- GPU Driver: amdgpu; Mesa 25.0.0 (LLVM 19.1.7)
- OS: Ultramarine Linux 41 (Cyberia) x86_64
- Version: 1.0.2
Additional Context
I have tried various fixes on my end; none of them affected the results in any meaningful way:
- Launching with LD_PRELOAD=""
- Setting both amdgpu.aspm=1 and amdgpu.aspm=0 in GRUB (see the sketch below)
- Using Cinnamon instead of KDE Plasma (so, X11 vs. Wayland; forcing X11 under Wayland also didn't fix this)
- Using CoreCTRL and LACT to manually set GPU frequencies (memory frequencies can be set, but GPU frequencies are ignored for unknown reasons)
- Closing all other applications
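For reference, here's roughly how the GRUB change was made, as a sketch assuming Ultramarine's Fedora-style layout (paths differ on other distros):

```sh
# Add amdgpu.aspm=0 (or =1) to the kernel command line.
sudo sed -i 's/^GRUB_CMDLINE_LINUX="/&amdgpu.aspm=0 /' /etc/default/grub
sudo grub2-mkconfig -o /boot/grub2/grub.cfg
# Confirm the parameter took effect after a reboot:
cat /proc/cmdline
```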
Smells like the GPU is underclocking. This is an issue on NVIDIA as well, when a game runs so lightly that the driver stops treating it as a game and downclocks. Does Mesa provide you a way to set the power profile similarly to how NVIDIA does it?
My reasoning is: look at the orange line on the profiler. GPU frame times can't double (5 ms to 10 ms) on their own without the driver underclocking by itself.
Also try turning V-Sync/FPS Limit off, I guess, to verify whether the GPU clocks show the same behavior or stay stable.
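On the power-profile question: as far as I know, that knob lives in the amdgpu kernel driver rather than in Mesa. A minimal sketch, assuming the RX 6600 XT is card0:

```sh
# Pin the DPM performance level to maximum instead of letting the
# driver pick clocks dynamically ("auto" is the default).
echo high | sudo tee /sys/class/drm/card0/device/power_dpm_force_performance_level
# Revert to dynamic clocking:
echo auto | sudo tee /sys/class/drm/card0/device/power_dpm_force_performance_level
```

If the stutter disappears while the level is pinned to high, that would point squarely at DPM.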
LACT and CoreCTRL provide ways to tell the GPU to set its clocks to maximum, but as I stated, all that seems to do for me is set the memory frequencies; the GPU frequencies never match what the GPU is told to do (you can see as much in the graphs with GPU Average and GPU Target).
I've disabled V-Sync before and it didn't do anything; I also set the FPS limit to max, and that didn't improve performance either.
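For what it's worth, the clock mismatch can be double-checked straight from sysfs; the starred entry in each file is the level the hardware is actually running at (card0 assumed):

```sh
# If only pp_dpm_mclk's starred level moves when LACT applies a
# profile, that matches the memory-only behavior described above.
cat /sys/class/drm/card0/device/pp_dpm_sclk
cat /sys/class/drm/card0/device/pp_dpm_mclk
```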
Weirdly, I seem to have found a semi-consistent way to trigger this: I can switch between "laggy" and "normal" modes by locking the screen, turning off the display, and waiting for KDE to play its device disconnect and reconnect sounds. I don't think it's KDE's fault, since, as I said before, this also happens on Cinnamon, which also means it's not Wayland's or X11's fault.
It's not the display protocol, for sure. The thing that consistently spikes is the GPU frame time, which is entirely on the driver side, unless for whatever reason the driver is somehow inserting something extra into the command buffer.
Hmmm, that is interesting. I have heard other people complain about the Radeon RX 6000 series causing issues on Linux, though trying the various workarounds they've discussed hasn't yielded much for me. Issues such as the 4 GB BAR limit were already resolved long ago for me, and other things like messing with GRUB boot parameters didn't do much to alleviate the issue either.
I will say I just switched to the current mainline kernel (6.14-rc6), and unfortunately that hasn't solved the issue. I'll experiment with older kernel versions next to see if that alleviates the problem.