nvidium
Unable to select proper GPU
Minecraft keeps running with my GTX960 instead of my RTX2070 and I can't figure out how to point it at the correct graphics card.
Use the NVIDIA Control Panel or graphics settings to set Minecraft (Minecraft is javaw.exe) to use the 2070 by default.
Ah, my mistake, I forgot to mention I run Linux. Garuda, with the official NVIDIA drivers. (it's arch BTW).
I'm curious how that works. How do you render your game on a different GPU than the one your monitor is plugged into?
Dunno, but that is what is currently happening. The game launches on my Primary Monitor which is plugged into my RTX2070, but when I look at BTOP, it says GPU0 (my GTX960) is pegged at full use.
What does it say on the F3 screen? And what does it say when you run nvidia-smi?
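For reference, the F3 debug screen prints the GL renderer string on its right-hand side, and nvidia-smi's process table shows which card the java process actually ends up on. A quick terminal check, assuming the proprietary driver's standard tools:

```bash
# Summary view: per-GPU utilization, plus a process table at the bottom;
# Minecraft shows up there as a java process under whichever GPU renders it.
nvidia-smi

# Refresh the same view once per second while the game is running.
watch -n 1 nvidia-smi
```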
Seeing as you're using Linux, after a quick search the solution seems to be installing the nvidia-prime package and then launching the program through it so it picks the correct GPU.
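A minimal sketch of that suggestion, assuming an Arch-based distro with the nvidia-prime package from the official repos (the prime-run wrapper it ships is what does the GPU selection):

```bash
# Install the prime-run wrapper script.
sudo pacman -S nvidia-prime

# Smoke test: the wrapped command should report the offload GPU as its
# OpenGL renderer (needs glxinfo installed; vulkaninfo works similarly).
prime-run glxinfo | grep "OpenGL renderer"
```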
I apologize; I have not needed to use nvidia-prime before, so what would this look like? My guess is to edit the Prism Launcher .desktop entry to specify prime-run? Or will that only affect Prism Launcher itself?
that sounds kinda cursed as a solution
but I'm wondering, what do you need that 960 for?
Minecraft runs on Java, so you need to find where the Java binary is located (it should be somewhere in /usr/lib/jvm according to the Arch wiki; note that this also means every Java program that uses the GPU will select the 2070 instead) so that Minecraft runs on the correct GPU.
Selecting Prism Launcher will just make the launcher run on the 2070 instead of Minecraft itself.
I will try to test this out in a VM later to make sure.
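A rough sketch of locating the Java runtime on an Arch-based system; the archlinux-java helper and the /usr/lib/jvm layout come from Arch's Java packaging, and the wrapper-command note is an assumption about Prism Launcher's per-instance settings:

```bash
# List the installed Java runtimes and show which one is the default.
archlinux-java status

# The runtimes themselves live under /usr/lib/jvm.
ls /usr/lib/jvm/

# Resolve which binary a plain `java` invocation actually points at.
readlink -f "$(which java)"

# Instead of redirecting every Java program system-wide, one option is to
# wrap only Minecraft's process: MultiMC-derived launchers like Prism
# usually expose a per-instance "wrapper command" setting where prime-run
# could be entered, so only the game gets offloaded to the 2070.
```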
I run basic programs (browser, Discord, BTOP, OBS, etc) on the smaller card, and more demanding tasks (Games, Rendering, etc) on the bigger one.
Well, one solution is to install a Java instance for Minecraft specifically (I typically grab Azul Zulu for use with the Faster Random mod).
Idk why, but that feels kinda cursed. I think having the OS put the two outputs together would actually hurt performance, so I wouldn't recommend doing that. Two GPUs could be useful for dual-monitor setups, but yeah, I don't think you should be using them like that.
I have 5 monitors.
4 on the GTX960 and 1 (4K) on the RTX2070.
quick question, are you on wayland or x11
x11
You could try messing around in NVIDIA X Server Settings; it seems like you can assign monitors to GPUs in there, under the X Server Display Configuration section.
No idea what I'm doing. Also, I don't know why GPU 0 & 1 are swapped in this program.
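On the index confusion: nvidia-smi and nvidia-settings do not necessarily enumerate the cards in the same order, so it can help to pin each index to a physical card by its PCI bus ID. A sketch using the stock driver and X11 tools:

```bash
# Driver-level view: index, model, and PCI bus ID for each card.
nvidia-smi --query-gpu=index,name,pci.bus_id --format=csv

# Kernel/PCI view; the bus IDs correspond to the ones reported above.
lspci | grep -iE "vga|3d"

# How X11 currently exposes the GPUs as render/output providers.
xrandr --listproviders
```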
idk try figure it out ig
In the process of trying to "figure it out", I somehow managed to make the nvidia-settings app kill Redshift (or whatever equivalent Plasma uses) on launch, but I could not get the "PRIME settings" tab to show up. I'm gonna wait for @Quantum39's findings for now.
I decided to take a break from the headache and now Baldur's Gate 3 isn't rendering any graphics.
The only thing I've "figured out" in all of this is how to use BTRFS snapshot restorations.
Well I guess I'll just go f▒▒ myself then.
@StuffOfSonny the issue was closed because it's not really an issue with nvidium so there's not really anything we can do.
You could perhaps implement a Graphics Adapter selector in the video settings. Plenty of other games have such an option.
But why should that be part of Nvidium? If anything, that should be a separate mod, and there is no reason why cortex should be the one to code this.
Ah yes, the endless cycle of "pass the buck".
Respectfully, you are the one trying to pass the buck to cortex, without any reason why they should be the one to implement something that is actually really hard to do once you are already in game.