desktop-waifu
How do I use the GPU instead of the CPU for faster transcription?
I already installed CUDA and PyTorch, but I don't know what I have to set, and where, so that the transcription workload goes to my GPU (RTX 2060) instead of the CPU, because I've read it's faster that way.
I'm using ElevenLabs, btw.
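Not the maintainer, but a general pointer: ElevenLabs is the TTS side and runs on their servers, so the GPU only matters for the local transcription. Assuming the transcription backend is openai-whisper on PyTorch (an assumption; the model size and audio path below are just placeholders), a minimal sketch to put it on the GPU looks like this:

```python
import torch
import whisper  # assumes openai-whisper is the transcription backend

# Use the GPU if PyTorch can see a CUDA device, otherwise fall back to CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"
print(f"Transcribing on: {device}")

# load_model accepts a device argument; "base" is just an example size.
model = whisper.load_model("base", device=device)

# "audio.wav" is a placeholder for whatever file the app records.
result = model.transcribe("audio.wav")
print(result["text"])
```

If `torch.cuda.is_available()` prints `False` even with CUDA installed, you probably have the CPU-only PyTorch wheel; reinstall it from a CUDA index, e.g. `pip install torch --index-url https://download.pytorch.org/whl/cu118`.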
idk bro, I haven't reached that point yet. Then why am I commenting? idk, I'm bored.
I made a video about it; the relevant part is at 8:33. I hope this helps!