
How to use the GPU instead of the CPU for faster transcription?

carlangassnake opened this issue 1 year ago · 2 comments

I already installed CUDA and PyTorch, but I don't know what I have to set, or where, so that the transcription workload goes to my GPU (RTX 2060) instead of the CPU, since I've read it's faster that way.

I'm using ElevenLabs, btw.
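Not part of the original thread, but for anyone landing here: in PyTorch the usual pattern is to pick a device with `torch.cuda.is_available()` and move the model and inputs to it with `.to(device)`. A common gotcha is having a CPU-only PyTorch wheel installed, in which case `is_available()` returns `False` even with CUDA drivers present. The sketch below uses a placeholder `torch.nn.Linear` model since I can't confirm which transcription model desktop-waifu loads; if it uses the openai-whisper package, `whisper.load_model(name, device=device)` accepts the same device string.

```python
import torch

# A minimal sketch: pick the GPU if a CUDA-enabled PyTorch build can see one,
# otherwise fall back to the CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"
print(f"Running on: {device}")

# Placeholder model standing in for the real transcription model (assumption:
# desktop-waifu's model is a torch.nn.Module, so .to(device) applies the same way).
model = torch.nn.Linear(4, 2).to(device)

# Inputs must live on the same device as the model.
sample = torch.randn(1, 4, device=device)
output = model(sample)
print(output.shape)
```

If `device` prints `cpu` despite an installed RTX 2060, reinstalling PyTorch with a CUDA build selected from the official install matrix usually fixes it.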

carlangassnake avatar Nov 13 '23 18:11 carlangassnake

idk bro, I haven't reached that point yet. Then why am I commenting? idk, I'm bored.

Made a video about it (see 8:33), I hope this helps!

MepleYui avatar Jan 01 '24 07:01 MepleYui