
Control working mode: cpu/cuda

0zhu opened this issue on Apr 18, 2024 · 1 comment

Hi guys,

Can someone please suggest how to effectively control the working mode?

app_rvc.py automatically starts in cuda mode:

[INFO] >> Working in: cuda

However, I have a rather old MX150 GPU, and it constantly fails with CUDA out of memory at the Transcribing stage, no matter how I tweak the Batch size / Compute type / Whisper ASR model or PYTORCH_CUDA_ALLOC_CONF.

torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 78.00 MiB. GPU 0 has a total capacity of 2.00 GiB of which 0 bytes is free. Of the allocated memory 101.51 MiB is allocated by PyTorch, and 72.49 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation.  See documentation for Memory Management  (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
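
For reference, this is roughly how I've been setting PYTORCH_CUDA_ALLOC_CONF, in case I'm doing it wrong (my understanding is it has to be in place before torch first touches the GPU):

```python
import os

# Must be set before torch allocates anything on the GPU; on Windows the
# shell equivalent is `set PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True`
# in the same cmd window before running `python app_rvc.py`.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "expandable_segments:True"

import torch  # imported only after the allocator config is in place
```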

Therefore I would like to fall back to cpu mode.
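
If there is no built-in option, would hiding the GPU from PyTorch be the right approach? A minimal sketch of what I mean (assuming app_rvc.py picks the device via torch.cuda.is_available(); an explicit cpu/cuda setting would of course be nicer):

```python
import os
import subprocess
import sys

# Hide the MX150 so torch sees no CUDA devices and the app should
# report "Working in: cpu" instead of "cuda".
env = os.environ.copy()
env["CUDA_VISIBLE_DEVICES"] = ""

# Launch SoniTranslate as usual, but with the GPU hidden from PyTorch.
subprocess.run([sys.executable, "app_rvc.py"], env=env, check=True)
```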

Running SoniTranslate dev_24_3 on Windows 11.

Thanks!
