
Allow ignoring the GPU

Open h-h-h-h opened this issue 2 years ago • 1 comment

I get the error `torch.cuda.OutOfMemoryError: CUDA out of memory`, so I'd like to run on the CPU. But there is no setting for that, even though the README says bark can run on both CPU and GPU. It would be great to have a setting that ignores the GPU entirely, to avoid any errors caused by an insufficient GPU.

If technically applicable: if running on the CPU doesn't use all logical CPU cores by default, there should also be a thread-count setting like the one in llama.cpp, so CPU utilization can be pushed to 100% to maximize speed.
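Until such a setting exists, a common general-purpose workaround (not bark-specific) is to hide all CUDA devices via the `CUDA_VISIBLE_DEVICES` environment variable before the framework is imported, and to set the usual thread-count environment variables to the number of logical cores. A minimal sketch, assuming the process controls its own environment before importing torch/bark:

```python
import os

# Hide every CUDA device so frameworks that probe CUDA fall back to CPU.
# This must happen before torch (or any CUDA-using library) is imported.
os.environ["CUDA_VISIBLE_DEVICES"] = ""

# Use all logical CPU cores, analogous to llama.cpp's thread setting.
# OMP_NUM_THREADS is honored by the OpenMP runtimes most BLAS backends use.
n_threads = os.cpu_count() or 1
os.environ["OMP_NUM_THREADS"] = str(n_threads)
```

With torch installed, the equivalent in-process call would be `torch.set_num_threads(n_threads)`.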

h-h-h-h avatar Apr 22 '23 03:04 h-h-h-h

```python
from bark import preload_models

# Load all models onto the CPU instead of the GPU
preload_models(use_gpu=False)
```

devidw avatar Apr 22 '23 13:04 devidw