bark
Allow ignoring the GPU
I get the error `torch.cuda.OutOfMemoryError: CUDA out of memory`, so I'd like to run on CPU instead. However, there is no setting for that, even though the README says Bark can run on both CPU and GPU. It would be great to have a setting that ignores the GPU entirely, so errors caused by an insufficient GPU can be avoided.
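As a general PyTorch workaround (not a bark-specific setting), you can hide all CUDA devices before `torch` is imported; `torch.cuda.is_available()` then returns `False` and everything runs on CPU:

```python
import os

# Hiding CUDA devices must happen before torch is imported,
# otherwise the CUDA context may already be initialized.
os.environ["CUDA_VISIBLE_DEVICES"] = ""

import torch

print(torch.cuda.is_available())  # False: no GPU is visible to torch
```

This works regardless of whether the library exposes its own CPU/GPU switch, which is why a built-in setting would still be the cleaner solution.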
If technically applicable: should running on CPU not utilize all logical CPU cores by default, there should also be a setting for the number of threads, as in llama.cpp, so CPU utilization can reach 100% and speed is maximized.
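PyTorch already exposes intra-op thread control, so a hypothetical bark thread-count setting could simply forward to it. A minimal sketch:

```python
import os

import torch

# torch.set_num_threads controls intra-op CPU parallelism.
# Using all logical cores here; a bark setting could expose this value.
n_threads = os.cpu_count() or 1
torch.set_num_threads(n_threads)

print(torch.get_num_threads())
```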
```python
from bark import preload_models

# Note: preload_models appears to take per-model GPU flags rather than a
# single use_gpu argument (assumption based on bark's generation API).
preload_models(
    text_use_gpu=False,
    coarse_use_gpu=False,
    fine_use_gpu=False,
    codec_use_gpu=False,
)
```