whisper-ctranslate2
Add support for a `--device mps` option on macOS
Currently, the `--device` flag supports only two options: `cpu` and `cuda`. However, CUDA is not available on macOS, leaving `cpu` as the only option, which is significantly slower.
macOS users could leverage the Metal Performance Shaders (MPS) backend for GPU acceleration, which greatly speeds up processing.
Please add support for `--device mps` to enable GPU acceleration on macOS.
As a temporary workaround I am manually hardcoding the value in the environment folder, but this is a dirty hack and not maintainable.
Adding `mps` support natively would improve usability and performance for macOS users.
Can you share your hack? I'd like to try it out.
https://github.com/Softcatala/whisper-ctranslate2/blob/1a4b8ee8dfd2999f77628b9e39dd5ddf4cf7fe82/src/whisper_ctranslate2/diarization.py#L38
Here I changed `self.device` to `"mps"`.
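For context, the hack amounts to something like the sketch below (hypothetical function name; the real diarization.py assigns `self.device` inside a class). It only affects the diarization step, which runs on PyTorch and therefore has an MPS backend available:

```python
# Sketch of the hardcoded device selection described above
# (hypothetical helper; the real code lives in Diarization.__init__).
def pick_diarization_device(use_gpu: bool, mps_available: bool) -> str:
    # Upstream picks "cuda" when a GPU is requested; the hack swaps
    # that for "mps" so the PyTorch diarization pipeline uses Metal.
    if use_gpu and mps_available:
        return "mps"
    return "cpu"

print(pick_diarization_device(True, True))    # -> mps
print(pick_diarization_device(True, False))   # -> cpu
```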
I guess this is because https://github.com/OpenNMT/CTranslate2 does not support mps?
I tried it. Simply editing diarization.py runs without errors, but on its own it has no effect.
You also need to add `mps` to the `--device` choices in commandline.py.
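For reference, a minimal sketch of that change (the actual option list in commandline.py may differ; this just shows `"mps"` added to the `choices` of the argparse option):

```python
import argparse

# Minimal sketch of extending the --device choices; the real
# commandline.py defines many more options than shown here.
parser = argparse.ArgumentParser()
parser.add_argument(
    "--device",
    choices=["auto", "cpu", "cuda", "mps"],  # "mps" added here
    default="auto",
    help="Device to use for computation",
)

args = parser.parse_args(["--device", "mps"])
print(args.device)  # -> mps
```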
And then running with `--device mps` fails with `ValueError: unsupported device mps`
at the line `self.model = ctranslate2.models.Whisper`.
So it's indeed CTranslate2's problem.
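Until CTranslate2 itself gains an MPS backend, one defensive option (purely a sketch, not in the project) would be to catch that `ValueError` and fall back to CPU instead of crashing. A stand-in factory is used below so the pattern is self-contained:

```python
def load_with_fallback(factory, device: str):
    """Try to build the model on `device`; if the backend rejects it
    with a ValueError, retry on CPU. `factory` stands in for a call
    like ctranslate2.models.Whisper(model_path, device=device)."""
    try:
        return factory(device)
    except ValueError:
        return factory("cpu")

# Stand-in mirroring CTranslate2's behaviour for unsupported devices.
def fake_whisper(device: str) -> str:
    if device not in ("cpu", "cuda"):
        raise ValueError(f"unsupported device {device}")
    return f"model-on-{device}"

print(load_with_fallback(fake_whisper, "mps"))   # -> model-on-cpu
print(load_with_fallback(fake_whisper, "cuda"))  # -> model-on-cuda
```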
@xx1adfasd yep, I was wrong; this hardcoded `"mps"` works for diarization only.