Add support for `--device mps` option for macOS

ozio opened this issue 11 months ago · 5 comments

Currently, the `--device` flag only supports two options: `cpu` and `cuda`. However, `cuda` is not available on macOS, leaving `cpu` as the only option, which is significantly slower.

macOS users can leverage the Metal Performance Shaders (MPS) backend for GPU acceleration, which greatly speeds up processing.

Please add support for `--device mps` to enable GPU acceleration on macOS.
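
For example, the goal would be an invocation along these lines (the positional audio argument and `--model` follow the existing CLI; only the `mps` value is new):

```
whisper-ctranslate2 audio.mp3 --model medium --device mps
```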

As a temporary workaround, I am manually hardcoding the value in the package installed in my environment folder, but this is a dirty hack and not maintainable.

Adding native `mps` support would improve usability and performance for macOS users.

ozio · Dec 17 '24

Can you share your hack? I'd like to try it out.

jeeftor · Dec 20 '24

https://github.com/Softcatala/whisper-ctranslate2/blob/1a4b8ee8dfd2999f77628b9e39dd5ddf4cf7fe82/src/whisper_ctranslate2/diarization.py#L38

Here I changed `self.device` to `"mps"`.
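
For anyone who wants to try it, here is a minimal sketch of what the hack amounts to. It paraphrases the linked code rather than quoting it; only the `self.device` attribute and the `"mps"` value come from the comment above, the rest is assumed.

```python
# diarization.py, paraphrased sketch (not the literal upstream code)
import torch


class Diarization:
    def __init__(self, device: str = "cpu"):
        # Upstream presumably keeps whatever --device value the CLI passed in.
        # The hack forces Apple's Metal backend instead, which works here
        # because the diarization model runs on PyTorch and understands "mps".
        if torch.backends.mps.is_available():
            device = "mps"
        self.device = device
```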

ozio · Dec 21 '24

I guess this is because https://github.com/OpenNMT/CTranslate2 does not support mps?

unclefomotw · Jan 07 '25

I tried it. Editing `diarization.py` alone does run, but it is not enough by itself.

You also need to add `mps` to the `--device` parameter choices in `commandline.py`.

Then, running with `--device mps` fails with `ValueError: unsupported device mps` at `self.model = ctranslate2.models.Whisper(...)`.

So it is indeed a CTranslate2 limitation.
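
For reference, a minimal sketch of where it stops, assuming a converted Whisper model directory on disk (the path below is a placeholder):

```python
import ctranslate2

# The device string is validated inside CTranslate2 itself; current releases
# accept "cpu", "cuda" or "auto", so this raises
# "ValueError: unsupported device mps" no matter what the CLI layer allows.
model = ctranslate2.models.Whisper("path/to/converted-whisper-model", device="mps")
```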

ghost · May 15 '25

@xx1adfasd yep, I was wrong; the hardcoded `"mps"` works for diarization only.

ozio · May 16 '25