faster-whisper
Support for running through Docker
Due to the requirement of a specific CUDA version and other libraries, it would be nice to add support for running faster-whisper through a Docker container, as it would save new users all the hassle of setting up the environment.
You can use the base image `pytorch/pytorch:2.1.0-cuda12.1-cudnn8-runtime` (or whichever tag matches your CUDA version), then simply install faster-whisper and onnxruntime-gpu, and you will be good to go.
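The steps above could be sketched as a Dockerfile along these lines; note this is only a minimal sketch, and the base image tag should be swapped for whatever matches your CUDA driver:

```dockerfile
# Base image with CUDA 12.1 and cuDNN 8 (assumed tag — adjust to your CUDA version)
FROM pytorch/pytorch:2.1.0-cuda12.1-cudnn8-runtime

# Install faster-whisper and the GPU ONNX runtime
RUN pip install --no-cache-dir faster-whisper onnxruntime-gpu

# Drop into Python by default; replace with your own entrypoint or script
CMD ["python3"]
```

It could then be built and run with something like `docker build -t faster-whisper .` followed by `docker run --gpus all -it faster-whisper` (the `--gpus all` flag requires the NVIDIA Container Toolkit on the host).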
@qxprakash, hello. You can refer to the Dockerfile in this comment.