faster-whisper

Is it possible to use the Python CUDA libraries from a virtual env? Original Whisper is super easy to install since it doesn't require me to change my system CUDA version and simply pulls in the needed libs using pip.

Gobz opened this issue 1 year ago • 23 comments

Gobz avatar Apr 15 '23 17:04 Gobz

Currently you can already make use of these libraries but you need to manually set the environment variable LD_LIBRARY_PATH before running Python. I verified that the following works:

pip install nvidia-cublas-cu11 nvidia-cudnn-cu11

export LD_LIBRARY_PATH=`python3 -c 'import os; import nvidia.cublas.lib; import nvidia.cudnn.lib; print(os.path.dirname(nvidia.cublas.lib.__file__) + ":" + os.path.dirname(nvidia.cudnn.lib.__file__))'`

(Note that this only works on Linux systems.)

I will look if we can load these libraries automatically when they are installed. It's an improvement we should make in the underlying CTranslate2 library.

guillaumekln avatar Apr 16 '23 09:04 guillaumekln
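As an optional sanity check (a sketch of my own, not from the comment above), you can confirm that the exported directories actually contain the cuBLAS/cuDNN shared objects before launching faster-whisper:

# Sketch: list the libcublas/libcudnn files in each directory on LD_LIBRARY_PATH.
import os

for d in os.environ.get("LD_LIBRARY_PATH", "").split(":"):
    if d and os.path.isdir(d):
        libs = [f for f in os.listdir(d) if f.startswith(("libcublas", "libcudnn"))]
        print(d, "->", libs or "no cuBLAS/cuDNN libraries found")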

> Currently you can already make use of these libraries but you need to manually set the environment variable LD_LIBRARY_PATH before running Python. […]

Is it possible to get it to work with a venv on Windows by any chance? I have CUDA 11.8 and cuDNN for CUDA 11.x installed properly, but it's not working.

Could not load library cudnn_cnn_infer64_8.dll. Error code 126
Please make sure cudnn_cnn_infer64_8.dll is in your library path!

Would installing PyTorch help?

What's really weird is that a few days ago it worked fine without any CUDA, cuDNN, or zlib installation. After a clean install of Windows, it doesn't work.

Edited: the same error occurs even in a non-virtual environment. I did everything, including installing CUDA 11.8, cuDNN for CUDA 11.x, and zlib, and adding them to PATH, but I don't know why this is happening.

hoonlight avatar Jun 10 '23 15:06 hoonlight

This solution does not work on Windows because NVIDIA only provides Linux binaries for the cuDNN package:

https://pypi.org/project/nvidia-cudnn-cu11/#files

Installing PyTorch will not help in this case.

> I did everything, including installing CUDA 11.8, cuDNN for CUDA 11.x, and zlib, and adding them to PATH, but I don't know why this is happening.

Maybe you should double-check the PATH setting. I know it can be tricky to get it right.

You could also look at @Purfview's standalone executable: https://github.com/Purfview/whisper-standalone-win. There you can download the NVIDIA libraries and simply put them in the same directory as the executable.

guillaumekln avatar Jun 12 '23 08:06 guillaumekln
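For Windows debugging specifically, a small sketch like the following (my own, with DLL names assumed for CUDA 11.x / cuDNN 8.x builds, plus the zlib DLL that cuDNN 8 needs) can show which library is actually missing from the search path:

# Sketch: try to load each DLL that faster-whisper/CTranslate2 typically needs
# on Windows, so the missing one is easy to spot. DLL names are assumptions
# for CUDA 11.x / cuDNN 8.x.
import ctypes

for name in ("cublas64_11.dll", "cudnn_ops_infer64_8.dll",
             "cudnn_cnn_infer64_8.dll", "zlibwapi.dll"):
    try:
        ctypes.WinDLL(name)
        print("OK     ", name)
    except OSError as exc:
        print("MISSING", name, "-", exc)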

> This solution does not work on Windows because NVIDIA only provides Linux binaries for the cuDNN package: https://pypi.org/project/nvidia-cudnn-cu11/#files […]

Thanks, I'll try again to make sure I'm not missing anything. I just have one question: is the installation of CUDA and cuDNN "essential" for running faster-whisper on a GPU? I don't remember installing cuDNN before the Windows reinstall, and yet device="cuda" just worked fine back then. If they are required, my memory is probably wrong.

hoonlight avatar Jun 12 '23 13:06 hoonlight

Yes, these libraries are required for GPU execution. An error is raised when you try to use the GPU but these libraries cannot be found.

guillaumekln avatar Jun 12 '23 13:06 guillaumekln

Following this solution, I wrote the code below:

try:
    import os
    import nvidia.cublas.lib
    import nvidia.cudnn.lib

    cublas_path = os.path.dirname(nvidia.cublas.lib.__file__)
    cudnn_path = os.path.dirname(nvidia.cudnn.lib.__file__)
    os.environ["LD_LIBRARY_PATH"] = f"{cublas_path}:{cudnn_path}"
except ModuleNotFoundError:
    pass

But I still get the same error, as if the path were not set.

yangyaofei avatar Jun 13 '23 08:06 yangyaofei

The environment variable LD_LIBRARY_PATH should be set before starting the Python process.

guillaumekln avatar Jun 13 '23 08:06 guillaumekln
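For illustration, here is a minimal sketch (mine, not part of faster-whisper) of one way to respect that constraint from inside a script: set the variable and re-exec the interpreter before importing faster_whisper, since on glibc systems the loader caches LD_LIBRARY_PATH at process start. The guard variable name is arbitrary, and the sketch assumes __file__ is populated (later comments note it can be None for namespace packages).

# Sketch: re-exec the interpreter with LD_LIBRARY_PATH set so the dynamic
# loader picks it up; the FW_LIBS_EXPORTED guard prevents an exec loop.
import os
import sys

if os.environ.get("FW_LIBS_EXPORTED") != "1":
    import nvidia.cublas.lib
    import nvidia.cudnn.lib

    paths = [
        os.path.dirname(nvidia.cublas.lib.__file__),
        os.path.dirname(nvidia.cudnn.lib.__file__),
    ]
    os.environ["LD_LIBRARY_PATH"] = ":".join(
        p for p in paths + [os.environ.get("LD_LIBRARY_PATH", "")] if p
    )
    os.environ["FW_LIBS_EXPORTED"] = "1"
    os.execv(sys.executable, [sys.executable] + sys.argv)

# Only reached in the re-executed process, where the new path is visible.
from faster_whisper import WhisperModel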

@guillaumekln Oh, thank you for explaining this. It seems a once-and-for-all solution is not easy 😂.

yangyaofei avatar Jun 13 '23 08:06 yangyaofei

I'm pretty certain that NVIDIA offers cuDNN files of some sort for Windows at https://developer.nvidia.com/rdp/cudnn-download. I did get faster-whisper to work on the GPU on Windows, but I remember it being quite the hassle, whether due to my inexperience or genuine difficulty.

Feanix-Fyre avatar Jun 13 '23 21:06 Feanix-Fyre

Yes, you can download cuDNN binaries for Windows on the NVIDIA website.

But this issue is about installing cuDNN via PyPI with:

pip install nvidia-cudnn-cu11

This does not work on Windows.

guillaumekln avatar Jun 13 '23 21:06 guillaumekln

If you're on Windows, a functioning workaround I've found is to install torch with cuda support, then add the "lib" subfolder to your PATH.

It works since the lib folder contains DLLs for both cuBLAS and cuDNN.

lugia19 avatar Jul 03 '23 13:07 lugia19
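A quick sketch (mine, assuming a CUDA-enabled torch wheel is installed) to locate that "lib" folder and confirm it contains the relevant DLLs:

# Sketch: print torch's "lib" directory, which ships the cuBLAS/cuDNN DLLs on
# Windows, and list the relevant files so you know what to add to PATH.
import os
import torch

lib_dir = os.path.join(os.path.dirname(torch.__file__), "lib")
print(lib_dir)
print([f for f in os.listdir(lib_dir) if f.startswith(("cublas", "cudnn"))])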

The trick in https://github.com/guillaumekln/faster-whisper/issues/153#issuecomment-1510218906 is not working for me. What am I doing wrong?

(faster-whisper) vadi@barbar:~$ pip install nvidia-cublas-cu11 nvidia-cudnn-cu11
Requirement already satisfied: nvidia-cublas-cu11 in ./Programs/miniconda3/envs/faster-whisper/lib/python3.10/site-packages (11.11.3.6)
Requirement already satisfied: nvidia-cudnn-cu11 in ./Programs/miniconda3/envs/faster-whisper/lib/python3.10/site-packages (8.9.2.26)
(faster-whisper) vadi@barbar:~$ export LD_LIBRARY_PATH=`python3 -c 'import os; import nvidia.cublas.lib; import nvidia.cudnn.lib; print(os.path.dirname(nvidia.cublas.lib.__file__) + ":" + os.path.dirname(nvidia.cudnn.lib.__file__))'`
(faster-whisper) vadi@barbar:~$ echo $LD_LIBRARY_PATH 
/home/vadi/Programs/miniconda3/envs/faster-whisper/lib/python3.10/site-packages/nvidia/cublas/lib:/home/vadi/Programs/miniconda3/envs/faster-whisper/lib/python3.10/site-packages/nvidia/cudnn/lib
(faster-whisper) vadi@barbar:~$ python ~/Downloads/faster-whisper.py 
Traceback (most recent call last):
  File "/home/vadi/Downloads/faster-whisper.py", line 6, in <module>
    model = WhisperModel(model_size, device="cuda", compute_type="float16")
  File "/home/vadi/Programs/miniconda3/envs/faster-whisper/lib/python3.10/site-packages/faster_whisper/transcribe.py", line 124, in __init__
    self.model = ctranslate2.models.Whisper(
RuntimeError: CUDA failed with error unknown error
(faster-whisper) vadi@barbar:~$ 

Running Nvidia driver 535.54.03 on RTX 4080 on Ubuntu 22.04.

vadi2 avatar Aug 01 '23 16:08 vadi2

This is probably another issue. I think it happens when the GPU driver is not loaded correctly (e.g. it was just updated to a new version). Rebooting the system will often fix this type of error.

guillaumekln avatar Aug 02 '23 05:08 guillaumekln

It did. Thanks!

vadi2 avatar Aug 02 '23 06:08 vadi2

nvidia.cublas.lib.__file__

Why is my nvidia.cublas.lib.__file__ attribute None? Because of that, the environment variable fails to be set, and when I run faster_whisper I encounter the error "Could not load library libcudnn_ops_infer.so.8. Error: libcudnn_ops_infer.so.8: cannot open shared object file: No such file or directory. Please make sure libcudnn_ops_infer.so.8 is in your library path!"

Simon-chai avatar Sep 20 '23 10:09 Simon-chai

The error is about cuDNN, not cuBLAS.

You should double-check that you correctly installed the pip packages as shown in https://github.com/guillaumekln/faster-whisper/issues/153#issuecomment-1510218906.

guillaumekln avatar Sep 21 '23 11:09 guillaumekln

> Currently you can already make use of these libraries but you need to manually set the environment variable LD_LIBRARY_PATH before running Python. […]

Doesn't work for me on Ubuntu (WSL).

__file__ is None:

>>> print(nvidia.cublas.lib.__file__)
None

But the libraries are installed:

andy@work:~$ pip install nvidia-cublas-cu11 nvidia-cudnn-cu11
Defaulting to user installation because normal site-packages is not writeable
Requirement already satisfied: nvidia-cublas-cu11 in ./.local/lib/python3.8/site-packages (11.10.3.66)
Requirement already satisfied: nvidia-cudnn-cu11 in ./.local/lib/python3.8/site-packages (8.5.0.96)
Requirement already satisfied: setuptools in /usr/local/lib/python3.8/dist-packages (from nvidia-cublas-cu11) (68.2.2)
Requirement already satisfied: wheel in ./.local/lib/python3.8/site-packages (from nvidia-cublas-cu11) (0.40.0)
>>> import os; import nvidia.cublas.lib; import nvidia.cudnn.lib;
>>> print(nvidia.cublas.lib)
<module 'nvidia.cublas.lib' (namespace)>

Setting the paths manually like so worked:

export LD_LIBRARY_PATH="$HOME/.local/lib/python3.8/site-packages/nvidia/cublas/lib/:$HOME/.local/lib/python3.8/site-packages/nvidia/cudnn/lib/"

s-h-a-d-o-w avatar Oct 06 '23 18:10 s-h-a-d-o-w
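A possible explanation and a sketch (mine, not from the thread): nvidia.cublas.lib and nvidia.cudnn.lib are namespace packages, so __file__ can be None even though the package is installed; __path__ still lists the package directory and avoids hard-coding the site-packages location:

# Sketch: build the library path from __path__, which works for namespace
# packages even when __file__ is None.
import nvidia.cublas.lib
import nvidia.cudnn.lib

print(":".join(list(nvidia.cublas.lib.__path__) + list(nvidia.cudnn.lib.__path__)))
# Export the printed value as LD_LIBRARY_PATH before starting Python.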

> Currently you can already make use of these libraries but you need to manually set the environment variable LD_LIBRARY_PATH before running Python. […]

Very cool, thanks for the tip! Btw if you're using a conda env you can set the env var like this (in your environment):

conda env config vars set LD_LIBRARY_PATH=`python3 -c 'import os; import nvidia.cublas.lib; import nvidia.cudnn.lib; print(os.path.dirname(nvidia.cublas.lib.__file__) + ":" + os.path.dirname(nvidia.cudnn.lib.__file__))'`

It will overwrite the default LD_LIBRARY_PATH.

gu-ma avatar Nov 30 '23 20:11 gu-ma

> This solution does not work on Windows because NVIDIA only provides Linux binaries for the cuDNN package […]

Error on Windows:

....
RuntimeError: Library cublas64_11.dll is not found or cannot be loaded

Download https://github.com/Purfview/whisper-standalone-win/releases/download/libs/cuBLAS.and.cuDNN_win_v3.zip, unzip the *.dll files into the project path, and add these lines to the app:

import ctypes
cublas64_11 = ctypes.WinDLL(r'.\cublas64_11.dll')

pauloboritza avatar Dec 13 '23 01:12 pauloboritza

> If you're on Windows, a functioning workaround I've found is to install torch with cuda support, then add the "lib" subfolder to your PATH. […]

It's been a while, but what command did you use to install torch specifically?

santiago-afonso avatar Feb 07 '24 12:02 santiago-afonso

> It's been a while, but what command did you use to install torch specifically?

Oh, it was just the usual pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118 type deal.

Just installing torch with CUDA.

lugia19 avatar Feb 07 '24 12:02 lugia19

> Currently you can already make use of these libraries but you need to manually set the environment variable LD_LIBRARY_PATH before running Python. […]

Hi @guillaumekln, may I know if this improvement has been made in CTranslate2? I'm using CUDA 12.1 and I'm facing the same issue. Exporting the LD_LIBRARY_PATH env variable does work, though.

Happy to help for anything you'd need.

louistiti avatar May 21 '24 15:05 louistiti