
[Bad Case]: RuntimeError: The detected CUDA version (11.8) mismatches the version that was used to compile PyTorch (12.1). Please make sure to use the same CUDA versions.

[Open] sunbeibei-hub opened this issue • 1 comment

Description / 描述

Running:

pip install inference/vllm

fails with:

    RuntimeError:
          The detected CUDA version (11.8) mismatches the version that was used to compile
          PyTorch (12.1). Please make sure to use the same CUDA versions.

The CUDA version on my Linux machine is:

    (bei_MiniCPM) [search@search-chatGLM-02 MiniCPM]$ pip show torch
    WARNING: Package(s) not found: torch
    (bei_MiniCPM) [search@search-chatGLM-02 MiniCPM]$ nvcc --version
    nvcc: NVIDIA (R) Cuda compiler driver
    Copyright (c) 2005-2022 NVIDIA Corporation
    Built on Wed_Sep_21_10:33:58_PDT_2022
    Cuda compilation tools, release 11.8, V11.8.89
    Build cuda_11.8.r11.8/compiler.31833905_0
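The error above comes from PyTorch's extension build check, which (to my understanding) compares the major.minor version of the system's `nvcc` against the CUDA version the installed PyTorch wheel was compiled with. A minimal sketch of that comparison, with a hypothetical helper name and the two versions from the error message:

```python
def cuda_versions_match(nvcc_version: str, torch_cuda_version: str) -> bool:
    """Hypothetical helper: True when the CUDA toolkit and the PyTorch build
    agree on the major.minor version, which is roughly what the build check
    behind this RuntimeError requires."""
    return nvcc_version.split(".")[:2] == torch_cuda_version.split(".")[:2]

# The versions from the error: system toolkit 11.8, PyTorch compiled with 12.1.
print(cuda_versions_match("11.8", "12.1"))  # mismatch -> False, build aborts
print(cuda_versions_match("11.8", "11.8"))  # match    -> True, build proceeds
```

With PyTorch itself installed, the build-side version can be read directly from `torch.version.cuda` and compared against `nvcc --version`.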

Please tell me: does vLLM 0.2.2 not work with CUDA 11.8?

Case Explaination / 案例解释

I will try the vLLM installation guide.

— sunbeibei-hub, Feb 04 '24

Related issue in the vLLM repo: https://github.com/vllm-project/vllm/issues/2219

— a710128, Feb 04 '24

A simple solution in a conda env is: `conda install pytorch=<your_version> pytorch-cuda=12.1 -c pytorch -c nvidia`
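Alternatively, instead of upgrading the toolkit side, one could keep the system's CUDA 11.8 and install a PyTorch build compiled against 11.8 so the two versions agree. A sketch of that route (the exact package availability depends on the PyTorch version; verify on the `pytorch` channel before relying on it):

```shell
# Install a PyTorch build compiled against CUDA 11.8 to match nvcc (release 11.8 above).
conda install pytorch pytorch-cuda=11.8 -c pytorch -c nvidia

# Verify that the PyTorch build's CUDA version now matches the toolkit:
python -c "import torch; print(torch.version.cuda)"   # should print 11.8
nvcc --version
```

Either direction works; what matters for the build check is that `torch.version.cuda` and the `nvcc` release line up.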

— SwordFaith, Feb 09 '24