
"ModuleNotFoundError: No module named 'torch'" while installing from pip

Open alex4321 opened this issue 2 years ago • 6 comments

Hi. I have a Windows 10 machine with a Conda installation on it:

(llama) C:\Users\alex4321>conda --version
conda 23.3.1

I have a Conda environment with Python:

(llama) C:\Users\alex4321>python --version
Python 3.11.4

Torch was installed with the following command:

(llama) conda install pytorch torchvision torchaudio pytorch-cuda=11.7 -c pytorch -c nvidia

But when I try to install this library, I get:

(llama) C:\Users\alex4321>python -m pip install flash-attn
Collecting flash-attn
  Using cached flash_attn-1.0.8.tar.gz (2.0 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error

  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [18 lines of output]
      Traceback (most recent call last):
        File "C:\Users\alex4321\anaconda3\envs\llama\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 353, in <module>
          main()
        File "C:\Users\alex4321\anaconda3\envs\llama\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 335, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "C:\Users\alex4321\anaconda3\envs\llama\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 118, in get_requires_for_build_wheel
          return hook(config_settings)
                 ^^^^^^^^^^^^^^^^^^^^^
        File "C:\Users\alex4321\AppData\Local\Temp\pip-build-env-k1ihydf0\overlay\Lib\site-packages\setuptools\build_meta.py", line 341, in get_requires_for_build_wheel
          return self._get_build_requires(config_settings, requirements=['wheel'])
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "C:\Users\alex4321\AppData\Local\Temp\pip-build-env-k1ihydf0\overlay\Lib\site-packages\setuptools\build_meta.py", line 323, in _get_build_requires
          self.run_setup()
        File "C:\Users\alex4321\AppData\Local\Temp\pip-build-env-k1ihydf0\overlay\Lib\site-packages\setuptools\build_meta.py", line 338, in run_setup
          exec(code, locals())
        File "<string>", line 13, in <module>
      ModuleNotFoundError: No module named 'torch'
      [end of output]

This happens despite the fact that Torch is installed and importable:

(llama) C:\Users\alex4321>python -m pip show torch
Name: torch
Version: 2.0.1
Summary: Tensors and Dynamic neural networks in Python with strong GPU acceleration
Home-page: https://pytorch.org/
Author: PyTorch Team
Author-email: [email protected]
License: BSD-3
Location: C:\Users\alex4321\anaconda3\envs\llama\Lib\site-packages
Requires: filelock, jinja2, networkx, sympy, typing-extensions
Required-by: torchaudio, torchvision
(llama) C:\Users\alex4321>python
Python 3.11.4 | packaged by Anaconda, Inc. | (main, Jul  5 2023, 13:47:18) [MSC v.1916 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import torch
>>> torch.Tensor([1.1]).cuda() * 1.1
tensor([1.2100], device='cuda:0')

alex4321 avatar Jul 13 '23 18:07 alex4321

And looking at the setup.py script:

# Adapted from https://github.com/NVIDIA/apex/blob/master/setup.py
import sys
import warnings
import os
import re
import ast
from pathlib import Path
from packaging.version import parse, Version

from setuptools import setup, find_packages
import subprocess

import torch
from torch.utils.cpp_extension import BuildExtension, CppExtension, CUDAExtension, CUDA_HOME

especially

from packaging.version import parse, Version
...
import torch
from torch.utils.cpp_extension import BuildExtension, CppExtension, CUDAExtension, CUDA_HOME

Is it even a good idea to import non-standard libraries in a setup.py script (especially before the dependencies are installed)?
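
(For context: with pip's default PEP 517 build isolation, setup.py runs in a clean temporary environment, so the torch installed in the conda env is not visible to it. The following is a purely illustrative sketch, not flash-attn's actual code, of how a setup.py that needs torch at build time could fail with a clearer message:)

# Illustrative sketch only (not flash-attn's actual setup.py): give a
# clearer error when torch is missing in the (possibly isolated) build env.
try:
    import torch  # noqa: F401
    from torch.utils.cpp_extension import BuildExtension, CUDAExtension  # noqa: F401
except ImportError as exc:
    raise RuntimeError(
        "PyTorch must be importable at build time. Install torch first, "
        "then build with `pip install --no-build-isolation <package>`."
    ) from exc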

alex4321 avatar Jul 13 '23 18:07 alex4321

Can you try pip install --no-build-isolation flash-attn? This code is written as a PyTorch extension, so we need PyTorch in order to compile it.
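
(A minimal sketch of that workaround, run from the same interpreter that already has torch. With --no-build-isolation, pip reuses the current environment, so build-time imports must already be installed there; packaging is needed per the setup.py shown above, while ninja and wheel are assumptions about typical extension-build helpers:)

# Minimal sketch of the suggested workaround. With --no-build-isolation,
# pip no longer creates an isolated build environment, so build-time
# dependencies (torch, packaging as seen in setup.py; ninja/wheel assumed)
# must already be present in the active environment.
import subprocess
import sys

subprocess.check_call([sys.executable, "-m", "pip", "install", "packaging", "ninja", "wheel"])
subprocess.check_call([sys.executable, "-m", "pip", "install", "--no-build-isolation", "flash-attn"])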

tridao avatar Jul 13 '23 19:07 tridao

Well, it is taking a long time this way, so it seems it at least starts the actual compilation. I will post an update with the final status later.

alex4321 avatar Jul 13 '23 19:07 alex4321

Compiled successfully, thanks.

alex4321 avatar Jul 13 '23 19:07 alex4321

This issue is a dupe of #246.

jaraco avatar Mar 26 '24 19:03 jaraco