pip install flash-attn always fails with ModuleNotFoundError: No module named 'packaging', even though I have already run pip install packaging.
Collecting flash-attn
  Using cached flash_attn-2.0.7.tar.gz (2.2 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error

  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [20 lines of output]
      Traceback (most recent call last):
        File "C:\Users\24259\AppData\Local\Programs\Python\Python311\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 353, in <module>
      note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.

Running pip install flash-attn a second time from PS C:\d\code_program\python\ai> produces exactly the same output.
Try this:
pip install -U wheel
Someone reported this worked in their environment, but when we tried it in a fresh Docker/conda env it did not work; neither did installing packaging.
See https://github.com/OpenAccess-AI-Collective/axolotl/pull/426/files for the Dockerfile.
Same question here, and it is not working:

(plaid) PS D:\Users\12625\PycharmProjects\plaid-main\plaid-main> pip install -U wheel
Requirement already satisfied: wheel in c:\users\12625\anaconda3\envs\plaid\lib\site-packages (0.38.4)
Collecting wheel
  Obtaining dependency information for wheel from https://files.pythonhosted.org/packages/b8/8b/31273bf66016be6ad22bb7345c37ff350276cfd46e389a0c2ac5da9d9073/wheel-0.41.2-py3-none-any.whl.metadata
  Using cached wheel-0.41.2-py3-none-any.whl.metadata (2.2 kB)
Using cached wheel-0.41.2-py3-none-any.whl (64 kB)
Installing collected packages: wheel
  Attempting uninstall: wheel
    Found existing installation: wheel 0.38.4
    Uninstalling wheel-0.38.4:
      Successfully uninstalled wheel-0.38.4
Successfully installed wheel-0.41.2

(plaid) PS D:\Users\12625\PycharmProjects\plaid-main\plaid-main> pip install flash-attn
Collecting flash-attn
  Using cached flash_attn-2.0.9.tar.gz (2.2 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error

  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [17 lines of output]
      Traceback (most recent call last):
        File "C:\Users\12625\anaconda3\envs\plaid\lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 353, in <module>
          main()
        File "C:\Users\12625\anaconda3\envs\plaid\lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 335, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
        File "C:\Users\12625\anaconda3\envs\plaid\lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 118, in get_requires_for_build_wheel
          return hook(config_settings)
        File "C:\Users\12625\AppData\Local\Temp\pip-build-env-lwydw453\overlay\Lib\site-packages\setuptools\build_meta.py", line 355, in get_requires_for_build_wheel
          return self._get_build_requires(config_settings, requirements=['wheel'])
        File "C:\Users\12625\AppData\Local\Temp\pip-build-env-lwydw453\overlay\Lib\site-packages\setuptools\build_meta.py", line 325, in _get_build_requires
          self.run_setup()
        File "C:\Users\12625\AppData\Local\Temp\pip-build-env-lwydw453\overlay\Lib\site-packages\setuptools\build_meta.py", line 507, in run_setup
          super(_BuildMetaLegacyBackend, self).run_setup(setup_script=setup_script)
        File "C:\Users\12625\AppData\Local\Temp\pip-build-env-lwydw453\overlay\Lib\site-packages\setuptools\build_meta.py", line 341, in run_setup
          exec(code, locals())
        File "<string>", line 8, in <module>
      ModuleNotFoundError: No module named 'packaging'
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.
Have you solved it?
Use:
pip install -U flash-attn --no-build-isolation
I experienced the same issue; it seems to be caused by the order of installation.
In my case, I removed flash-attn from requirements.txt and ran pip install -r requirements.txt.
After the other packages were installed, I ran pip install flash-attn --no-build-isolation.
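The ordering workaround above can be sketched as the following two commands (assuming flash-attn has already been removed from requirements.txt; file names are illustrative):

```shell
# 1. Install everything else first, so torch and packaging end up in the env.
pip install -r requirements.txt

# 2. Then build flash-attn against the environment itself: --no-build-isolation
#    lets its setup.py import the torch/packaging installed in step 1.
pip install flash-attn --no-build-isolation
```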
The issue here is that once you add a pyproject.toml, pip will use it and enable build isolation.
To make isolation work, we would need to add to the toml:
[build-system]
requires = [
"setuptools",
"packaging",
"wheel",
"torch",
]
However, that can be annoying too, since it takes longer to install torch in an isolated environment, especially when it's just downloading the binary wheels anyway.
I wonder if there's a way to ignore the pyproject.toml's presence when generating the packages, so they don't try to use build isolation.
Fixed by https://github.com/Dao-AILab/flash-attention/commit/73bd3f3bbb6775c5286e4b095efbc62d9fd4e5dd.
Hey @tmm1, sorry to bother
Still facing the same issue:
MAX_JOBS=4 pip install -U flash-attn --no-build-isolation
Collecting flash-attn
Using cached flash_attn-2.1.0.tar.gz (2.2 MB)
Preparing metadata (pyproject.toml) ... error
error: subprocess-exited-with-error
× Preparing metadata (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [15 lines of output]
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in <module>
main()
File "/usr/local/lib/python3.10/dist-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
json_out['return_val'] = hook(**hook_input['kwargs'])
File "/usr/local/lib/python3.10/dist-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 149, in prepare_metadata_for_build_wheel
return hook(metadata_directory, config_settings)
File "/usr/local/lib/python3.10/dist-packages/setuptools/build_meta.py", line 380, in prepare_metadata_for_build_wheel
self.run_setup()
File "/usr/local/lib/python3.10/dist-packages/setuptools/build_meta.py", line 487, in run_setup
super(_BuildMetaLegacyBackend,
File "/usr/local/lib/python3.10/dist-packages/setuptools/build_meta.py", line 338, in run_setup
exec(code, locals())
File "<string>", line 8, in <module>
ModuleNotFoundError: No module named 'packaging'
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed
× Encountered error while generating package metadata.
╰─> See above for output.
note: This is an issue with the package mentioned above, not pip.
The fix is out. Try:
pip install -v flash-attn==2.1.1
@tmm1 thanks, it works =)
@tridao this issue can be closed. If you want to give me issue maint privileges, I can help out keeping things tidy.
Same issue for me in v2.1.2.post3
pip install -v flash-attn==2.1.1 doesn't help either
I'm trying pip install git+https://github.com/HazyResearch/flash-attention.git#subdirectory=csrc/rotary and getting this error:
Collecting git+https://github.com/HazyResearch/flash-attention.git#subdirectory=csrc/rotary
Cloning https://github.com/HazyResearch/flash-attention.git to /tmp/pip-req-build-fmhz3e3e
Running command git clone --filter=blob:none --quiet https://github.com/HazyResearch/flash-attention.git /tmp/pip-req-build-fmhz3e3e
Resolved https://github.com/HazyResearch/flash-attention.git to commit 913922cac57efd7c5e05f08155b37e74c427cf32
Running command git submodule update --init --recursive -q
Installing build dependencies ... done
Getting requirements to build wheel ... error
error: subprocess-exited-with-error
× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> [20 lines of output]
Traceback (most recent call last):
File "/home/beck/.pyenv/versions/3.11.4/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in <module>
main()
File "/home/beck/.pyenv/versions/3.11.4/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
json_out['return_val'] = hook(**hook_input['kwargs'])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/beck/.pyenv/versions/3.11.4/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 118, in get_requires_for_build_wheel
return hook(config_settings)
^^^^^^^^^^^^^^^^^^^^^
File "/tmp/pip-build-env-jskeyd_r/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 355, in get_requires_for_build_wheel
return self._get_build_requires(config_settings, requirements=['wheel'])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/tmp/pip-build-env-jskeyd_r/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 325, in _get_build_requires
self.run_setup()
File "/tmp/pip-build-env-jskeyd_r/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 507, in run_setup
super(_BuildMetaLegacyBackend, self).run_setup(setup_script=setup_script)
File "/tmp/pip-build-env-jskeyd_r/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 341, in run_setup
exec(code, locals())
File "<string>", line 5, in <module>
ModuleNotFoundError: No module named 'packaging'
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error
× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.
note: This error originates from a subprocess, and is likely not a problem with pip.
Did you try --no-build-isolation?
Not working either
hi,is there a solution now?
pip install packaging
then
pip install flash-attn --no-build-isolation.
No joy on this; it was missing wheel. Installing the wheel package then just produced another error.
This still seems to be an ongoing issue when adding to poetry.lock.
ModuleNotFoundError: No module named 'packaging'
at ~/.local/share/pypoetry/venv/lib/python3.10/site-packages/poetry/installation/chef.py:147 in _prepare
143│
144│ error = ChefBuildError("\n\n".join(message_parts))
145│
146│ if error is not None:
→ 147│ raise error from None
148│
149│ return path
150│
151│ def _prepare_sdist(self, archive: Path, destination: Path | None = None) -> Path:
Note: This error originates from the build backend, and is likely not a problem with poetry but with flash-attn (2.4.2) not supporting PEP 517 builds. You can verify this by running 'pip wheel --use-pep517 "flash-attn (==2.4.2)"'.
Pip installing doesn't freeze the dependencies, so there's no guarantee of matching dependencies in a production environment with the current solution (the one provided by @tridao; thank you for sharing it).
Yeah, I don't have much bandwidth to spend on packaging; Python packaging is kind of a mess once you involve CUDA and torch. I just use Docker for reproducible environments.
The issue is still relevant when you want to use a requirements.txt file combined with a build system. Can we add packaging and ninja properly to the dependencies of flash-attn, so that pip can manage the dependencies? This would significantly simplify the installation process.
Indeed, --no-build-isolation isn't allowed in requirements.txt in the most recent version of pip (23).
requirements.txt users will have to manually pick a wheel URL instead of declaring flash-attn==2.5.2.
E.g., in my requirements.txt:
https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.2/flash_attn-2.5.2+cu118torch2.1cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
Despite being manual, this is more flexible, since setup.py picks the wheel based on the CUDA and torch versions of the machine where the build happens. In real life, you might build and package on build-farm machines that don't have the same CUDA version as your worker machines.
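The release assets encode the build target in the file name, so picking the right URL is mechanical. Below is a small sketch that composes the name; the function and its parameters are made up for illustration, and the naming convention is inferred from the v2.5.2 release asset quoted above, so check the GitHub release page for the tags actually published for your versions:

```python
# Compose a flash-attn prebuilt-wheel filename for a given build target.
# Convention inferred from the release assets; not an official API.
def flash_attn_wheel_name(flash_ver, cuda, torch_mm, py_tag, cxx11_abi=False):
    abi = "TRUE" if cxx11_abi else "FALSE"
    return (
        f"flash_attn-{flash_ver}+cu{cuda}torch{torch_mm}"
        f"cxx11abi{abi}-{py_tag}-{py_tag}-linux_x86_64.whl"
    )

# Reproduces the asset name in the requirements.txt example above:
print(flash_attn_wheel_name("2.5.2", "118", "2.1", "cp310"))
# → flash_attn-2.5.2+cu118torch2.1cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
```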
I'm definitely not an expert on this topic, but couldn't this problem be solved by publishing the .whl files to pypi.org? Compare e.g. vllm's index (https://pypi.org/simple/vllm/) with the index of flash-attention (https://pypi.org/simple/flash-attn/). The former contains .whl files, the latter does not.
PyPI has a file size limit of 60MB.
Has the error with poetry.lock been fixed? I am still facing issues when trying to install flash_attn with Poetry:
~/GitHub/test-vllm$ poetry add flash_attn
Using version ^2.5.8 for flash-attn
Updating dependencies
Resolving dependencies... (0.8s)
Package operations: 1 install, 0 updates, 0 removals
- Installing flash-attn (2.5.8): Failed
ChefBuildError
Backend subprocess exited when trying to invoke get_requires_for_build_wheel
Traceback (most recent call last):
File "/home/ubuntu/.local/share/pipx/venvs/poetry/lib/python3.10/site-packages/pyproject_hooks/_in_process/_in_process.py", line 373, in <module>
main()
File "/home/ubuntu/.local/share/pipx/venvs/poetry/lib/python3.10/site-packages/pyproject_hooks/_in_process/_in_process.py", line 357, in main
json_out["return_val"] = hook(**hook_input["kwargs"])
File "/home/ubuntu/.local/share/pipx/venvs/poetry/lib/python3.10/site-packages/pyproject_hooks/_in_process/_in_process.py", line 134, in get_requires_for_build_wheel
return hook(config_settings)
File "/tmp/tmpdgvzy9wj/.venv/lib/python3.10/site-packages/setuptools/build_meta.py", line 325, in get_requires_for_build_wheel
return self._get_build_requires(config_settings, requirements=['wheel'])
File "/tmp/tmpdgvzy9wj/.venv/lib/python3.10/site-packages/setuptools/build_meta.py", line 295, in _get_build_requires
self.run_setup()
File "/tmp/tmpdgvzy9wj/.venv/lib/python3.10/site-packages/setuptools/build_meta.py", line 487, in run_setup
super().run_setup(setup_script=setup_script)
File "/tmp/tmpdgvzy9wj/.venv/lib/python3.10/site-packages/setuptools/build_meta.py", line 311, in run_setup
exec(code, locals())
File "<string>", line 9, in <module>
ModuleNotFoundError: No module named 'packaging'
at ~/.local/share/pipx/venvs/poetry/lib/python3.10/site-packages/poetry/installation/chef.py:164 in _prepare
160│
161│ error = ChefBuildError("\n\n".join(message_parts))
162│
163│ if error is not None:
→ 164│ raise error from None
165│
166│ return path
167│
168│ def _prepare_sdist(self, archive: Path, destination: Path | None = None) -> Path:
Note: This error originates from the build backend, and is likely not a problem with poetry but with flash-attn (2.5.8) not supporting PEP 517 builds. You can verify this by running 'pip wheel --no-cache-dir --use-pep517 "flash-attn (==2.5.8)"'.
The solution (required for older torch versions, e.g. 2.0.1) was to correct the obsolete way of importing the packaging module from pkg_resources (distributed with setuptools and not to be confused with the packaging Python package), so that newer major versions of setuptools (70.0.0 and above) are supported. So upgrading to the latest torch is the preferred solution now.
A quick and dirty workaround that lets us install flash-attn without risking a torch upgrade is to downgrade setuptools:
pip install "setuptools<70.0.0" && pip install flash-attn  # --no-build-isolation is not relevant to this issue
More info (from build logs):
Collecting flash-attn
Downloading flash_attn-2.5.8.tar.gz (2.5 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.5/2.5 MB 101.1 MB/s eta 0:00:00
Preparing metadata (setup.py) ... error
error: subprocess-exited-with-error
× python setup.py egg_info did not run successfully.
│ exit code: 1
╰─> [9 lines of output]
Traceback (most recent call last):
File "<string>", line 2, in <module>
File "<pip-setuptools-caller>", line 34, in <module>
File "/tmp/pip-install-fwyobwo6/flash-attn_86cb3e09b5d749c792dbc302c1822f5f/setup.py", line 20, in <module>
from torch.utils.cpp_extension import (
File "/opt/conda/lib/python3.11/site-packages/torch/utils/cpp_extension.py", line 25, in <module>
from pkg_resources import packaging # type: ignore[attr-defined]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ImportError: cannot import name 'packaging' from 'pkg_resources' (/opt/conda/lib/python3.11/site-packages/pkg_resources/__init__.py)
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed
× Encountered error while generating package metadata.
╰─> See above for output.
note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
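The import fix behind the logs above boils down to replacing the pkg_resources re-export with a direct import of the packaging package. A minimal sketch (the version comparison is only illustrative of the kind of check torch performs):

```python
# Old pattern used by torch <= 2.0.x, which breaks once setuptools >= 70.0.0
# stops re-exporting packaging from pkg_resources:
#     from pkg_resources import packaging
#     packaging.version.parse(...)
#
# Supported pattern: import the packaging package directly.
from packaging import version

# Illustrative version check, similar to what cpp_extension does:
print(version.parse("2.0.1") >= version.parse("2.1.0"))
# → False
```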
pip3 install -U pip
pip3 install packaging
The above commands worked for me.