flash-attention
ModuleNotFoundError: No module named 'torch'
torch is actually installed.
Try `pip install flash-attn --no-build-isolation`; it works for me.
This can't be done within a `setup.py` file in my project, though...
Maybe reverting to an older version of flash-attn is sufficient; I succeeded with flash-attn==1.0.4.
I'm getting the error below with --no-build-isolation as well as with 1.0.4:
ModuleNotFoundError: No module named 'torch.utils'
We recommend the PyTorch container from NVIDIA, which has all the required tools to install FlashAttention.
Looks like the issue was that my Anaconda install was in /anaconda and therefore required sudo. After reinstalling Anaconda in ~/, --no-build-isolation is working now.
Also, I installed the PyTorch nightly build that works with CUDA 12.0.
It would be GREAT to retain the ability to build things outside one specific container!
Seeing `ModuleNotFoundError: No module named 'torch'` during an install usually means the `setup.py` is technically incorrect.
Python needs build-time dependencies to be declared explicitly, and that information isn't being threaded through the entire project definition (and it's not great/safe to be importing other installed libraries at install time, etc.).
It looks like this project borrowed a copy of the broken detectron2 setup script, which they also haven't fixed for years, so the infrastructure contagion is spreading 🫠
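To make the failure mode concrete: under PEP 517 build isolation, pip builds the package in a fresh environment containing only the declared build requirements, so a top-level `import torch` in `setup.py` raises `ModuleNotFoundError` even when torch is installed in the user's environment. A minimal sketch of the deferred-import pattern sometimes used to soften this (the helper name is hypothetical, not flash-attn's actual code):

```python
import importlib


def require_build_dep(module_name: str):
    """Import a build-time dependency lazily, with an actionable error.

    Hypothetical helper: deferring the import lets metadata-only steps run
    without the dependency, and turns the bare ModuleNotFoundError into a
    message that tells the user what to do.
    """
    try:
        return importlib.import_module(module_name)
    except ImportError as exc:
        raise RuntimeError(
            f"{module_name} must be installed before building this package; "
            f"install it first, or build with `pip install --no-build-isolation`"
        ) from exc


# In a setup.py, the torch import would then happen only on the code path
# that actually builds the CUDA extension, e.g.:
#     cpp_extension = require_build_dep("torch.utils.cpp_extension")
```

This doesn't fix the underlying problem (the build still needs torch), but it fails with a clear instruction instead of a confusing traceback.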
also see:
- https://github.com/python-poetry/poetry/issues/2113
- https://github.com/python-poetry/poetry/issues/2113#issuecomment-1145937065
- https://github.com/python-poetry/poetry/issues/2113#issuecomment-1221601986
I had to remove pyproject.toml for now since I couldn't find a way to add torch as a build dependency that works for everyone. Hopefully installation will work with the newest version (1.0.8).
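For context, declaring torch as a build dependency would mean something like the PEP 518 fragment below. The difficulty is that no single `torch` requirement works for everyone, since the correct wheel depends on the local CUDA setup (this is a hypothetical sketch, not the project's actual config):

```toml
[build-system]
# Hypothetical: a plain "torch" requirement makes pip resolve whatever
# wheel it likes inside the isolated build env, which may not match the
# user's CUDA version -- hence dropping pyproject.toml instead.
requires = ["setuptools", "wheel", "torch"]
build-backend = "setuptools.build_meta"
```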
Hi, the installation doesn't work for me with 1.0.8. Same error:
ModuleNotFoundError: No module named 'torch'
--no-build-isolation works for me with 1.0.8, torch==2.0.0+cu117.
This issue is a dupe of #246.