flash-attention
Availability of wheel
What's the recommended way to install `flash-attn` from a requirements.txt file? As far as I know, pip doesn't guarantee the installation order of packages in a requirements file, so there's no way to ensure `ninja` is installed before `flash-attn`. Furthermore, the latest version of pip doesn't support passing `--no-build-isolation` inside a requirements file. I assume a prebuilt wheel would solve the problem. Is there a publicly released wheel that one can use to install `flash-attn` directly?
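
For context, the only workaround I'm aware of is splitting the install into two separate pip invocations outside the requirements file, roughly like the sketch below (assuming a CUDA toolchain is already set up), which is exactly what I'd like to avoid:

```bash
# Install the build prerequisites first, then build flash-attn without build
# isolation so the already-installed ninja is picked up. This works around the
# ordering problem, but it can't be expressed in a single requirements.txt.
pip install ninja packaging
pip install flash-attn --no-build-isolation
```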