flash-attention
Unable to build wheel for flash_attn
I'm using CUDA 12.3 (12.3.r12.3) on Windows 11 21H2.
The process is stuck at:
```
(from sympy->torch->flash_attn) (1.3.0)
Using cached einops-0.8.0-py3-none-any.whl (43 kB)
Building wheels for collected packages: flash_attn
  Building wheel for flash_attn (setup.py)
```
I have waited for a few hours with no progress.