
Unable to build wheel of flash_attn

Open — Zer0TheObserver opened this issue 8 months ago • 1 comment

I'm using CUDA 12.3 (r12.3). OS: Windows 11 21H2.

The process is stuck at:

(from sympy->torch->flash_attn) (1.3.0)
Using cached einops-0.8.0-py3-none-any.whl (43 kB)
Building wheels for collected packages: flash_attn
  Building wheel for flash_attn (setup.py)

I waited for a few hours.
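For context, compiling flash_attn from source invokes nvcc on many CUDA kernel variants, which can take a very long time (or appear to hang) on machines with limited RAM or cores. As a sketch of a common workaround, the flash-attention README recommends installing ninja and capping parallel compile jobs via the MAX_JOBS environment variable; the exact value (4 here) is an assumption to tune for your machine:

```shell
# Hedged sketch of the commonly suggested install flow, not a guaranteed fix.
# ninja speeds up and parallelizes the CUDA build.
pip install ninja

# Cap parallel nvcc jobs so the build doesn't exhaust RAM and stall.
# On Linux/macOS shells:
MAX_JOBS=4 pip install flash-attn --no-build-isolation

# On Windows cmd, set the variable first instead:
#   set MAX_JOBS=4
#   pip install flash-attn --no-build-isolation
```

With ninja installed, the build prints per-file progress, which makes it easier to tell a slow build apart from a genuinely hung one.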

Zer0TheObserver — Jun 25 '24 12:06