TurtlesAI
Is FlashAttention available for the RTX 5090 on Windows?
> try mine:
>
> [#1683](https://github.com/Dao-AILab/flash-attention/issues/1683)

Thank you, I tried it, but then I hit another huge issue there... the latest PyTorch compatible with Blackwell is 2.7 with cu128. How can I use...
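Until prebuilt FlashAttention wheels land for a given platform (e.g. Windows + Blackwell), one common workaround is to detect at runtime whether `flash_attn` is importable and otherwise fall back to PyTorch's built-in scaled dot-product attention. This is a minimal sketch of that detection logic, not anything from the linked issue; the `attention_backend` helper and the fallback policy are assumptions for illustration:

```python
import importlib.util


def attention_backend() -> str:
    """Pick an attention backend (hypothetical policy, for illustration):
    prefer the flash_attn package if it is installed, otherwise fall back
    to PyTorch's torch.nn.functional.scaled_dot_product_attention,
    and report 'none' if neither is available."""
    # Checking with find_spec avoids importing (and thus loading CUDA
    # extensions from) packages that may fail to initialize.
    if importlib.util.find_spec("flash_attn") is not None:
        return "flash_attn"
    if importlib.util.find_spec("torch") is not None:
        return "torch-sdpa"
    return "none"


print(attention_backend())
```

On a setup where the `flash_attn` wheel cannot be installed, this would report `torch-sdpa`, letting the model code route attention through PyTorch's built-in kernel instead of failing at import time.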