flash-attention
Why is torch 2.9.0 not supported?
Can you create a PR?
@tridao https://github.com/Dao-AILab/flash-attention/pull/2007
@Jackmin801 @tridao We skip 2.9.0 because it is unstable. 2.9.1 fixed critical issues and came out yesterday: https://github.com/Dao-AILab/flash-attention/pull/1996
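For anyone pinning versions locally in the meantime, here is a minimal sketch of a guard that rejects the unstable 2.9.0 release while accepting 2.9.1 and later. The function name `torch_version_supported` is hypothetical and not taken from flash-attention's build scripts:

```python
# Hypothetical sketch: refuse torch 2.9.0 specifically, since 2.9.1
# fixed the critical issues. Not flash-attention's actual setup logic.
from packaging.version import Version

import torch


def torch_version_supported() -> bool:
    """Return False only for torch 2.9.0; 2.9.1+ is fine."""
    # Drop a local build tag like "+cu124" so "2.9.0+cu124" still matches.
    v = Version(torch.__version__.split("+")[0])
    return v != Version("2.9.0")


if not torch_version_supported():
    raise RuntimeError(
        "torch 2.9.0 is skipped because it is unstable; upgrade to 2.9.1+"
    )
```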
When can we expect the next version to be released?
+1 to wondering approximately how far out the next official release might be
I suppose it will be when FA4 is ready.