Open-Sora
Flash Attention Alternative for Older GPUs
Please consider enabling the use of PyTorch's scaled_dot_product_attention as an alternative for those with older GPUs.
See this pull request in another project for an example: https://github.com/HiDream-ai/HiDream-I1/pull/27
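For illustration, here is a minimal sketch (not Open-Sora's actual code) of how an attention call could fall back to PyTorch's built-in scaled_dot_product_attention when flash-attn is not installed or not usable, e.g. on pre-Ampere GPUs. The function name, the HAS_FLASH_ATTN flag, and the assumed (batch, seq_len, num_heads, head_dim) layout are all illustrative assumptions, not taken from the repo.

```python
import torch
import torch.nn.functional as F

# Detect flash-attn at import time; fall back to PyTorch SDPA if it is missing.
try:
    from flash_attn import flash_attn_func
    HAS_FLASH_ATTN = True
except ImportError:
    HAS_FLASH_ATTN = False


def attention(q, k, v, dropout_p: float = 0.0):
    """q, k, v: tensors of shape (batch, seq_len, num_heads, head_dim)."""
    if HAS_FLASH_ATTN and q.is_cuda and q.dtype in (torch.float16, torch.bfloat16):
        # flash-attn takes the (B, S, H, D) layout directly.
        return flash_attn_func(q, k, v, dropout_p=dropout_p)
    # PyTorch's scaled_dot_product_attention expects (B, H, S, D),
    # so transpose in and out around the call.
    out = F.scaled_dot_product_attention(
        q.transpose(1, 2),
        k.transpose(1, 2),
        v.transpose(1, 2),
        dropout_p=dropout_p,
    )
    return out.transpose(1, 2)
```

Something along these lines would let the same attention module run on GPUs that flash-attn does not support, at the cost of the flash-attn speedup.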