AnimateDiff
Update attention.py by adding SparseCausalAttention implementation
Add SparseCausalAttention implementation to ./animatediff/models/attention.py
To use SparseCausalAttention in the self-attention layers, simply set unet_use_cross_frame_attention = True.
To improve the code further, focus on the class SparseCausalAttention2D(CrossAttention).
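The core idea of sparse causal attention is that each frame's queries attend only to key/value tokens taken from the first frame and the immediately preceding frame, rather than to all frames. A minimal NumPy sketch of that key/value selection is below; it is an illustration under stated assumptions, not the repository's actual implementation, and the function name sparse_causal_attention is hypothetical.

```python
import numpy as np

def sparse_causal_attention(hidden, video_length):
    """Sketch of sparse causal attention over a video batch.

    hidden: array of shape (batch * video_length, seq_len, dim),
    frames laid out batch-major. For frame t, the keys and values
    are the tokens of frame 0 concatenated with frame max(t-1, 0).
    """
    bf, seq, dim = hidden.shape
    b = bf // video_length
    x = hidden.reshape(b, video_length, seq, dim)
    out = np.empty_like(x)
    for t in range(video_length):
        q = x[:, t]                                            # (b, seq, dim)
        # Sparse causal selection: first frame + previous frame only.
        kv = np.concatenate([x[:, 0], x[:, max(t - 1, 0)]], axis=1)
        scores = q @ kv.transpose(0, 2, 1) / np.sqrt(dim)      # (b, seq, 2*seq)
        scores -= scores.max(axis=-1, keepdims=True)           # stable softmax
        w = np.exp(scores)
        w /= w.sum(axis=-1, keepdims=True)
        out[:, t] = w @ kv                                     # (b, seq, dim)
    return out.reshape(bf, seq, dim)
```

Because frame 0's keys and values come only from frame 0 itself, its output is unaffected by edits to later frames, which is the causal property this pattern is meant to provide.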
To view the corresponding experimental results, please visit this link.