transformer_latent_diffusion

Adding Flash attention

Open · adi-kmt opened this issue 1 year ago · 1 comment

Adding Flash attention would improve training and inference speed by a large margin!

adi-kmt · Feb 24 '24

Hey, the model uses [`torch.nn.functional.scaled_dot_product_attention`](https://pytorch.org/docs/stable/generated/torch.nn.functional.scaled_dot_product_attention.html), which should already dispatch to Flash attention when the inputs and hardware support it.
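For reference, here is a minimal sketch of that call (shapes are illustrative, not taken from this repo). On CUDA with fp16/bf16 inputs, PyTorch picks the FlashAttention kernel automatically when its constraints are met; otherwise it falls back to the memory-efficient or math implementations:

```python
import torch
import torch.nn.functional as F

# Illustrative shapes: batch 2, 4 heads, sequence length 8, head dim 16.
q = torch.randn(2, 4, 8, 16)
k = torch.randn(2, 4, 8, 16)
v = torch.randn(2, 4, 8, 16)

# Fused attention: PyTorch selects the fastest available backend
# (FlashAttention on supported CUDA setups, otherwise a fallback).
out = F.scaled_dot_product_attention(q, k, v)
print(out.shape)  # torch.Size([2, 4, 8, 16])
```

So no code changes should be needed to get the speedup, as long as you are on PyTorch 2.0+.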

apapiu · Mar 06 '24