Add SigLIP flash attention support?
I noticed that the navit-flashatten-siglip version already adds flash attention. Would transformers add support for it as well?
cc @molbap or @qubvel maybe :)
These two looked at it but didn't respond.
@lucasjinreal thanks for the feature request, I will take a look at it!
Thanks. Since more and more MLLMs use SigLIP as the vision encoder, flash attention support would greatly reduce training time.
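For context, here is a minimal sketch of how it could look if SigLIP followed the same `attn_implementation` convention that other transformers models already use; the checkpoint name is just an example and the assumption that `SiglipVisionModel` would accept this flag once support lands is hypothetical:

```python
import torch
from transformers import SiglipVisionModel, SiglipImageProcessor

# Hypothetical usage: assumes SigLIP gains flash attention support and
# follows the same `attn_implementation` argument used by other models.
model = SiglipVisionModel.from_pretrained(
    "google/siglip-base-patch16-224",          # example SigLIP checkpoint
    torch_dtype=torch.float16,                 # flash attention needs fp16/bf16
    attn_implementation="flash_attention_2",   # assumed to be accepted once supported
)
processor = SiglipImageProcessor.from_pretrained("google/siglip-base-patch16-224")
```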