Eagle
ValueError: SiglipVisionModel does not support Flash Attention 2.0 yet
When using the 2B or 1B model name, this error is reported:
ValueError: SiglipVisionModel does not support Flash Attention 2.0 yet. Please request to add support where the model is hosted, on its model hub page: https://huggingface.co//discussions/new or in the Transformers GitHub repo: https://github.com/huggingface/transformers/issues/new
+1
Are there any updates regarding this?
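Not an official fix, but a common workaround in Transformers is to opt out of Flash Attention 2 at load time by requesting a different attention implementation. The keyword below (`attn_implementation`) is a standard `from_pretrained` argument; the exact Eagle checkpoint id is not shown here, so substitute your own. A minimal sketch:

```python
# Workaround sketch: ask for a non-Flash-Attention implementation when loading.
# "eager" (plain PyTorch attention) is supported by every model, including
# SiglipVisionModel; "sdpa" is another option if the model supports it.
load_kwargs = {
    "trust_remote_code": True,       # Eagle ships custom modeling code
    "attn_implementation": "eager",  # avoids the Flash Attention 2.0 check
}
```

These kwargs would then be passed as `AutoModel.from_pretrained(model_id, **load_kwargs)`, where `model_id` is the Eagle 1B/2B checkpoint being loaded. Expect slower attention than FA2, but the SigLIP vision tower should load without raising the error above.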