Purshow
Same question, have you solved it?
> reinstall transformers==4.38.2

Thanks! It works for me.
> I get the following **error**:
>
> MultiModalityCausalLM does not support Flash Attention 2.0 yet.

Hey! I have the same question. Have you solved it?
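For anyone else hitting this: a minimal sketch of the version-pin workaround mentioned above. The model path is a placeholder, and the `trust_remote_code` flag and the `attn_implementation="eager"` fallback are my assumptions, not something confirmed in this thread:

```python
# Workaround from this thread: pin transformers before loading the model.
#   pip install transformers==4.38.2
from transformers import AutoModelForCausalLM

# "path/to/your-model" is a placeholder; MultiModalityCausalLM is a custom
# model class, so trust_remote_code is likely required (assumption).
model = AutoModelForCausalLM.from_pretrained(
    "path/to/your-model",
    trust_remote_code=True,
    # If pinning transformers is not an option, forcing eager attention
    # should avoid the Flash Attention 2.0 code path (assumption):
    attn_implementation="eager",
)
```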
> [@WalkerWorldPeace](https://github.com/WalkerWorldPeace) Hello, which model are these evaluation results from? We will try to reproduce them on our side.

I ran into a similar problem while testing https://huggingface.co/HaochenWang/ross-qwen2-7b.