Qwen-VL
[BUG] 'Only Support Self-Attention Currently' assert error
Is there an existing issue / discussion for this?
- [X] I have searched the existing issues / discussions
Is there an existing answer for this in the FAQ?
- [X] I have searched the FAQ
Current Behavior
Assertion error at visual.py:192:
assert torch.allclose(query, key), 'Only Support Self-Attention Currently'
I am using the qwen-vl-7b-int4 model for few-shot inference and hit this error. My guess is that it is a precision problem; see the sketch below.
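
A minimal, standalone sketch (hypothetical tensors, not Qwen-VL's actual visual.py code) of why a strict `torch.allclose(query, key)` check can fail when part of the computation runs in reduced precision: two tensors that should be numerically identical drift apart by rounding error, while a looser tolerance would still treat them as equal.

```python
# Hypothetical illustration only: not the Qwen-VL implementation.
import torch

torch.manual_seed(0)
x = torch.randn(4, 16)
w = torch.randn(16, 16)

# Both computed the same way in float32 -> identical, allclose passes.
query = x @ w
key_fp32 = x @ w
print(torch.allclose(query, key_fp32))             # True

# Same math routed through half precision and cast back -> small drift.
key_lowp = (x.half() @ w.half()).float()
print(torch.allclose(query, key_lowp))             # typically False with default tolerances
print(torch.allclose(query, key_lowp, atol=1e-2))  # passes once the tolerance is loosened
```

If precision really is the cause, relaxing the assertion's tolerance (or skipping the check for the Int4 model) might be a workaround, but that is only a guess based on the error message.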
Expected Behavior
No response
Steps To Reproduce
No response
Environment
- OS: Ubuntu 20.04
- Python: 3.10.14
- Transformers: 4.32.0
- PyTorch: 2.1.2
- CUDA (`python -c 'import torch; print(torch.version.cuda)'`): 12.1
Anything else?
No response