sanshi9523


Environment: vllm 0.11.0, transformers 4.57.1, torch 2.8.0, flash_attn 2.8.0, flashinfer-python 2.8.3. Has anyone found a solution or managed to deploy this successfully?

![Image](https://github.com/user-attachments/assets/8f7d72a6-06b6-415e-9563-c21bede8d5b7)

![Image](https://github.com/user-attachments/assets/76275658-a15c-4c04-9ffb-3a4c98432aaf)

![Image](https://github.com/user-attachments/assets/56c127b8-f901-4b9f-abbd-815a09743258)

![Image](https://github.com/user-attachments/assets/2f546e72-d749-405a-928a-182ead744fa3)
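Since this issue hinges on an exact combination of package versions, a quick sanity check is to verify what is actually installed in the environment before comparing notes. Below is a minimal sketch that compares the versions reported in the post against the local environment; the `expected` mapping just mirrors the numbers listed above and is not an endorsement of that combination.

```python
from importlib import metadata

# Versions as reported in the issue above (assumption: copied verbatim from the post)
expected = {
    "vllm": "0.11.0",
    "transformers": "4.57.1",
    "torch": "2.8.0",
    "flash_attn": "2.8.0",
    "flashinfer-python": "2.8.3",
}

for pkg, want in expected.items():
    try:
        have = metadata.version(pkg)
    except metadata.PackageNotFoundError:
        have = "not installed"
    status = "ok" if have == want else "MISMATCH"
    print(f"{pkg}: expected {want}, found {have} ({status})")
```

Running this in the failing environment and pasting the output alongside the screenshots would make it easier for others to reproduce or rule out a version mismatch.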