[Feature] When will TurboMind backend support InternVL3.5-Flash series models?
Motivation
Currently, LMDeploy only supports the InternVL3.5-Flash series models with the PyTorch backend, which does not bring the expected performance improvement compared with the InternVL3.5 series models running on the TurboMind backend. When will the TurboMind backend support the InternVL3.5-Flash series models?
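
For context, a minimal sketch of the current setup, assuming a placeholder InternVL3.5-Flash checkpoint path and a placeholder image path; the ask is for the same pipeline to also accept a TurboMind backend config:

```python
from lmdeploy import pipeline, PytorchEngineConfig
from lmdeploy.vl import load_image

# Placeholder model path for illustration; substitute the actual InternVL3.5-Flash checkpoint.
model_path = 'OpenGVLab/InternVL3_5-Flash-8B'

# Today the model only runs on the PyTorch engine.
pipe = pipeline(model_path, backend_config=PytorchEngineConfig(session_len=8192))

# Once TurboMind support lands, the only change needed should be:
# from lmdeploy import TurbomindEngineConfig
# pipe = pipeline(model_path, backend_config=TurbomindEngineConfig(session_len=8192))

image = load_image('path/to/example.jpg')  # placeholder image path
response = pipe(('describe this image', image))
print(response)
```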
Related resources
No response
Additional context
No response