InternVL-Chat-V1.5 support
Please add support for InternVL-Chat-V1.5. It surpasses many proprietary multi-modal models and is very powerful.
Hi, is InternVL-Chat-V1.5 supported now?
Hi @AdamzNV @ncomly-nvidia @juney-nvidia, would you please add some comments here?
We don't support this model yet. According to download statistics from Hugging Face, its popularity has been declining for the past half month.
mark
@AdamzNV Please take a look at this leaderboard; the InternVL model scores in the top 2: https://rank.opencompass.org.cn/leaderboard-multimodal/?m=REALTIME
Hi @AmazDeng, we just enabled InternVL2 support. Do you still have any further issues or questions? If not, we'll close this soon.
@nv-guomingz Wow, that looks great!
@nv-guomingz Thank you for your work. Could you please provide the link to the relevant documentation?
https://github.com/NVIDIA/TensorRT-LLM/tree/main/examples/multimodal#internvl2
@nv-guomingz I checked the documentation, and TensorRT-LLM only lists support for InternVL2-1B through InternVL2-26B. Does TensorRT-LLM also support InternVL2-40B?
I'm not the owner of this model, but I think 40B should be supported if it's just a larger variant of InternVL2.