
Does OpenGVLab/VisualPRM-8B support vllm inference?

Open waltonfuture opened this issue 9 months ago • 2 comments

Thanks for your great work. I'd like to use vLLM to speed up VisualPRM-8B inference. Does the model support vLLM?

waltonfuture · Apr 03 '25 13:04

Thank you for your interest in our work. The model architecture of VisualPRM is the same as InternVL, so the model does support vLLM inference. However, during the evaluation stage of VisualPRM, we use `+` as the placeholder token and obtain all step scores in a single forward pass, so there is no autoregressive generation to accelerate. Therefore, the inference cost with vLLM and with HuggingFace Transformers should be comparable. You can refer to our code for more details.
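To make the single-forward-pass point concrete, here is a minimal toy sketch (not the actual VisualPRM code) of the scoring idea: a `+` placeholder follows each reasoning step, and the step score is the model's probability of `+` at that position, read for all steps from one set of logits. The vocabulary, token ids, logit values, and function names below are all illustrative assumptions.

```python
import math

def softmax(logits):
    # Numerically stable softmax over one position's logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

# Toy vocabulary (an assumption for illustration):
# token id 0 = "-" (bad step), token id 1 = "+" (good step).
def step_scores(all_logits, placeholder_positions, plus_id=1):
    """all_logits: per-position logit lists from ONE forward pass.
    Returns P("+") at each placeholder position as the step score."""
    return [softmax(all_logits[p])[plus_id] for p in placeholder_positions]

# Hypothetical logits for a 6-token sequence; placeholders sit at
# positions 2 and 5 (after step 1 and step 2 respectively).
logits = [
    [0.1, 0.2], [0.0, 0.0],
    [1.0, 3.0],          # placeholder after step 1: "+" is likely
    [0.5, 0.1], [0.0, 0.0],
    [2.0, 0.5],          # placeholder after step 2: "+" is unlikely
]
scores = step_scores(logits, [2, 5])
```

Because every step score is read from the same forward pass, vLLM's main advantages (continuous batching and KV-cache reuse during generation) don't apply, which is why the two backends cost roughly the same here.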

Weiyun1025 · Apr 18 '25 06:04