SeongHee Hong
Running inference on InternVL3-8B-hf with vllm returns ValueError: `limit_mm_per_prompt` is only supported for multimodal models.
@Kuangdd01 Hi, first of all, thank you so much for providing the fine-tuning code for InternVL3. I really appreciate your work and contribution to the open-source community. I have fine-tuned...
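The ValueError in the title typically means vLLM does not recognize the checkpoint's architecture as multimodal, so it rejects `limit_mm_per_prompt`. A quick way to check what vLLM will see is to inspect `config.json` in the weight directory. A minimal sketch, assuming the checkpoint follows the standard Hugging Face layout; the architecture names in the allow-list are illustrative only, since vLLM maintains its own model registry:

```python
import json
from pathlib import Path


def looks_multimodal(checkpoint_dir: str) -> bool:
    """Heuristic check: does config.json name an architecture that a
    multimodal-aware runtime would register as multimodal?

    vLLM only accepts `limit_mm_per_prompt` for architectures it registers
    as multimodal; a checkpoint whose config.json lists a text-only
    architecture triggers the ValueError above.
    """
    cfg = json.loads((Path(checkpoint_dir) / "config.json").read_text())
    archs = cfg.get("architectures", [])
    # Illustrative allow-list (an assumption, not vLLM's actual registry).
    multimodal_archs = {"InternVLForConditionalGeneration", "InternVLChatModel"}
    return any(a in multimodal_archs for a in archs)
```

If this returns False for a converted checkpoint, the `architectures` field in `config.json` is a good first place to look before retrying the vLLM launch.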
First of all, thank you very much for the fast and helpful response. I followed your advice and successfully completed the conversion using the following command: Step 1: convert the...
As you suggested, I first exported my fine-tuned model after merging the LoRA adapter using LLaMA Factory. After that, I proceeded with the steps I mentioned earlier (Step 1–3). I...
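The merge-then-export step mentioned above is usually driven through `llamafactory-cli export` with a small YAML config. A sketch of such a config, with placeholder paths; the `template` value is an assumption and should be checked against LLaMA Factory's template list for InternVL3:

```yaml
### merge_lora.yaml — run with: llamafactory-cli export merge_lora.yaml
model_name_or_path: OpenGVLab/InternVL3-8B          # base model (placeholder)
adapter_name_or_path: saves/internvl3-8b/lora/sft   # LoRA checkpoint (placeholder)
template: intern_vl                                 # assumed template name
finetuning_type: lora
export_dir: output/internvl3-8b-merged              # merged weights land here
export_legacy_format: false
```

The exported directory then serves as the input for the conversion steps (Step 1–3) before loading into vLLM.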
Thank you so much for your helpful comment and guidance. In addition to what you mentioned, I also found that deleting the `model.safetensors.index.json` file from the `"OpenGVLab/InternVL3-8B_after_replacing"` weight directory was...