Cannot use vLLM to infer Ovis2.5
Dear author, I use the code provided in the readme, but running `python ovis_test/response.py` always reports problems like:
Error: HTTP 400
Response: {"error":{"message":"Invalid content type. Supported types for system and user are text, image, video. Invalid content type. Supported types for system and user are text, image, video.","type":"BadRequestError","param":null,"code":400}}
Do you know how to fix it? My model can run inference with the transformers method.
+1
@JumpingRain
@elisaife I re-downloaded the weights from ModelScope and it worked.
Oh, thank you! That solved it, but I've run into a new problem: do you know how to disable `enable_thinking`, i.e. remove the thinking process from the output?
I changed it in the vLLM code provided in the readme, but the response still contains the thinking output.
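One thing worth checking: when querying a vLLM OpenAI-compatible server, the flag usually has to be passed per request via `chat_template_kwargs`, not edited in the serving code; vLLM forwards that dict to the tokenizer's chat template, and many templates (e.g. Qwen3-style ones) accept an `enable_thinking` flag there. Whether Ovis2.5's chat template honors this flag is an assumption; check the template shipped with the checkpoint. A minimal sketch of the request payload (the model name is a placeholder, use whatever name you served the model under):

```python
import json

def build_chat_payload(prompt: str, enable_thinking: bool = False) -> dict:
    """Build a /v1/chat/completions request body for a vLLM server.

    `chat_template_kwargs` is passed through by vLLM to the model's
    chat template; `enable_thinking` is only effective if the template
    actually supports it (assumption for Ovis2.5).
    """
    return {
        "model": "AIDC-AI/Ovis2.5-9B",  # placeholder: use your served model name
        "messages": [
            {
                "role": "user",
                "content": [{"type": "text", "text": prompt}],
            }
        ],
        "chat_template_kwargs": {"enable_thinking": enable_thinking},
    }

payload = build_chat_payload("Describe this image.", enable_thinking=False)
print(json.dumps(payload, indent=2))
```

If you use the `openai` Python client instead of raw requests, the same dict goes through `extra_body={"chat_template_kwargs": {"enable_thinking": False}}`, since the client does not accept non-standard top-level fields.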