
[feature request] Support vLLM inference for MiniCPM-2B-sft-bf16

Open bao21987 opened this issue 1 year ago • 1 comments

Feature request

MiniCPM-2B-sft-bf16 does not support vLLM inference in Xinference, even though the model is officially supported by vLLM: https://github.com/OpenBMB/MiniCPM/blob/main/inference/inference_vllm.py and https://docs.vllm.ai/en/latest/models/supported_models.html

Motivation

Please add vLLM support for MiniCPM-2B-sft-bf16, openbmb/MiniCPM-2B-dpo-bf16, etc.

Your contribution

NA

bao21987 avatar Aug 12 '24 14:08 bao21987

OK, we will support it ASAP.

qinxuye avatar Aug 13 '24 02:08 qinxuye

This issue is stale because it has been open for 7 days with no activity.

github-actions[bot] avatar Aug 20 '24 19:08 github-actions[bot]

This issue was closed because it has been inactive for 5 days since being marked as stale.

github-actions[bot] avatar Aug 25 '24 19:08 github-actions[bot]