MiniCPM-V
💡 [REQUEST] - MiniCPM VLM with a small LLM for fast inference on mobile devices
Start Date
08/26/2024
Implementation PR
No response
Reference Issues
No response
Summary
How could a small language model such as Qwen 0.5B or MobileLLM be used as the LLM backbone in MiniCPM-V for fast inference on mobile devices? Two possible approaches:
1. Plug in an already-trained small model such as Qwen 0.5B or MobileLLM.
2. Distill a large 7B LLaMA model into a smaller one.
Basic Example
1. Plug in an already-trained small model such as Qwen 0.5B or MobileLLM (see the first sketch below).
2. Distill a large 7B LLaMA model into a smaller one (see the second sketch below).
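
For approach 1, here is a minimal LLaVA-style sketch of what wiring a small LLM behind a vision encoder could look like. The model IDs (`google/siglip-base-patch16-224`, `Qwen/Qwen1.5-0.5B`), the single linear projector, and the class name `TinyVLM` are assumptions for illustration; this is not the actual MiniCPM-V connector or code.

```python
# Hypothetical sketch: a vision encoder feeding a small LLM through a linear
# projector (LLaVA-style). Not the MiniCPM-V implementation.
import torch
import torch.nn as nn
from transformers import AutoModelForCausalLM, SiglipVisionModel

class TinyVLM(nn.Module):
    def __init__(self,
                 vision_name="google/siglip-base-patch16-224",  # assumed vision encoder
                 llm_name="Qwen/Qwen1.5-0.5B"):                 # assumed small LLM
        super().__init__()
        self.vision = SiglipVisionModel.from_pretrained(vision_name)
        self.llm = AutoModelForCausalLM.from_pretrained(llm_name)
        # Project vision features into the LLM embedding space.
        self.projector = nn.Linear(self.vision.config.hidden_size,
                                   self.llm.config.hidden_size)

    def forward(self, pixel_values, input_ids, attention_mask=None, labels=None):
        # Encode the image and map patch features to LLM-sized embeddings.
        vis_feats = self.vision(pixel_values=pixel_values).last_hidden_state
        img_embeds = self.projector(vis_feats)
        # Embed the text tokens and prepend the image tokens.
        txt_embeds = self.llm.get_input_embeddings()(input_ids)
        inputs_embeds = torch.cat([img_embeds, txt_embeds], dim=1)
        if attention_mask is not None:
            img_mask = torch.ones(img_embeds.shape[:2], dtype=attention_mask.dtype,
                                  device=attention_mask.device)
            attention_mask = torch.cat([img_mask, attention_mask], dim=1)
        if labels is not None:
            # Image positions contribute no language-modeling loss.
            ignore = torch.full(img_embeds.shape[:2], -100, dtype=labels.dtype,
                                device=labels.device)
            labels = torch.cat([ignore, labels], dim=1)
        return self.llm(inputs_embeds=inputs_embeds,
                        attention_mask=attention_mask,
                        labels=labels)
```

The projector (and optionally the LLM) would still need multimodal alignment and instruction tuning before the model is useful.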
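
For approach 2, here is a minimal sketch of standard logit distillation, assuming a 7B teacher and a small student that share the same tokenizer. The temperature, loss weighting, and function name are illustrative choices, not a prescribed MiniCPM recipe.

```python
# Hypothetical sketch: blend soft-label KL distillation from a large teacher
# with the usual next-token cross-entropy loss for the small student.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    vocab = student_logits.size(-1)
    # Soft targets from the teacher, softened by the temperature.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1).view(-1, vocab)
    log_student = F.log_softmax(student_logits / temperature, dim=-1).view(-1, vocab)
    kd = F.kl_div(log_student, soft_teacher, reduction="batchmean") * temperature ** 2
    # Standard cross-entropy on the ground-truth next tokens.
    ce = F.cross_entropy(student_logits.view(-1, vocab), labels.view(-1),
                         ignore_index=-100)
    return alpha * kd + (1 - alpha) * ce
```

In a training loop, the teacher would run under `torch.no_grad()` and only the student's parameters would be updated with this loss.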
Drawbacks
None.
Unresolved questions
No response