MiniCPM-V
[BUG] Built and deployed per the official ollama documentation, but cannot chat with the model
Is there an existing issue / discussion for this?
- [X] I have searched the existing issues / discussions
Is there an existing answer for this in the FAQ?
- [X] I have searched the FAQ
Current Behavior
When I try to chat with the model, I get `Error: an unknown error was encountered while running the model`:

    (torch) orangepi@orangepi5b:~/project/ollama$ ./ollama run minicpm2.6
    你好
    Error: an unknown error was encountered while running the model
    (torch) orangepi@orangepi5b:~/project/ollama$
Expected Behavior

The model should respond to the prompt normally.
Steps To Reproduce
No response
Environment
- OS: Ubuntu 22.04
- Python: 3.11
- Transformers: 4.44.0
- PyTorch: 2.4.0
- CUDA (`python -c 'import torch; print(torch.version.cuda)'`): No
Anything else?
No response
Same here. You have to use the ollama fork they provide and recompile it before their MiniCPM will run; it's quite involved.
Sorry, for the next few weeks you may need to compile the code from our fork and run the ollama service from that build to use it normally. We will submit a PR to ollama as soon as possible; until then, you can follow this README for a trial: https://github.com/OpenBMB/ollama/blob/minicpm-v2.6/examples/minicpm-v2.6/README.md
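For reference, a minimal sketch of the fork-based workaround described above, assuming the standard ollama build flow (a Go toolchain plus `go generate` for the llama.cpp runners); the branch name follows the linked README, and the model/Modelfile names here are placeholders, not confirmed by this thread:

```shell
# Clone the OpenBMB fork on the minicpm-v2.6 branch (per the linked README)
git clone -b minicpm-v2.6 https://github.com/OpenBMB/ollama.git
cd ollama

# Standard ollama build steps: generate the native runners, then build the binary
go generate ./...
go build .

# Start the server in one terminal...
./ollama serve

# ...then, in another terminal, create and run the model from a Modelfile
# (the Modelfile should point at the MiniCPM-V 2.6 GGUF weights; see the README)
./ollama create minicpm2.6 -f Modelfile
./ollama run minicpm2.6
```

Running a fork-built `./ollama run` against a server started from a different (stock) ollama build is a common source of the "unknown error" above, so make sure both the server and the client come from the same fork binary.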
I followed the steps in that Feishu doc and built from the v2.6 fork, but it still won't run.
same question
Same problem here; I followed the tutorial all the way through.
The official build is now available: https://ollama.com/library/minicpm-v:latest
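With the model in the official library, the forked build should no longer be necessary; a stock ollama install can pull and run it directly (sketch below assumes ollama is installed and the `ollama serve` service is running):

```shell
# Pull the official MiniCPM-V build from the ollama library
ollama pull minicpm-v:latest

# Chat with it; as a vision model it also accepts image paths in the prompt
ollama run minicpm-v:latest "你好"
```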