MiniCPM-V
[BUG] Running version 2.6 with ollama, chat fails with: llama_get_logits_ith: invalid logits id 10, reason: no logits
Is there an existing issue / discussion for this?
- [X] I have searched the existing issues / discussions
Is there an existing answer for this in FAQ?
- [X] I have searched the FAQ
Current Behavior
Deployed version 2.6 locally on a Mac with ollama, started the ollama service following the documentation, and got the following error while chatting: llama_get_logits_ith: invalid logits id 10, reason: no logits. Has this problem been solved?
Expected Behavior
No response
Steps To Reproduce
No response
Environment
- OS: macOS
- Python: 3.11
- Transformers:
- PyTorch:
- CUDA (`python -c 'import torch; print(torch.version.cuda)'`):
Anything else?
No response
Check #393: OpenBMB's work is being merged into master, so you may need to compile ollama from source.
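For reference, here is a minimal sketch of a from-source build on macOS, assuming Go and cmake are already installed; the exact steps are described in the ollama repository's developer docs and may differ between versions:

```sh
# Build ollama from source (sketch; verify against the repo's build docs)
git clone https://github.com/ollama/ollama.git
cd ollama
go generate ./...   # generates/builds the bundled llama.cpp backend; needs cmake
go build .          # produces the ./ollama binary
./ollama serve      # start the local server with the freshly built binary
```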
Just use the main branch directly? I'll give it a try.
Please check the latest comment in #393: you need to start from master and merge OpenBMB's fork. You can also use my precompiled binary or Docker image in the meantime.
Update: feel free to check my fork https://github.com/luixiao0/ollama
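For the merge route, a minimal sketch of the workflow described above; the fork remote URL and branch name here are assumptions, so check #393 for the exact references:

```sh
# Start from upstream master and merge the MiniCPM-V fork
# (remote URL and branch name below are assumptions; see #393)
git clone https://github.com/ollama/ollama.git
cd ollama
git remote add openbmb https://github.com/OpenBMB/ollama.git
git fetch openbmb
git merge openbmb/minicpm-v2.6
go generate ./... && go build .   # then rebuild as in a normal source build
```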
@tc-mb Please let us know once this is resolved, thanks!
OK, I will post an update as soon as it is.
@tc-mb Has this been resolved? It's been quite a while.
Running into the same problem; marking this thread for updates.
Ollama should support it now. Please give it another try.
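For anyone retrying, a minimal sketch assuming the model is published in the ollama library under the minicpm-v name (verify the current tag on ollama.com/library):

```sh
# Pull and chat with the model (model name assumed; check the ollama library page)
ollama pull minicpm-v
ollama run minicpm-v
```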