Eric Buehler
@Julio0250 could you please let me know what command(s) you used to run Llama 3.2 vision? Perhaps you did not build with Metal support?
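For reference, Metal acceleration generally has to be opted into at build time via a Cargo feature flag; a build without it silently falls back to CPU. A minimal sketch (the `metal` feature name is an assumption here, check the project's README for the exact flags):

```shell
# Build mistral.rs with Metal support enabled (macOS only).
# Without the feature flag, inference runs on CPU even on Apple Silicon.
cargo build --release --features metal
```

If the binary was installed earlier without the flag, rebuilding with it is required for the GPU path to be used.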
Hi @eugenehp!

> Is there a plan or roadmap for this MLLM feature, and if yes, can we join in to help deliver support of MiniCPM-o?

I'm not currently focusing...
@eugenehp I've sent a collaborator invite.

> I'm far away from doing a proper PR, but I would love your feedback once it's ready.

Sounds great.

> MiniCPM-o has a...
@ShelbyJenkins I have not been able to reproduce this on any platform (Linux, WSL, Windows), please feel free to reopen.
I think this could be an interesting feature, but as you said, I also haven't received any issues about it. It is probably not critical, and we have other...
Ok, great. I think the script is fine for now, but we should consider supporting loading directly from sentencepiece as a longer-term goal.
Hi @sgrebnov!

> I'm just trying to learn whether supports_attn_softmax is required to compile vs in runtime.

The reason why we need the check is that the attn_softmax implementation for...
Hi @kuladeephx! What is the code/command you are using?
Hi @dinerburger - this has been resolved in recent patches. Can you please try it again to confirm?
Hi @mert-kurttutan! Sorry for not getting back to you. I would be happy to have prebuilt Python binaries for the various packages. I think it would be very helpful for usability, but...