lmm.cpp
Thank you for your work!
I am super excited to be able to use LLaVA.cpp when it is available. I know you are busy working on and maintaining other ggml projects, so please let us know when you plan to work on this and have a working prototype for us to test.
Thanks again!
Hi @lhr0909, thanks for reaching out!
Yes, I was busy with some contractual work and other higher-priority GGML PRs, mostly for GGUF. Now that it's landed, I'm ready to give this a go, hopefully tomorrow, and I hope it'll be ready for testing this week.
LLaVA will be the first supported model, and I also hope to support HF's new Idefics next.
Seems so ... looking forward to the pull.
I'll implement this in llama.cpp; it will be ready this week.
It turned out to be easier to start in llama.cpp. Later we can move it to this dedicated repo for more features, models, etc.
WOW ... can't wait to check it ;D
Just wanted to say, @monatis, I'm having a great time using LLaVA inside llama.cpp. I also played with your clip.cpp.