Request to add support for InternVL-2 model
It would be great if the Ollama platform could add support for the InternVL-2 model series.
Needs support in llama.cpp first.
https://github.com/ggerganov/llama.cpp/issues/6803
+1 ❤️
This might be a way-out idea, but why not build support for other model runners into Ollama to cover models that llama.cpp doesn't support? I have built apps around Ollama, and I'm now facing the possibility of moving away from Ollama, or adding other engines alongside it, to support InternVL and Phi-3.5 MoE in particular, and other models as they come. Why not build something like transformers support for some models, and/or make it an install option with torch as a prerequisite like LMDeploy does, or use some other method? Ollama could be the "everything engine".
Have a look here:
https://github.com/ggerganov/llama.cpp/pull/9403
https://github.com/qlylangyu/llama.cpp/pull/1
@sammcj Would you like to take a look at this?
Appreciate your work on https://github.com/ollama/ollama/pull/6279
@jmorganca
Any updates?