llava-cpp-server
CMake Build?
Hello, is it possible to compile with CMake? With make, CUDA isn't detected.
Not planned, but I would accept a PR if you do it. I haven't touched this code since I released it, and since then llama.cpp has added support for LLaVA along with an API interface. I would advise looking at that project if you really need this capability.
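For context, the CUDA-detection problem described above is typical of hand-written Makefile builds; with CMake, the CUDA backend is usually enabled explicitly via an option rather than auto-detected. A minimal sketch of what a contributed CMakeLists.txt could look like — note this repository does not ship one, and the option name follows llama.cpp conventions, which have changed across versions (older releases used `LLAMA_CUBLAS`, newer ones `GGML_CUDA`):

```cmake
# Hypothetical CMakeLists.txt fragment for llava-cpp-server (assumption:
# this repo has no CMake build; option naming borrowed from llama.cpp).
cmake_minimum_required(VERSION 3.18)
project(llava-cpp-server CXX)

option(GGML_CUDA "Enable the CUDA backend" OFF)

if (GGML_CUDA)
    # Makes nvcc a first-class compiler instead of relying on make-time
    # detection, which is what fails in the Makefile build.
    enable_language(CUDA)
    find_package(CUDAToolkit REQUIRED)
    add_compile_definitions(GGML_USE_CUDA)
endif()
```

Configured from the command line with something like `cmake -B build -DGGML_CUDA=ON && cmake --build build`, so CUDA support is an explicit opt-in rather than a detection step that can silently fail.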
OK, thank you. Unfortunately, llama.cpp has dropped support for multimodal models in its server.