uptrain
llama.cpp local eval support
Is your feature request related to a problem? Please describe. Ollama is a good solution for local evaluation in projects that already use it. For a project that uses llama.cpp directly, however, it seems redundant to have to run both (one for generation, one for evaluation).
Describe the solution you'd like llama.cpp ships a built-in web server that exposes an OpenAI-compatible API, which should work with litellm.
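To illustrate, here is a minimal sketch of calling llama.cpp's OpenAI-compatible chat endpoint directly. This assumes `llama-server` is running locally on port 8080 (the default); the model name is largely informational for a single-model server, and the helper names here are hypothetical.

```python
import json
import urllib.request

# Assumed local endpoint for a running `llama-server` instance.
API_URL = "http://localhost:8080/v1/chat/completions"

def build_payload(prompt: str) -> dict:
    # Standard OpenAI chat-completions request schema, which
    # llama.cpp's server accepts.
    return {
        "model": "local-model",  # placeholder; server uses its loaded model
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.0,
    }

def local_eval(prompt: str) -> str:
    # POST the request and extract the assistant's reply.
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Because the request shape is the same OpenAI schema, a wrapper like litellm (or any OpenAI-compatible client) pointed at the same base URL should work without a separate Ollama instance.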
Describe alternatives you've considered As mentioned, Ollama works, but I don't want to download the same model twice when I could share a single copy by using llama.cpp for everything.