
llama.cpp local eval support

Open · lbux opened this issue · 0 comments

Is your feature request related to a problem? Please describe.
Ollama is a good solution for local evaluation in projects that already use it. For a project built on plain llama.cpp, however, running both seems redundant (one for generation, one for evaluation).

Describe the solution you'd like
llama.cpp ships a web server that exposes an OpenAI-compatible API, which should make it usable through litellm.
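
For reference, a minimal sketch of what this could look like, assuming a local llama.cpp server on port 8080 routed through litellm's generic OpenAI-compatible provider. The model alias `local-model` and the endpoint URL are illustrative placeholders, not existing UpTrain configuration:

```python
# Minimal sketch: call a local llama.cpp server through litellm's
# OpenAI-compatible routing. Assumes the server was started with e.g.
#   ./server -m models/model.gguf --port 8080
# The model alias and api_base below are illustrative placeholders.
import litellm

response = litellm.completion(
    model="openai/local-model",            # "openai/" prefix routes to any OpenAI-compatible endpoint
    api_base="http://localhost:8080/v1",   # llama.cpp server's OpenAI-compatible base URL
    api_key="sk-no-key-required",          # llama.cpp does not validate the key by default
    messages=[{"role": "user", "content": "Is this response grounded in the context?"}],
)
print(response.choices[0].message.content)
```

If litellm can reach the server this way, the evaluations could presumably reuse the same routing that already works for Ollama.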

Describe alternatives you've considered
As mentioned, Ollama works, but I don't want to download two copies of a model when I could share one set of models by using llama.cpp for everything.


Thank you for your feature request - we love adding them.

lbux · Apr 17, 2024