
[Feature]: vLLM support

Open fpaupier opened this issue 1 year ago • 0 comments

The Feature

[vLLM](https://github.com/vllm-project/vllm) is an LLM serving framework that exposes an OpenAI API-compatible endpoint. How can I get Helicone working with vLLM?
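For illustration, here is a minimal sketch of what this might look like, assuming a vLLM server running locally and a Helicone gateway in front of it. The URLs, the model name, and the `Helicone-Target-Url` / `Helicone-Auth` headers are assumptions based on Helicone's generic gateway pattern, not a confirmed integration:

```python
import json
import urllib.request

# Assumed endpoints: a self-hosted Helicone gateway and a local vLLM
# server exposing the OpenAI-compatible API. Adjust to your deployment.
HELICONE_GATEWAY = "http://localhost:8787/v1/chat/completions"
VLLM_ENDPOINT = "http://localhost:8000/v1"

def build_request(prompt: str) -> urllib.request.Request:
    """Build a chat-completion request routed through the Helicone
    gateway, which would proxy it to vLLM and log the traffic."""
    body = json.dumps({
        # Hypothetical model name; use whatever model vLLM is serving.
        "model": "meta-llama/Llama-3-8B-Instruct",
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        HELICONE_GATEWAY,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Helicone-Auth": "Bearer <HELICONE_API_KEY>",
            # Tells the gateway where the actual vLLM server lives.
            "Helicone-Target-Url": VLLM_ENDPOINT,
        },
        method="POST",
    )

req = build_request("Hello")
print(req.get_header("Helicone-target-url"))
```

The same idea works with the official `openai` Python client by setting `base_url` to the gateway and passing the Helicone headers via `default_headers`.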

Motivation, pitch

For on-prem deployments, vLLM is a great option for secure handling of inference data and for full control over the model you use. Since Helicone can be self-hosted with Docker Compose / K8s, it would be a great complementary service to have on our infrastructure.

Twitter / LinkedIn details

https://www.linkedin.com/in/fpaupier/

fpaupier avatar Nov 27 '24 12:11 fpaupier