
Feature-Request: Support LiteLLM

Open VfBfoerst opened this issue 1 year ago • 5 comments

Hey :)
I wanted to ask for an integration of LiteLLM, as we use it as a gateway to different AI providers. It would be very cool to trace the LiteLLM requests. Is there already a way to do so, or could it be implemented?

VfBfoerst avatar Jul 30 '24 13:07 VfBfoerst

Hey @VfBfoerst, we are working on LiteLLM support and hoping to release it soon, probably in about a week from now. I will keep you posted. Are there any specific LiteLLM calls you are hoping to capture? We would love to learn more so it meets your expectations.

karthikscale3 avatar Jul 30 '24 15:07 karthikscale3

Awesome! I would like to add Langtrace as a kind of middleware in front of the LiteLLM proxy, so that all tokens/metrics are traced, ideally on a per-API-key basis to get statistics/metrics per key. The goal is to detect bottlenecks, track usage, and spot dysfunctional parts.

VfBfoerst avatar Jul 30 '24 20:07 VfBfoerst

Thanks for this information, it is helpful. I will give you an update once the support is live.

karthikscale3 avatar Jul 31 '24 19:07 karthikscale3

@VfBfoerst - Just wanted to provide an update. We are still working through the support. We should have an update for you next week. Thanks for your patience.

karthikscale3 avatar Aug 08 '24 16:08 karthikscale3

@VfBfoerst - Apologies for the delay here. We have added support for LiteLLM. Please update to the latest version of the Langtrace Python SDK to see the changes. Thanks for your patience.
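For anyone landing here later, a minimal sketch of what SDK-side usage looks like. This assumes `pip install langtrace-python-sdk litellm`, a valid Langtrace API key, and access to an LLM provider; the model name and env var usage are illustrative assumptions, not copied from official docs:

```python
# Illustrative sketch (requires network access and real API keys to run).
# Assumptions: env vars LANGTRACE_API_KEY and OPENAI_API_KEY are set,
# and "gpt-4o-mini" is an example model name.
import os

import litellm
from langtrace_python_sdk import langtrace

# Initialize Langtrace before making LLM calls so LiteLLM is instrumented.
langtrace.init(api_key=os.environ["LANGTRACE_API_KEY"])

# A regular LiteLLM call; the resulting spans (model, token counts, latency)
# should be exported to Langtrace automatically.
response = litellm.completion(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

Check the Langtrace docs for the exact `init` options available in the version you install.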

karthikscale3 avatar Sep 29 '24 16:09 karthikscale3

LiteLLM support is now live from both ends:

  • Langtrace supports LiteLLM instrumentation automatically
  • You can configure Langtrace inside LiteLLM to export traces too
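For the second option (exporting traces from the LiteLLM proxy itself), a hedged sketch of what the proxy config might look like. The `"langtrace"` callback name, the `LANGTRACE_API_KEY` env var, and the model entries here are assumptions based on LiteLLM's logging-callback convention; verify the exact names against the LiteLLM and Langtrace docs for your versions:

```yaml
# Hypothetical LiteLLM proxy config.yaml -- callback name and env var
# are assumptions, not verified against a specific release.
model_list:
  - model_name: gpt-4o-mini
    litellm_params:
      model: openai/gpt-4o-mini
      api_key: os.environ/OPENAI_API_KEY

litellm_settings:
  success_callback: ["langtrace"]

environment_variables:
  LANGTRACE_API_KEY: "<your-langtrace-api-key>"
```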

karthikscale3 avatar Oct 13 '24 17:10 karthikscale3