Feature-Request: Support LiteLLM
Hey :)
I wanted to ask for an integration of LiteLLM, as we use it as a gateway to different AI providers. It would be very cool to trace the LiteLLM requests. Is there already a way to do so, or could it be implemented?
Hey @VfBfoerst, we are working on LiteLLM support and hope to release it soon, probably in about a week from now. I will keep you posted. Are there any specific LiteLLM calls you are hoping to capture? We would love to learn more so it meets your expectations.
Awesome! I would like to add Langtrace as a middleware of sorts in front of the LiteLLM proxy, so that all tokens/metrics are traced, ideally on a per-API-key basis to get statistics/metrics per key. The end goal is to detect bottlenecks, track usage, and spot dysfunctional parts.
Thanks for this information, it is helpful. I will give you an update once the support is live.
@VfBfoerst - Just wanted to provide an update. We are still working on the support and should have an update for you next week. Thanks for your patience.
@VfBfoerst - Apologies for the delay here. We have added support for LiteLLM. Please update to the latest version of the Langtrace Python SDK to see the changes. Thanks for your patience.
LiteLLM support is live from both ends now:
- Langtrace supports LiteLLM instrumentation automatically
- You can configure Langtrace inside LiteLLM to export traces too
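A minimal sketch of the two directions above, assuming the `langtrace-python-sdk` and `litellm` packages are installed; the placeholder API key, the model name, and the `"langtrace"` callback string are assumptions to verify against the current Langtrace and LiteLLM documentation:

```python
# Application side: initialize Langtrace before importing LiteLLM so the
# automatic instrumentation can patch LiteLLM's calls.
from langtrace_python_sdk import langtrace

langtrace.init(api_key="<your-langtrace-api-key>")  # placeholder key

import litellm

# LiteLLM side: export traces to Langtrace via LiteLLM's callback mechanism
# (assumption: "langtrace" is a registered callback name in recent LiteLLM
# versions; check the LiteLLM observability docs for your version).
litellm.success_callback = ["langtrace"]

# Any provider routed through LiteLLM should now show up in Langtrace,
# including token counts per request.
response = litellm.completion(
    model="gpt-4o-mini",  # example model; any LiteLLM-supported provider works
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```

When running the LiteLLM proxy rather than the SDK, the equivalent callback setting would live in the proxy's `config.yaml` under `litellm_settings`.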