Convert Anthropic token reporting to expected LangChain format for tracing
Previously, `llmOutput` carried the raw Anthropic response, but the tracer could not use it to report token metrics because its shape was inconsistent with other providers' API responses. This PR aligns `llmOutput` with the format used by other providers so that LangChain's Tracer and callbacks can parse it.
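For context, this is roughly the mapping involved: the Anthropic Messages API reports usage as snake_case `input_tokens`/`output_tokens`, while LangChain tracers expect a `tokenUsage` object with `promptTokens`/`completionTokens`/`totalTokens` in `llmOutput`. A minimal sketch (the helper name `toLangChainTokenUsage` is hypothetical, not from the PR):

```typescript
// Shape of the usage block returned by the Anthropic Messages API.
interface AnthropicUsage {
  input_tokens: number;
  output_tokens: number;
}

// Shape LangChain's tracers and callbacks expect under llmOutput.tokenUsage.
interface TokenUsage {
  promptTokens: number;
  completionTokens: number;
  totalTokens: number;
}

// Map Anthropic's snake_case usage onto LangChain's camelCase tokenUsage.
// Hypothetical helper illustrating the alignment this PR performs.
function toLangChainTokenUsage(usage: AnthropicUsage): TokenUsage {
  return {
    promptTokens: usage.input_tokens,
    completionTokens: usage.output_tokens,
    totalTokens: usage.input_tokens + usage.output_tokens,
  };
}

const llmOutput = {
  tokenUsage: toLangChainTokenUsage({ input_tokens: 12, output_tokens: 34 }),
};
console.log(llmOutput.tokenUsage.totalTokens); // 46
```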
Happy to do this a different way if there is a preferred approach (e.g., a custom response parser for compatibility with an internal API?).
Thanks for the pointer @bracesproul. So for a consumer of this data (`AIMessage`), specifically I'm thinking of `DatadogLLMObsTracer`, would you expect that consumer to detect the chat provider type and decide independently how to report those usage metrics?
@mejackreed it feels like it would make more sense to just update the Datadog tracer to use the new standard fields?
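In other words, the tracer could read the provider-agnostic `usage_metadata` that LangChain attaches to `AIMessage` (with `input_tokens`, `output_tokens`, and `total_tokens` fields) instead of parsing provider-specific output. A rough sketch, assuming a simplified message shape; the `reportUsage` helper is hypothetical:

```typescript
// Provider-agnostic usage fields as defined by LangChain's UsageMetadata.
interface UsageMetadata {
  input_tokens: number;
  output_tokens: number;
  total_tokens: number;
}

// Simplified stand-in for AIMessage; the real class carries more fields.
interface AIMessageLike {
  content: string;
  usage_metadata?: UsageMetadata;
}

// Hypothetical tracer-side helper: read the standard fields, falling back
// to zeros when a provider does not report usage.
function reportUsage(message: AIMessageLike): UsageMetadata {
  return (
    message.usage_metadata ?? { input_tokens: 0, output_tokens: 0, total_tokens: 0 }
  );
}

const usage = reportUsage({
  content: "hi",
  usage_metadata: { input_tokens: 5, output_tokens: 7, total_tokens: 12 },
});
console.log(usage.total_tokens); // 12
```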
Thanks for the pointers. I've created https://github.com/langchain-ai/langchainjs/pull/6552 as an alternative approach.