
Convert Anthropic token reporting to expected LangChain format for tracing

Open · mejackreed opened this pull request 1 year ago · 2 comments

Previously, llmOutput carried the raw Anthropic response, but tracing could not use it to report token metrics because its shape was inconsistent with the responses from other provider APIs. This PR aligns llmOutput with the other providers so that LangChain's Tracer and callbacks can parse it.
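For context, Anthropic's Messages API reports usage as input_tokens/output_tokens, while LangChain tracing conventionally reads llmOutput.tokenUsage with promptTokens/completionTokens/totalTokens. A minimal sketch of the kind of mapping involved (illustrative only, not the actual PR diff; the interface and function names here are made up):

```typescript
// Shape of Anthropic's usage block, as returned by the Messages API.
interface AnthropicUsage {
  input_tokens: number;
  output_tokens: number;
}

// The tokenUsage shape LangChain's tracers and callbacks expect on llmOutput.
interface TokenUsage {
  promptTokens: number;
  completionTokens: number;
  totalTokens: number;
}

// Hypothetical helper: convert Anthropic's usage into LangChain's shape.
function toLangChainTokenUsage(usage: AnthropicUsage): TokenUsage {
  return {
    promptTokens: usage.input_tokens,
    completionTokens: usage.output_tokens,
    totalTokens: usage.input_tokens + usage.output_tokens,
  };
}
```

With a mapping like this, llmOutput can expose { tokenUsage } uniformly, and generic tracers no longer need Anthropic-specific parsing.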

Happy to do this a different way if there is a preferred approach (e.g., a custom response parser for compatibility with an internal API?).

mejackreed avatar Aug 04 '24 13:08 mejackreed

The latest updates on your projects:

- langchainjs-api-refs: ✅ Ready (Aug 4, 2024 1:50pm UTC)
- langchainjs-docs: ✅ Ready (Aug 4, 2024 1:50pm UTC)

vercel[bot] avatar Aug 04 '24 13:08 vercel[bot]

Thanks for the pointer @bracesproul. So for a consumer of this data (AIMessage), specifically the DatadogLLMObsTracer I have in mind, would you expect that consumer to detect the chat provider type and decide independently how to report those usage metrics?

mejackreed avatar Aug 08 '24 13:08 mejackreed

@mejackreed it feels like it would make more sense to just update the Datadog tracer to use the new standard fields?

jacoblee93 avatar Aug 16 '24 08:08 jacoblee93
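The "new standard fields" here refer to the usage_metadata field LangChain populates on AIMessage uniformly across providers (input_tokens, output_tokens, total_tokens). A sketch of how a downstream tracer could consume it, assuming hypothetical metric-name strings (these are not Datadog's actual tag keys):

```typescript
// LangChain's standardized usage_metadata shape on AIMessage.
interface UsageMetadata {
  input_tokens: number;
  output_tokens: number;
  total_tokens: number;
}

// Minimal stand-in for the parts of AIMessage a tracer would touch.
interface AIMessageLike {
  usage_metadata?: UsageMetadata;
}

// Hypothetical tracer helper: turn usage_metadata into flat metric tags,
// with no provider-specific branching required.
function usageTags(message: AIMessageLike): Record<string, number> {
  const usage = message.usage_metadata;
  if (!usage) return {}; // some providers/versions may not populate the field
  return {
    "llm.usage.prompt_tokens": usage.input_tokens,
    "llm.usage.completion_tokens": usage.output_tokens,
    "llm.usage.total_tokens": usage.total_tokens,
  };
}
```

Reading the standard field keeps the tracer provider-agnostic, which is the design choice suggested above over per-provider detection.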

Thanks for the pointers. I've created https://github.com/langchain-ai/langchainjs/pull/6552 as an alternative approach.

mejackreed avatar Aug 16 '24 12:08 mejackreed