fix: use SpanAttributes instead of GenAIAttributes for cache token attributes
Summary
Fixed AttributeError by replacing GenAIAttributes with SpanAttributes for cache-related token attributes in Langchain and Anthropic instrumentation packages.
Problem
The code was incorrectly attempting to use GenAIAttributes.GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS and GenAIAttributes.GEN_AI_USAGE_CACHE_CREATION_INPUT_TOKENS, but these attributes don't exist in the upstream OpenTelemetry incubating semantic conventions.
These cache token attributes are custom extensions added by OpenLLMetry to support prompt caching features (used by Anthropic, OpenAI, etc.) and are defined in the local SpanAttributes class in the opentelemetry-semantic-conventions-ai package.
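As a minimal illustration of the failure mode, the sketch below assumes GenAIAttributes is the upstream incubating gen_ai_attributes module from opentelemetry-semantic-conventions and that SpanAttributes lives in opentelemetry.semconv_ai; both import paths are assumptions inferred from the package names above, not code copied from this repository.

```python
# Sketch only: import paths are assumed from the package names mentioned in this PR.
from opentelemetry.semconv._incubating.attributes import (
    gen_ai_attributes as GenAIAttributes,  # upstream OpenTelemetry semantic conventions
)
from opentelemetry.semconv_ai import SpanAttributes  # OpenLLMetry's local conventions

# The cache constants are OpenLLMetry extensions, so they are expected on the local class...
print(hasattr(SpanAttributes, "GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS"))

# ...but not upstream: the lookup fails the moment the attribute is accessed.
try:
    GenAIAttributes.GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS
except AttributeError as exc:
    print(f"Upstream conventions do not define this constant: {exc}")
```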
Solution
Replaced GenAIAttributes with SpanAttributes for both cache token attributes (see the sketch after this list):
- GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS
- GEN_AI_USAGE_CACHE_CREATION_INPUT_TOKENS
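A hedged before/after sketch of the pattern being corrected; the helper function, parameter names, and import path below are illustrative stand-ins, not code from the affected files.

```python
from opentelemetry.semconv_ai import SpanAttributes  # assumed import path for the local conventions


def set_cache_token_attributes(span, cache_read_tokens, cache_creation_tokens):
    """Illustrative helper (not from the repo) showing the corrected constant lookups."""
    # Before the fix, these constants were looked up on GenAIAttributes, where
    # they are not defined, so the lookup itself raised AttributeError:
    #   span.set_attribute(
    #       GenAIAttributes.GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS, cache_read_tokens
    #   )
    # After the fix, the constants resolve through the local SpanAttributes class:
    span.set_attribute(
        SpanAttributes.GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS, cache_read_tokens
    )
    span.set_attribute(
        SpanAttributes.GEN_AI_USAGE_CACHE_CREATION_INPUT_TOKENS, cache_creation_tokens
    )
```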
Files Changed
- packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/span_utils.py - Line 349: Fixed cache read tokens attribute
- packages/opentelemetry-instrumentation-anthropic/opentelemetry/instrumentation/anthropic/__init__.py - Lines 286, 290, 400, 404: Fixed both cache read and cache creation token attributes in sync and async functions
Impact
This bug would have caused AttributeError at runtime when processing responses that include cache token information (e.g., Anthropic's prompt caching, OpenAI's cached tokens).
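To illustrate why the error only appears at runtime for some requests, here is a hypothetical sketch of the conditional code path; the usage dictionary shape and the guard are assumptions for illustration, not the packages' actual logic.

```python
from opentelemetry.semconv_ai import SpanAttributes  # assumed import path


def record_usage(span, usage):
    """Hypothetical sketch: cache attributes are only set when the provider reports them."""
    # Regular token counts are always present, so this path never hit the bug.
    span.set_attribute("gen_ai.usage.input_tokens", usage.get("input_tokens", 0))

    # Cache counts only appear for cached requests (e.g. Anthropic prompt caching,
    # OpenAI cached tokens); with the old GenAIAttributes lookup this branch raised
    # AttributeError, while the SpanAttributes lookup records the value.
    cache_read = usage.get("cache_read_input_tokens")
    if cache_read is not None:
        span.set_attribute(
            SpanAttributes.GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS, cache_read
        )
```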
Testing
The fix aligns with the pattern already used correctly in other instrumentation packages (a quick consistency check follows this list):
- opentelemetry-instrumentation-openai (correctly uses SpanAttributes.LLM_USAGE_CACHE_READ_INPUT_TOKENS)
- traceloop-sdk (correctly uses SpanAttributes for cache attributes)
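A quick hedged consistency check along the same lines; the constant names are taken from this PR description and the import path is an assumption.

```python
from opentelemetry.semconv_ai import SpanAttributes  # assumed import path

# Constant names referenced in this PR and in the packages listed above.
for name in (
    "GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS",
    "GEN_AI_USAGE_CACHE_CREATION_INPUT_TOKENS",
    "LLM_USAGE_CACHE_READ_INPUT_TOKENS",
):
    print(name, "->", getattr(SpanAttributes, name, "MISSING"))
```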
π€ Generated with Claude Code
[!IMPORTANT]
Fixes AttributeError by replacing GenAIAttributes with SpanAttributes for cache token attributes in Langchain and Anthropic packages.
- Behavior:
  - Replaced GenAIAttributes with SpanAttributes for cache token attributes GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS and GEN_AI_USAGE_CACHE_CREATION_INPUT_TOKENS.
  - Fixes AttributeError when processing cache token information in Langchain and Anthropic packages.
- Files Changed:
  - span_utils.py in Langchain: Line 349.
  - __init__.py in Anthropic: Lines 286, 290, 400, 404.
- Impact:
  - Prevents runtime AttributeError in cache token processing.
- Testing:
  - Aligns with correct usage in opentelemetry-instrumentation-openai and traceloop-sdk.
Summary by CodeRabbit
- Refactor
  - Updated cache-related token usage telemetry attributes across OpenTelemetry instrumentation packages for consistent observability data structure.
Walkthrough
This pull request replaces cache-related token attribute constants, switching from GenAIAttributes to SpanAttributes, in two instrumentation packages (Anthropic and LangChain). No control-flow or error-handling changes were made.
Changes
| Cohort / File(s) | Summary |
|---|---|
| Anthropic instrumentation: packages/opentelemetry-instrumentation-anthropic/opentelemetry/instrumentation/anthropic/__init__.py | Replaced GenAIAttributes.GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS with SpanAttributes.GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS and GenAIAttributes.GEN_AI_USAGE_CACHE_CREATION_INPUT_TOKENS with SpanAttributes.GEN_AI_USAGE_CACHE_CREATION_INPUT_TOKENS in _aset_token_usage and _set_token_usage. |
| LangChain instrumentation: packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/span_utils.py | Replaced GenAIAttributes.GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS with SpanAttributes.GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS in set_chat_response. |
Estimated code review effort
π― 2 (Simple) | β±οΈ ~10 minutes
Potential review focus:
- Confirm correct import/source of SpanAttributes and that no other GenAIAttributes usages remain for these keys (a sketch of such a check follows this list).
- Verify unit tests or span-attribute consumers expect the new attribute names.
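A hypothetical repo-wide check for the first point (not part of this PR; the pattern strings are taken from the diff summary above).

```python
from pathlib import Path

# Flag any remaining references to the removed GenAIAttributes cache constants.
PATTERNS = (
    "GenAIAttributes.GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS",
    "GenAIAttributes.GEN_AI_USAGE_CACHE_CREATION_INPUT_TOKENS",
)

for path in Path("packages").rglob("*.py"):
    text = path.read_text(encoding="utf-8", errors="ignore")
    for pattern in PATTERNS:
        if pattern in text:
            print(f"{path}: still references {pattern}")
```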
Possibly related PRs
- traceloop/openllmetry#3437: Performs the same replacement of GenAIAttributes cache-related constants with SpanAttributes equivalents across instrumentation modules.
Poem
π° I hopped through constants, small and bright,
Swapping GenAI for Span in the soft moonlight.
Tokens now rest where names align,
Cache metrics tidy, neat, and fine.
π₯β¨
Pre-merge checks and finishing touches
β Failed checks (2 warnings)
| Check name | Status | Explanation | Resolution |
|---|---|---|---|
| Title check | β οΈ Warning | The title mentions 'openai' but the PR primarily fixes cache token attribute issues in Langchain and Anthropic instrumentation, not OpenAI. | Revise title to accurately reflect the packages being fixed, such as: 'fix: use SpanAttributes instead of GenAIAttributes for cache token attributes in anthropic and langchain' |
| Docstring Coverage | β οΈ Warning | Docstring coverage is 0.00% which is insufficient. The required threshold is 80.00%. | You can run @coderabbitai generate docstrings to improve docstring coverage. |
β Passed checks (1 passed)
| Check name | Status | Explanation |
|---|---|---|
| Description Check | β Passed | Check skipped - CodeRabbitβs high-level summary is enabled. |
β¨ Finishing touches
- [ ] π Generate docstrings
π§ͺ Generate unit tests (beta)
- [ ] Create PR with unit tests
- [ ] Post copyable unit tests in a comment
π Recent review details
π₯ Commits
Reviewing files that changed from the base of the PR and between 1dc9943ed1e8b71a72bfcf3dfa8378d5276b1937 and 9f48bcbce0c5fd41af41c3b58ac7fcbba95e3fe1.
π Files selected for processing (1)
- packages/opentelemetry-instrumentation-anthropic/opentelemetry/instrumentation/anthropic/__init__.py (2 hunks)
π§ Files skipped from review as they are similar to previous changes (1)
- packages/opentelemetry-instrumentation-anthropic/opentelemetry/instrumentation/anthropic/__init__.py
@coderabbitai generate docstrings
β Actions performed
Initiated docstring generation; will generate only if new commits exist.
[!NOTE]
Docstrings generation - SUCCESS
Generated docstrings for this pull request at https://github.com/traceloop/openllmetry/pull/3443