fix: use SpanAttributes instead of GenAIAttributes for cache token attributes #3442
base: main
Conversation
Fixed AttributeError by replacing GenAIAttributes with SpanAttributes for the cache-related token attributes (GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS and GEN_AI_USAGE_CACHE_CREATION_INPUT_TOKENS). These cache token attributes are custom extensions added by OpenLLMetry to support prompt caching features (Anthropic, OpenAI) and are not part of the upstream OpenTelemetry incubating semantic conventions; they are defined in the local SpanAttributes class, not in GenAIAttributes.

Changed files:
- opentelemetry-instrumentation-langchain/span_utils.py (line 349)
- opentelemetry-instrumentation-anthropic/__init__.py (lines 286, 290, 400, 404)

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <[email protected]>
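For context, here is a minimal sketch of the swap the commit describes — not the actual diff. The import path (`opentelemetry.semconv_ai`) and the span/variable names are assumptions; the constant names are taken from the commit message above.

```python
from opentelemetry import trace

# Assumed import path for OpenLLMetry's extended conventions package
# (opentelemetry-semantic-conventions-ai), which defines the cache constants.
from opentelemetry.semconv_ai import SpanAttributes

tracer = trace.get_tracer(__name__)

with tracer.start_as_current_span("anthropic.chat") as span:
    # Before (broken): GenAIAttributes.GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS
    # raised AttributeError, because the upstream incubating conventions do not
    # define the cache token constants.
    #
    # After: read the constants from the local SpanAttributes extension class.
    span.set_attribute(SpanAttributes.GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS, 128)
    span.set_attribute(SpanAttributes.GEN_AI_USAGE_CACHE_CREATION_INPUT_TOKENS, 0)
```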
Walkthrough
This pull request replaces references to GenAIAttributes with SpanAttributes for the cache-related token attributes in the Langchain and Anthropic instrumentation packages.

Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~5 minutes
Important
Looks good to me! 👍
Reviewed everything up to 1dc9943 in 58 seconds.
- Reviewed 45 lines of code in 2 files
- Skipped 0 files when reviewing
- Skipped posting 3 draft comments; view those below
1. packages/opentelemetry-instrumentation-anthropic/opentelemetry/instrumentation/anthropic/__init__.py:286
   - Draft comment: Replaced GenAIAttributes with SpanAttributes for cache read tokens in the async token usage block. This ensures the correct custom attribute is used.
   - Reason this comment was not posted: Comment did not seem useful (confidence useful = 0% <= threshold 50%). The comment is purely informative, explaining a change that was made without asking for confirmation or suggesting improvements, so it doesn't align with the rules for good comments.
2. packages/opentelemetry-instrumentation-anthropic/opentelemetry/instrumentation/anthropic/__init__.py:400
   - Draft comment: Switched the cache creation tokens attribute to use SpanAttributes in the sync token usage block to avoid AttributeError.
   - Reason this comment was not posted: Comment did not seem useful (confidence useful = 0% <= threshold 50%). The comment is purely informative, explaining a change that was made; it doesn't ask for any specific action or suggest an improvement, and purely informative comments should be removed.
3. packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/span_utils.py:346
   - Draft comment: Updated the cache read tokens attribute to use SpanAttributes instead of GenAIAttributes, aligning with the custom semantic conventions.
   - Reason this comment was not posted: Comment did not seem useful (confidence useful = 0% <= threshold 50%). The comment is purely informative, describing a change without providing actionable feedback, asking for confirmation, or pointing out a potential issue.
Workflow ID: wflow_uNTK4jwkKuzbZIl9
@coderabbitai generate docstrings
✅ Actions performed: Initiated docstring generation; will generate only if new commits exist.

Note: Docstrings generation - SUCCESS
Docstrings generation was requested by @hayke102 in #3442 (comment). The following files were modified:
- `packages/opentelemetry-instrumentation-anthropic/opentelemetry/instrumentation/anthropic/__init__.py`
- `packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/span_utils.py`
Summary

Fixed `AttributeError` by replacing `GenAIAttributes` with `SpanAttributes` for cache-related token attributes in the Langchain and Anthropic instrumentation packages.

Problem

The code was incorrectly attempting to use `GenAIAttributes.GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS` and `GenAIAttributes.GEN_AI_USAGE_CACHE_CREATION_INPUT_TOKENS`, but these attributes don't exist in the upstream OpenTelemetry incubating semantic conventions. These cache token attributes are custom extensions added by OpenLLMetry to support prompt caching features (used by Anthropic, OpenAI, etc.) and are defined in the local `SpanAttributes` class in the `opentelemetry-semantic-conventions-ai` package.

Solution

Replaced `GenAIAttributes` with `SpanAttributes` for both cache token attributes:
- `GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS`
- `GEN_AI_USAGE_CACHE_CREATION_INPUT_TOKENS`

Files Changed

- `packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/span_utils.py`
- `packages/opentelemetry-instrumentation-anthropic/opentelemetry/instrumentation/anthropic/__init__.py`

Impact
This bug would have caused `AttributeError` at runtime when processing responses that include cache token information (e.g., Anthropic's prompt caching, OpenAI's cached tokens).
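To make that failure mode concrete, here is an illustration using a stand-in class rather than the real conventions module: referencing a constant the class does not define fails immediately with `AttributeError`, before any span attribute is written.

```python
# Stand-in for the upstream incubating GenAI conventions, which (per this PR)
# do not define the cache token constants.
class UpstreamGenAIAttributes:
    GEN_AI_USAGE_INPUT_TOKENS = "gen_ai.usage.input_tokens"


# Accessing the missing constant raises immediately:
# AttributeError: type object 'UpstreamGenAIAttributes' has no attribute
# 'GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS'
UpstreamGenAIAttributes.GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS
```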
Testing

The fix aligns with the pattern already used correctly in other instrumentation packages (a hypothetical sanity check is sketched after this description):
- `opentelemetry-instrumentation-openai` (correctly uses `SpanAttributes.LLM_USAGE_CACHE_READ_INPUT_TOKENS`)
- `traceloop-sdk` (correctly uses `SpanAttributes` for cache attributes)

🤖 Generated with Claude Code
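The hypothetical sanity check referenced above — not part of this PR, and the import path is an assumption — would simply verify that the extension class defines the constants the instrumentations now reference:

```python
# Hypothetical test sketch, not from the PR. Assumes OpenLLMetry's
# opentelemetry-semantic-conventions-ai package exposes SpanAttributes here.
from opentelemetry.semconv_ai import SpanAttributes


def test_cache_token_attributes_are_defined():
    # The Langchain and Anthropic instrumentations now reference these names,
    # so they must exist as string attribute keys on SpanAttributes.
    assert isinstance(SpanAttributes.GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS, str)
    assert isinstance(SpanAttributes.GEN_AI_USAGE_CACHE_CREATION_INPUT_TOKENS, str)
```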
Important

Fixes `AttributeError` by replacing `GenAIAttributes` with `SpanAttributes` for cache token attributes in the Langchain and Anthropic packages.
- Replaces `GenAIAttributes` with `SpanAttributes` for the cache token attributes `GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS` and `GEN_AI_USAGE_CACHE_CREATION_INPUT_TOKENS`.
- Fixes `AttributeError` when processing cache token information in the Langchain and Anthropic packages.
- `span_utils.py` in Langchain: line 349.
- `__init__.py` in Anthropic: lines 286, 290, 400, 404.
- Fixes `AttributeError` in cache token processing.
- Aligns with `opentelemetry-instrumentation-openai` and `traceloop-sdk`.

This description was created by Ellipsis for 1dc9943. You can customize this summary. It will automatically update as commits are pushed.