
Conversation


@hayke102 hayke102 commented Nov 13, 2025

Summary

Fixed AttributeError by replacing GenAIAttributes with SpanAttributes for cache-related token attributes in the Langchain and Anthropic instrumentation packages.

Problem

The code was incorrectly attempting to use GenAIAttributes.GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS and GenAIAttributes.GEN_AI_USAGE_CACHE_CREATION_INPUT_TOKENS, but these attributes don't exist in the upstream OpenTelemetry incubating semantic conventions.

These cache token attributes are custom extensions added by OpenLLMetry to support prompt caching features (used by Anthropic, OpenAI, etc.) and are defined in the local SpanAttributes class in the opentelemetry-semantic-conventions-ai package.
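A minimal sketch of the failure mode (the upstream import path below is an assumption based on the standard OpenTelemetry semantic-conventions package layout; the SpanAttributes import reflects the opentelemetry-semantic-conventions-ai package in this repo):

```python
# Illustrative only -- not code from this PR.
# Assumed upstream module for the incubating GenAI attributes:
from opentelemetry.semconv._incubating.attributes import gen_ai_attributes as GenAIAttributes
# OpenLLMetry's own semantic conventions package:
from opentelemetry.semconv_ai import SpanAttributes

# The upstream module defines no cache-token constants, so this raises AttributeError:
# GenAIAttributes.GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS

# The cache-token constants are OpenLLMetry extensions defined on SpanAttributes:
SpanAttributes.GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS  # resolves to a string attribute key
SpanAttributes.GEN_AI_USAGE_CACHE_CREATION_INPUT_TOKENS
```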

Solution

Replaced GenAIAttributes with SpanAttributes for both cache token attributes:

  • GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS
  • GEN_AI_USAGE_CACHE_CREATION_INPUT_TOKENS
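A before/after sketch of the call-site pattern this PR switches to (the wrapper function and argument names here are hypothetical; set_span_attribute is the helper from the Anthropic instrumentation's utils module referenced in the review below, with its signature assumed to be (span, name, value)):

```python
from opentelemetry.semconv_ai import SpanAttributes
from opentelemetry.instrumentation.anthropic.utils import set_span_attribute


def _record_cache_tokens(span, cache_read_tokens, cache_creation_tokens):
    # Before: GenAIAttributes.GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS does not exist
    # upstream, so the old code raised AttributeError as soon as cache usage appeared.

    # After: the OpenLLMetry-specific constants come from SpanAttributes.
    set_span_attribute(
        span, SpanAttributes.GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS, cache_read_tokens
    )
    set_span_attribute(
        span, SpanAttributes.GEN_AI_USAGE_CACHE_CREATION_INPUT_TOKENS, cache_creation_tokens
    )
```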

Files Changed

  1. packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/span_utils.py

    • Line 349: Fixed cache read tokens attribute
  2. packages/opentelemetry-instrumentation-anthropic/opentelemetry/instrumentation/anthropic/__init__.py

    • Lines 286, 290, 400, 404: Fixed both the cache read and cache creation token attributes in the sync and async functions

Impact

This bug would have caused AttributeError at runtime when processing responses that include cache token information (e.g., Anthropic's prompt caching, OpenAI's cached tokens).

Testing

The fix aligns with the pattern already used correctly in other instrumentation packages:

  • opentelemetry-instrumentation-openai (correctly uses SpanAttributes.LLM_USAGE_CACHE_READ_INPUT_TOKENS)
  • traceloop-sdk (correctly uses SpanAttributes for cache attributes)
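No new tests accompany the change; a hypothetical local sanity check (not part of this PR) could simply assert that the constants exist on SpanAttributes:

```python
from opentelemetry.semconv_ai import SpanAttributes


def test_cache_token_attributes_exist():
    # If either constant were missing, the instrumentation call sites would fail
    # with AttributeError again whenever cache token usage is reported.
    assert hasattr(SpanAttributes, "GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS")
    assert hasattr(SpanAttributes, "GEN_AI_USAGE_CACHE_CREATION_INPUT_TOKENS")
```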

🤖 Generated with Claude Code


Important

Fixes AttributeError by replacing GenAIAttributes with SpanAttributes for cache token attributes in Langchain and Anthropic packages.

  • Behavior:
    • Replaced GenAIAttributes with SpanAttributes for cache token attributes GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS and GEN_AI_USAGE_CACHE_CREATION_INPUT_TOKENS.
    • Fixes AttributeError when processing cache token information in Langchain and Anthropic packages.
  • Files Changed:
    • span_utils.py in Langchain: Line 349.
    • __init__.py in Anthropic: Lines 286, 290, 400, 404.
  • Impact:
    • Prevents runtime AttributeError in cache token processing.
  • Testing:
    • Aligns with correct usage in opentelemetry-instrumentation-openai and traceloop-sdk.

This description was created by Ellipsis for 1dc9943. You can customize this summary. It will automatically update as commits are pushed.

Summary by CodeRabbit

  • Refactor
    • Updated cache-related token usage telemetry attributes across OpenTelemetry instrumentation packages for consistent observability data structure.

…tributes

Fixed AttributeError by replacing GenAIAttributes with SpanAttributes for
cache-related token attributes (GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS and
GEN_AI_USAGE_CACHE_CREATION_INPUT_TOKENS).

These cache token attributes are custom extensions added by OpenLLMetry to
support prompt caching features (Anthropic, OpenAI) and are not part of the
upstream OpenTelemetry incubating semantic conventions. They are defined in
the local SpanAttributes class, not in GenAIAttributes.

Changed files:
- opentelemetry-instrumentation-langchain/span_utils.py (line 349)
- opentelemetry-instrumentation-anthropic/__init__.py (lines 286, 290, 400, 404)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <[email protected]>

CLAassistant commented Nov 13, 2025

CLA assistant check
All committers have signed the CLA.


coderabbitai bot commented Nov 13, 2025

Walkthrough

This pull request replaces references to GenAIAttributes cache-related token constants with their SpanAttributes equivalents across two instrumentation packages. In Anthropic instrumentation, two cache token methods are updated. In LangChain instrumentation, one cache token reference is updated. No control flow changes are introduced.

Changes

Cohort: Cache token attribute constant updates

  • packages/opentelemetry-instrumentation-anthropic/opentelemetry/instrumentation/anthropic/__init__.py: Replaced GenAIAttributes.GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS and GenAIAttributes.GEN_AI_USAGE_CACHE_CREATION_INPUT_TOKENS with their SpanAttributes equivalents in the _aset_token_usage and _set_token_usage methods.
  • packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/span_utils.py: Replaced GenAIAttributes.GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS with SpanAttributes.GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS in the set_chat_response method.

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~5 minutes

Possibly related PRs

  • PR #3437: Performs identical constant replacements across different instrumentation modules, changing from GenAIAttributes to SpanAttributes for cache-related token metrics.

Suggested reviewers

  • nirga

Poem

🐰 From GenAI to Span, the tokens find their way,
Cache metrics dancing in attributes' ballet,
Constants realigned, consistent and bright,
Two packages harmonized, everything right! ✨

Pre-merge checks and finishing touches

❌ Failed checks (1 warning)
  • Docstring Coverage (⚠️ Warning): Docstring coverage is 0.00%, which is below the required threshold of 80.00%. Run @coderabbitai generate docstrings to improve docstring coverage.
✅ Passed checks (2 passed)
  • Description Check (✅ Passed): Check skipped; CodeRabbit's high-level summary is enabled.
  • Title Check (✅ Passed): The pull request title clearly and concisely describes the main change: replacing GenAIAttributes with SpanAttributes for cache token attributes across two instrumentation packages.

📜 Recent review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

Disabled knowledge base sources:

  • Linear integration is disabled by default for public repositories

You can enable these sources in your CodeRabbit configuration.

📥 Commits

Reviewing files that changed from the base of the PR and between 8b91ed7 and 1dc9943.

📒 Files selected for processing (2)
  • packages/opentelemetry-instrumentation-anthropic/opentelemetry/instrumentation/anthropic/__init__.py (2 hunks)
  • packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/span_utils.py (1 hunks)
🧰 Additional context used
📓 Path-based instructions (1)
**/*.py

📄 CodeRabbit inference engine (CLAUDE.md)

**/*.py: Store API keys only in environment variables/secure vaults; never hardcode secrets in code
Use Flake8 for code linting and adhere to its rules

Files:

  • packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/span_utils.py
  • packages/opentelemetry-instrumentation-anthropic/opentelemetry/instrumentation/anthropic/__init__.py
🧬 Code graph analysis (2)
packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/span_utils.py (1)
packages/opentelemetry-semantic-conventions-ai/opentelemetry/semconv_ai/__init__.py (1)
  • SpanAttributes (64-245)
packages/opentelemetry-instrumentation-anthropic/opentelemetry/instrumentation/anthropic/__init__.py (2)
packages/opentelemetry-semantic-conventions-ai/opentelemetry/semconv_ai/__init__.py (1)
  • SpanAttributes (64-245)
packages/opentelemetry-instrumentation-anthropic/opentelemetry/instrumentation/anthropic/utils.py (1)
  • set_span_attribute (21-25)
🔇 Additional comments (3)
packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/span_utils.py (1)

349-349: LGTM! Correct attribute reference fix.

The change from GenAIAttributes.GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS to SpanAttributes.GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS is correct. Cache token attributes are custom OpenLLMetry extensions defined in SpanAttributes (as shown in the relevant snippets), not part of the standard GenAI attributes. This fix prevents the AttributeError that would occur when accessing the non-existent attribute from GenAIAttributes.

packages/opentelemetry-instrumentation-anthropic/opentelemetry/instrumentation/anthropic/__init__.py (2)

285-292: LGTM! Correct attribute reference fix in async function.

The changes from GenAIAttributes to SpanAttributes for both cache token attributes (GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS and GEN_AI_USAGE_CACHE_CREATION_INPUT_TOKENS) are correct. These are custom OpenLLMetry extensions defined in SpanAttributes, not part of the standard GenAI semantic conventions. This fix prevents AttributeError when processing Anthropic responses with cache token information.


399-406: LGTM! Correct attribute reference fix in sync function.

The changes mirror the async version (_aset_token_usage) and correctly use SpanAttributes for the cache token attributes. This ensures consistent behavior between synchronous and asynchronous code paths.




@ellipsis-dev ellipsis-dev bot left a comment


Important

Looks good to me! 👍

Reviewed everything up to 1dc9943 in 58 seconds. Details:
  • Reviewed 45 lines of code in 2 files
  • Skipped 0 files when reviewing.
  • Skipped posting 3 draft comments. View those below.
  • Modify your settings and rules to customize what types of comments Ellipsis leaves. And don't forget to react with 👍 or 👎 to teach Ellipsis.
1. packages/opentelemetry-instrumentation-anthropic/opentelemetry/instrumentation/anthropic/__init__.py:286
  • Draft comment:
    Replaced GenAIAttributes with SpanAttributes for cache read tokens in the async token usage block. This ensures the correct custom attribute is used.
  • Reason this comment was not posted:
    Comment did not seem useful (confidence: useful = 0% <= threshold 50%). This comment is purely informative, explaining a change that was made without asking for confirmation or suggesting improvements. It doesn't align with the rules for good comments, as it doesn't provide a specific suggestion or ask for confirmation of intent.
2. packages/opentelemetry-instrumentation-anthropic/opentelemetry/instrumentation/anthropic/__init__.py:400
  • Draft comment:
    Switched cache creation tokens attribute to use SpanAttributes in the sync token usage block to avoid AttributeError.
  • Reason this comment was not posted:
    Comment did not seem useful (confidence: useful = 0% <= threshold 50%). This comment seems to be purely informative, explaining a change that was made. It doesn't ask for any specific action or provide a suggestion for improvement. According to the rules, purely informative comments should be removed.
3. packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/span_utils.py:346
  • Draft comment:
    Updated cache read tokens attribute to use SpanAttributes instead of GenAIAttributes, aligning with the custom semantic conventions.
  • Reason this comment was not posted:
    Comment did not seem useful (confidence: useful = 0% <= threshold 50%). This comment is purely informative, describing a change made in the code without providing any actionable feedback or suggestions. It doesn't ask for confirmation or suggest improvements, nor does it point out any potential issues. Therefore, it violates the rule against making purely informative comments.

Workflow ID: wflow_uNTK4jwkKuzbZIl9

You can customize Ellipsis by changing your verbosity settings, reacting with 👍 or 👎, replying to comments, or adding code review rules.

@hayke102 (Author) commented:

@coderabbitai generate docstrings


coderabbitai bot commented Nov 13, 2025

✅ Actions performed

Initiated docstring generation; will generate only if new commits exist.


coderabbitai bot commented Nov 13, 2025

Note

Docstrings generation - SUCCESS
Generated docstrings for this pull request at #3443

coderabbitai bot added a commit that referenced this pull request Nov 13, 2025
Docstrings generation was requested by @hayke102.

* #3442 (comment)

The following files were modified:

* `packages/opentelemetry-instrumentation-anthropic/opentelemetry/instrumentation/anthropic/__init__.py`
* `packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/span_utils.py`
