
How Telemetry Settings Are Silently Costing You Cache Tiers (And How To Fix It)

A confirmed bug links telemetry settings to cache TTL: disabling telemetry silently drops sessions to the 5-minute cache tier, increasing token costs. Environment variables and monitoring hooks can mitigate the impact.

Gala Smith & AI Research Desk · 4h ago · 4 min read · AI-Generated
Source: github.com via hn_claude_code, reddit_claude (single source)

What Changed — The Telemetry-Cache Bug

A confirmed GitHub issue (#45381) reveals that disabling telemetry in Claude Code—via DISABLE_TELEMETRY=1 or CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC=1—now also disables access to the 1-hour prompt cache TTL. Sessions that would normally qualify for the longer cache tier automatically fall back to the 5-minute TTL when telemetry is off.

This bug compounds a previously unannounced change from April 2, 2025, when Anthropic quietly switched the default cache TTL from 1 hour to 5 minutes for most users. The telemetry bug means privacy-conscious developers who disable tracking are now doubly penalized with shorter cache windows and higher token costs.

How To Check Your Cache Tier

You can verify which cache tier your sessions are using by examining the session transcript metadata. Look for these fields in the assistant response:

"usage": {
  "cache_creation": {
    "ephemeral_1h_input_tokens": 0,
    "ephemeral_5m_input_tokens": 12345
  }
}

If ephemeral_1h_input_tokens is non-zero, you're on the 1-hour tier. If ephemeral_5m_input_tokens is non-zero, you're stuck with 5 minutes.
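The check above can be automated with a small helper. This is a minimal sketch: the `cache_tier` function name is ours, but the field names match the transcript metadata shown above.

```python
def cache_tier(usage: dict) -> str:
    """Classify a session's cache tier from the usage.cache_creation
    block in the assistant response metadata."""
    creation = usage.get("cache_creation", {})
    if creation.get("ephemeral_1h_input_tokens", 0) > 0:
        return "1h"
    if creation.get("ephemeral_5m_input_tokens", 0) > 0:
        return "5m"
    return "none"

# The metadata shape from the example above:
usage = {"cache_creation": {"ephemeral_1h_input_tokens": 0,
                            "ephemeral_5m_input_tokens": 12345}}
print(cache_tier(usage))  # -> 5m
```

Point it at each assistant message in your session transcript to spot when a session falls back to the 5-minute tier.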

The Cost Impact

Data from users tracking their sessions shows the financial impact:

  • Before April 2 (1-hour cache): ~39 cache busts per day, ~$6.28/day in bust-triggered costs
  • After April 2 (5-minute cache): ~199 cache busts per day (5.1× increase), ~$15.54/day
  • Monthly delta: Approximately $277.80 extra
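The monthly delta follows directly from the daily figures, assuming a 30-day month:

```python
# Reproduce the monthly delta from the daily figures above.
before = 6.28    # $/day in bust-triggered costs, 1-hour cache
after = 15.54    # $/day, 5-minute cache
daily_delta = after - before        # 9.26
monthly_delta = daily_delta * 30    # 277.80 over a 30-day month
bust_ratio = 199 / 39               # ~5.1x more cache busts per day
print(f"${monthly_delta:.2f}/month extra, {bust_ratio:.1f}x busts")
```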

The shorter cache also triggers a compounding problem: when cache expires mid-session, Claude loses confidence in previously read files and starts re-reading them, padding conversation history and making the next cache rebuild even more expensive.

Immediate Workarounds

1. Re-enable Telemetry (Temporarily)

If you need the 1-hour cache tier, you'll need to keep telemetry enabled until Anthropic fixes this bug. Remove these environment variables:

# Remove these from your shell profile or session:
unset DISABLE_TELEMETRY
unset CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC

2. Use the Bedrock Workaround

If you're using Claude via AWS Bedrock, set this environment variable:

export ENABLE_PROMPT_CACHING_1H_BEDROCK=1

This should restore 1-hour caching regardless of telemetry settings.

3. Install Cache Monitoring Hooks

The claude-memory plugin from the Claudest repository includes hooks that warn you about cache expiration:

/plugin marketplace add gupsammy/claudest
/plugin install claude-memory@claudest

Then run /get-token-insights to analyze your patterns. The plugin offers to install these hooks automatically:

  • plugins/claude-memory/hooks/cache-warn-stop.py
  • plugins/claude-memory/hooks/cache-expiry-warn.py
  • plugins/claude-memory/hooks/cache-warn-3min.sh

Add them to ~/.claude/settings.json under the appropriate event handlers.
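As an illustration, a hook registration might look like the following. The `Stop` event name and the install path are assumptions here; check the plugin's own documentation for the exact event each hook targets.

```json
{
  "hooks": {
    "Stop": [
      {
        "hooks": [
          {
            "type": "command",
            "command": "python3 ~/.claude/plugins/claude-memory/hooks/cache-warn-stop.py"
          }
        ]
      }
    ]
  }
}
```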

4. Adjust Your Context Strategy

Since cache busts are now more frequent, limit the damage:

{
  "env": {
    "CLAUDE_CODE_DISABLE_1M_CONTEXT": "1"
  }
}

This caps context at 200K tokens instead of 1 million. When cache expires, you rebuild from scratch—smaller context means cheaper rebuilds.
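For a rough sense of scale, here is the rebuild-cost difference under a hypothetical cache-write rate; the $3.75 per million input tokens figure is an assumption for illustration, as actual pricing varies by model and cache tier.

```python
# Illustrative rebuild-cost comparison. The cache-write rate below
# is an assumed figure, not official pricing.
PRICE_PER_MTOK = 3.75  # assumed $ per million input tokens written to cache

for ctx in (200_000, 1_000_000):
    cost = ctx / 1_000_000 * PRICE_PER_MTOK
    print(f"{ctx:>9,} tokens -> ${cost:.2f} per full cache rebuild")
```

Whatever the exact rate, the ratio holds: a 200K-token context costs one fifth as much to rebuild as a 1M-token context.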

5. Fix Git-Related Cache Busts

Another cache issue involves git status updates. Use:

CLAUDE_CODE_DISABLE_GIT_INSTRUCTIONS=1 claude "Hello"

Or set includeGitInstructions=false in settings.json.

When To Start Fresh

Boris, the creator of Claude Code, recommends in GitHub issue #45756: if you leave an agent session open too long (causing full cache misses), it's better to start a new conversation rather than continue with expensive cache rebuilds.

The Bigger Picture

This telemetry-cache coupling appears to be an unintended side effect of how Claude Code manages session metadata. The cache system likely relies on telemetry data to determine tier eligibility, and when that data stream is cut off, it defaults to the lowest tier.

Until Anthropic releases a fix, developers face a trade-off: privacy (disabling telemetry) versus cost efficiency (1-hour cache tier). For now, most users will need to choose based on their priorities and budget.

Action Items:

  1. Check your current cache tier using the metadata method above
  2. Decide whether to temporarily re-enable telemetry for cost savings
  3. Install monitoring hooks to track cache bust frequency
  4. Consider capping your context window to limit rebuild costs
  5. Use /feedback in Claude Code to report your experience—the team is actively investigating

AI Analysis

Claude Code users should immediately check which cache tier they're on by examining session metadata. If you see `ephemeral_5m_input_tokens` and you've disabled telemetry, you're experiencing this bug, and developers who disabled telemetry for privacy reasons now face a direct cost impact: at 5-minute intervals, cache busts occur far more often, with the tracking data above showing roughly a 5× increase in daily busts. If you work in long sessions or run backgrounded tasks (like `/loop` commands that take over 5 minutes), every return triggers a full cache rebuild. Adjust your workflow: 1) consider temporarily re-enabling telemetry if cost is a concern, 2) install the `claude-memory` plugin and run `/get-token-insights` to see your actual bust patterns, 3) cap your context at 200K tokens with `CLAUDE_CODE_DISABLE_1M_CONTEXT=1` to make rebuilds cheaper, and 4) be more aggressive about starting fresh conversations rather than continuing sessions with expired caches.
