
fix(context-timeline): detect Opus 1M context window dynamically #548

Merged
davila7 merged 1 commit into davila7:main from mario-hernandez:fix/context-timeline-dynamic-limit on Apr 28, 2026
Conversation

@mario-hernandez
Contributor

@mario-hernandez mario-hernandez commented Apr 27, 2026

Problem

CONTEXT_LIMIT is hardcoded to 200_000 tokens in context-timeline.py. Users on Claude Opus 4.7 with the 1M context window ("model": "opus[1m]" in ~/.claude/settings.json) see the dashboard report 80–90% usage when actual usage is 16–18%, producing constant false alarms and making the meter unusable on long sessions.

Fix

Replace the constant with _detect_context_limit(cwd) which resolves the limit in this order:

  1. CONTEXT_TIMELINE_LIMIT env var — explicit override (e.g. export CONTEXT_TIMELINE_LIMIT=500000).
  2. [1m] / [1000k] / [200k] suffix in the model field of <cwd>/.claude/settings.json or ~/.claude/settings.json.
  3. 200_000 fallback — current behaviour, untouched for users without [1m].

The limit is resolved once per _Builder instance, not per update tick — zero per-event filesystem cost.

Backwards compatibility

  • No new dependencies (still stdlib only).
  • Default behaviour unchanged for users without [1m] in settings.
  • No new hooks, no breaking API changes.

Verification

Tested live on a real Opus 1M session of ~178K tokens:

|                | Before                  | After                       |
|----------------|-------------------------|-----------------------------|
| Header reading | 178,581 / 200,000 (89%) | 178,581 / 1,000,000 (17.9%) |
| Bar colour     | Red / warning           | Cyan / normal               |
| Sidebar meter  | Saturated               | Accurate                    |

Resolution order also unit-tested:

  • env var override → int(env)
  • project .claude/settings.json with [1m] → 1_000_000
  • global ~/.claude/settings.json with [1m] → 1_000_000
  • no settings → 200_000

Files changed

cli-tool/components/hooks/monitoring/context-timeline.py (+44, −8)


Summary by cubic

Fixes incorrect context-usage reporting by dynamically detecting the model’s context window, so Opus 1M sessions show accurate percentages instead of false alarms. Replaces the hardcoded 200k limit with a detected value.

  • Bug Fixes
    • Area: components (cli-tool/components/hooks/monitoring/context-timeline.py).
    • Detects limit once per session via _detect_context_limit(cwd): env var CONTEXT_TIMELINE_LIMIT → [1m]/[1000k]/[200k] in .claude/settings.json → 200_000 fallback.
    • No new components; no docs/components.json regeneration needed.
    • New optional env var: CONTEXT_TIMELINE_LIMIT (int). No secrets.

Written for commit 6c3750a. Summary will update on new commits.

@vercel

vercel Bot commented Apr 27, 2026

@mario-hernandez is attempting to deploy a commit to Daniel Avila's projects Team on Vercel.

A member of the Team first needs to authorize it.

@github-actions github-actions Bot added the review-pending (Component PR awaiting maintainer review) label on Apr 27, 2026
@github-actions
Contributor

👋 Thanks for contributing, @mario-hernandez!

This PR touches cli-tool/components/** and has been marked review-pending.

What happens next

  1. 🤖 Automated security audit runs and posts results on this PR.
  2. 👀 Maintainer review — a human reviewer validates the component with the component-reviewer agent (format, naming, security, clarity).
  3. Merge — once approved, your PR is merged to main.
  4. 📦 Catalog regeneration — the component catalog is rebuilt automatically.
  5. 🚀 Live on aitmpl.com — your component appears on the website after deploy.

While you wait

  • Check the Security Audit comment below for any issues to fix.
  • Make sure your component follows the contribution guide.

This is an automated message. No action is required from you right now — a maintainer will review soon.


@cubic-dev-ai cubic-dev-ai Bot left a comment


No issues found across 1 file

@davila7 davila7 merged commit 262da50 into davila7:main Apr 28, 2026
3 of 5 checks passed