
feat: add rating to CrUX field data metrics#1190

Open
mvanhorn wants to merge 2 commits into ChromeDevTools:main from mvanhorn:osc/1055-crux-rating

Conversation

@mvanhorn
Contributor

Summary

  • Adds Web Vitals ratings ([good], [needs-improvement], [poor]) to each CrUX field metric in the performance trace summary output
  • Ratings use the official Web Vitals thresholds from https://web.dev/articles/vitals:
    • LCP: good <= 2500ms, poor >= 4000ms
    • CLS: good <= 0.1, poor >= 0.25
    • INP: good <= 200ms, poor >= 500ms
    • FCP: good <= 1800ms, poor >= 3000ms
    • TTFB: good <= 800ms, poor >= 1800ms
  • Post-processes the DevTools formatter output to append ratings to CrUX field metric lines without modifying the bundled DevTools frontend
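The thresholds above can be read as a simple two-boundary lookup per metric. As a rough sketch (the names and shapes here are illustrative, not the PR's actual code):

```typescript
type Rating = 'good' | 'needs-improvement' | 'poor';

// [goodMax, poorMin] per metric, per https://web.dev/articles/vitals
// (values in ms, except CLS which is unitless).
const THRESHOLDS: Record<string, [number, number]> = {
  LCP: [2500, 4000],
  CLS: [0.1, 0.25],
  INP: [200, 500],
  FCP: [1800, 3000],
  TTFB: [800, 1800],
};

function rateMetric(metric: string, value: number): Rating | undefined {
  const thresholds = THRESHOLDS[metric];
  if (!thresholds) return undefined; // unknown metric: leave the line unrated
  const [goodMax, poorMin] = thresholds;
  if (value <= goodMax) return 'good';
  if (value >= poorMin) return 'poor';
  return 'needs-improvement';
}
```

For example, `rateMetric('LCP', 2595)` yields `'needs-improvement'`, matching the example output below (2595 ms is above the 2500 ms "good" boundary but below the 4000 ms "poor" boundary).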

Example output change

Before:

  - LCP: 2595 ms (scope: url)
  - INP: 140 ms (scope: url)
  - CLS: 0.06 (scope: url)

After:

  - LCP: 2595 ms (scope: url) [needs-improvement]
  - INP: 140 ms (scope: url) [good]
  - CLS: 0.06 (scope: url) [good]

Fixes #1055

Test plan

  • Added 10 unit tests for addRatingsToCruxMetrics covering all metrics (LCP, INP, CLS, FCP, TTFB), all rating levels, non-CrUX line passthrough, and multi-line summary handling
  • All existing tests continue to pass
  • TypeScript compiles cleanly

This contribution was developed with AI assistance (Claude Code).

Add Web Vitals ratings (good/needs-improvement/poor) to each CrUX field
metric in the performance trace summary output. Ratings are based on the
official Web Vitals thresholds:
- LCP: good <= 2500ms, poor >= 4000ms
- CLS: good <= 0.1, poor >= 0.25
- INP: good <= 200ms, poor >= 500ms
- FCP: good <= 1800ms, poor >= 3000ms
- TTFB: good <= 800ms, poor >= 1800ms

Fixes ChromeDevTools#1055
@wolfib
Contributor

wolfib commented Mar 19, 2026

Thanks! I like the end result of this!

How about adding the ratings directly in PerformanceTraceFormatter.formatTraceSummary(), instead of using regexes to add them to the formatted text? That way the whole summary would be generated in a single place, instead of being built, analyzed, and then appended to.

Replace regex post-processing of formatter output with direct access
to CrUX field metrics via DevTools.TraceEngine.Insights.Common. Ratings
are now computed from structured data and injected into the summary,
avoiding fragile string parsing.
@mvanhorn
Contributor Author

Replaced regex post-processing with buildRatedCruxSection(), which accesses CrUX metrics directly via the structured API. Ratings are now computed from numeric data in formatTraceSummary() rather than by parsing formatted strings.
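A minimal sketch of what such a structured-data approach could look like — the `CruxMetric` shape and helper names here are hypothetical stand-ins, not the actual DevTools or chrome-devtools-mcp API:

```typescript
type Rating = 'good' | 'needs-improvement' | 'poor';
type MetricName = 'LCP' | 'CLS' | 'INP' | 'FCP' | 'TTFB';

// Hypothetical structured metric, as it might arrive from the CrUX data
// rather than from a formatted summary line.
interface CruxMetric {
  name: MetricName;
  value: number; // ms, except CLS (unitless)
  scope: 'url' | 'origin';
}

// [goodMax, poorMin] boundaries from https://web.dev/articles/vitals
const THRESHOLDS: Record<MetricName, [number, number]> = {
  LCP: [2500, 4000],
  CLS: [0.1, 0.25],
  INP: [200, 500],
  FCP: [1800, 3000],
  TTFB: [800, 1800],
};

function rate(name: MetricName, value: number): Rating {
  const [goodMax, poorMin] = THRESHOLDS[name];
  if (value <= goodMax) return 'good';
  if (value >= poorMin) return 'poor';
  return 'needs-improvement';
}

// Builds the rated section from numeric values directly, so no string
// parsing of previously formatted output is needed.
function buildRatedCruxSection(metrics: CruxMetric[]): string {
  return metrics
    .map(m => {
      const formatted = m.name === 'CLS' ? `${m.value}` : `${m.value} ms`;
      return `- ${m.name}: ${formatted} (scope: ${m.scope}) [${rate(m.name, m.value)}]`;
    })
    .join('\n');
}
```

The key design difference from the first commit: ratings are computed where the numbers are still numbers, so there is no fragile regex coupling to the formatter's output format.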

@wolfib
Contributor

wolfib commented Mar 20, 2026

Oh, in my first review I missed that the PerformanceTraceFormatter is part of the DevTools repo, and does not live in chrome-devtools-mcp. I still think that ideally all the formatting should happen in a single place, and not be spread over multiple places.

@jackfranklin Would DevTools also benefit from having CrUX ratings added to PerformanceTraceFormatter.formatTraceSummary()? Or, if it doesn't make sense for DevTools, how about adding an includeRating boolean param to formatTraceSummary that would control whether ratings are included in the output?

@jackfranklin
Contributor

I like the idea, but yes we should add that in the trace formatter on the DevTools side, and then roll that into the MCP server.

@mvanhorn
Contributor Author

@wolfib @jackfranklin Got it - makes sense to add the rating logic in the DevTools trace formatter first and then have the MCP server pick it up from there. I can open a PR against the DevTools repo for the PerformanceTraceFormatter changes if that's helpful, or would you prefer to handle it on that side?

@jackfranklin
Contributor

CLs (or PRs in GitHub terms!) on DevTools are very welcome; see here for how to get the codebase up and running: https://chromium.googlesource.com/devtools/devtools-frontend/+/refs/heads/chromium/3965/README.md

If you send a CL, please add jacktfranklin@chromium.org as a reviewer.

@mvanhorn
Contributor Author

Thanks for the link. I'll set up the DevTools frontend environment and submit a CL adding rating support to PerformanceTraceFormatter.formatTraceSummary() with you as reviewer. Will update here once it's up.



Development

Successfully merging this pull request may close these issues.

CrUX field data should include rating if result is good or bad

3 participants