---
id: ssr-performance-600-percent
title: 'From 3000ms to 14ms: profiling hot paths and eliminating bottlenecks in TanStack Start'
---

## Executive summary
**Environment:**

Our benchmarks were stable enough to produce very similar results on a range of setups. However, here are the exact environment details we used to run the benchmarks:

- Node.js: v24.12.0
- Hardware: MacBook Pro M3
- OS: macOS 15.7

**Running the benchmark:**

For fast iteration, we set up a single `pnpm bench` command that would concurrently:

- start the built server through `@platformatic/flame` to profile it
  ```sh
  flame run ./dist/server.mjs
  ```
- run `autocannon` to stress the server by firing many requests at it
  ```sh
  autocannon -d 30 -c 100 --warmup [ -d 2 -c 20 ] http://localhost:3000/bench/links-100
  ```

## Finding 1: `URL` is expensive in server hot paths

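As a rough illustration of the cost involved: constructing a `URL` allocates an object and eagerly parses and validates every component of the string, while a hot path that only needs the pathname can extract it with a couple of `indexOf` calls. A minimal sketch with hypothetical helper names (not the actual router code):

```typescript
// Full parse: allocates a URL object and decodes every component.
function pathnameViaURL(rawUrl: string): string {
  return new URL(rawUrl).pathname
}

// Targeted scan: assumes an absolute URL like "http://host/path?query"
// and extracts only the pathname, with no object allocation.
function pathnameViaScan(rawUrl: string): string {
  const schemeEnd = rawUrl.indexOf('//') + 2
  const pathStart = rawUrl.indexOf('/', schemeEnd)
  if (pathStart === -1) return '/'
  const queryStart = rawUrl.indexOf('?', pathStart)
  return queryStart === -1
    ? rawUrl.slice(pathStart)
    : rawUrl.slice(pathStart, queryStart)
}

console.log(pathnameViaURL('http://localhost:3000/bench/links-100?x=1')) // "/bench/links-100"
console.log(pathnameViaScan('http://localhost:3000/bench/links-100?x=1')) // "/bench/links-100"
```

In a per-request hot path, the scan version avoids both the allocation and the full parse; the actual fix is more involved, but this is the principle it exploits.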
This is the difference between "server = a function" and "client = a reactive system".

```typescript
// Before: same code path for client and server
store.subscribe(() => {
  /* ... */
}) // overhead on server
const next = replaceEqualDeep(prev, value) // unnecessary structural sharing

// After: server gets a simple snapshot
```
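The "after" shape boils down to treating the server-side store as a plain value. A minimal sketch of the idea, with hypothetical names (the actual TanStack Store implementation differs):

```typescript
// Hypothetical sketch: on the server there are no re-renders, so a store
// can skip subscriptions and structural sharing and just hold a snapshot.
interface ServerStore<T> {
  state: T
}

function createServerStore<T>(initial: T): ServerStore<T> {
  // No subscriber list, no replaceEqualDeep: a single-pass render
  // reads the state once and never needs change notifications.
  return { state: initial }
}

const store = createServerStore({ count: 1 })
console.log(store.state.count) // 1
```

Because the snapshot is read exactly once per render, both the subscription bookkeeping and the structural-sharing pass become pure overhead that the server path can drop.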
There were many other improvements (client and server) not covered here.
## References

[^v8-fast-properties]: V8 team, "Fast properties in V8" `https://v8.dev/blog/fast-properties`

[^webkit-delete-ic]: WebKit, "A Tour of Inline Caching with Delete" `https://webkit.org/blog/10298/inline-caching-delete/`

[^structural-sharing]: Structural sharing is a pattern from immutable data libraries (Immer, React Query, TanStack Store) where unchanged portions of data structures are reused by reference to minimize allocation and enable cheap equality checks.

[^ssr-streaming]: With streaming SSR and Suspense, the server may render multiple chunks, but each chunk is still a single-pass render with no reactive updates.