perf/server-latency-buffering : Buffer latencies in memory instead of per-message file I/O (#109)
* perf: Buffer latencies in memory instead of per-message file I/O
Replace per-message file writes with in-memory buffering for one-way
latency measurements. The server now accumulates latencies in a Vec
during the test and writes them all at once when the test completes.
This fixes a severe performance bottleneck in blocking mode, where
per-message file I/O overhead in high-throughput tests (284K msg/sec)
made one-way latency appear as 241ms instead of the actual ~29us.
Affected modes:
- Blocking server mode (all mechanisms)
- Async server mode (SHM, PMQ)
Cherry-picked from container-to-container-ipc branch (7815318).
AI-assisted-by: Claude claude-4.6-opus-high-thinking (Anthropic)
Made-with: Cursor
* refactor: Extract latency buffering helpers with unit tests
- Extract should_buffer_latency() to decide whether a latency value
should be buffered (excludes warmup canary messages with id == u64::MAX)
- Extract write_latency_buffer() for batch file output of buffered
latencies in one-value-per-line decimal format
- Both async and blocking server functions now call shared helpers
instead of duplicated inline logic
- Add 7 unit tests covering canary exclusion, normal message inclusion,
disabled buffering, file format, empty buffer, round-trip parsing,
and invalid path error handling
- Upgrade bytes 1.10.1 -> 1.11.1 to fix RUSTSEC-2026-0007
- Add .cargo/audit.toml to ignore RUSTSEC-2026-0009 (time crate,
MSRV 1.70 constraint prevents upgrade)
- All tests passing, zero clippy warnings
AI-assisted-by: Claude Sonnet 4 (claude-sonnet-4-20250514)
Made-with: Cursor
* fix: Write latency buffer before propagating transport close errors
- Blocking path: capture close_blocking() result, write latency file,
then propagate error — prevents lost latency data on close failure
- Async path: capture close() result, write latency file, then log a
  warning on close error (previously discarded silently with `let _ =`)
- All tests passing
AI-assisted-by: Claude claude-4.6-opus-high-thinking
Made-with: Cursor