Problem
When Claude Code compacts conversation context, all dynamic state injected by LoadContext at session start is lost:
- Active work sessions and their progress
- Algorithm phase and ISC criteria
- Learning signal trends
- Key behavioral rules that prevent common violations
The model continues operating but has amnesia about what it was doing, leading to:
- Restarting completed work
- Losing Algorithm phase mid-execution
- Violating rules it was following before compaction
Related to #798 (PostCompactRecovery) and #793 (compaction context loss).
Proposed fix
Add a PostCompact hook that re-injects critical state from disk:
- Active work sessions — from MEMORY/STATE/work.json (last 48h)
- Learning signal trends — from MEMORY/STATE/learning-cache.sh
- Key behavioral rules — hardcoded essentials that prevent common post-compaction violations
- Algorithm state — current phase, ISC criteria, PRD path from the most recent MEMORY/WORK/ entry
All data comes from small cached files — runs in <50ms with no inference calls.
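A minimal sketch of what the hook script could look like, assuming the MEMORY/ layout above. The JSON shape of work.json (a sessions array with updated_at, title, and status fields) is a guess for illustration, not the project's confirmed schema:

```shell
#!/usr/bin/env bash
# post-compact.sh — sketch of the proposed PostCompact hook.
# Paths follow this issue's MEMORY/ layout; the jq field names
# (.sessions, .updated_at, .title, .status) are illustrative assumptions.
set -euo pipefail

build_reminder() {
  local memory="${1:-$HOME/MEMORY}"
  local cutoff
  cutoff=$(( $(date +%s) - 48 * 3600 ))   # only sessions from the last 48h

  echo "<system-reminder>"
  echo "Post-compaction recovery — state restored from disk:"

  # Active work sessions from the cached work.json (updated_at assumed
  # to be epoch seconds)
  if [ -f "$memory/STATE/work.json" ]; then
    jq -r --argjson cutoff "$cutoff" \
      '.sessions[]? | select(.updated_at >= $cutoff)
       | "- active: \(.title) [\(.status)]"' \
      "$memory/STATE/work.json" 2>/dev/null || true
  fi

  # Learning signal trends from the cached snippet
  if [ -f "$memory/STATE/learning-cache.sh" ]; then
    cat "$memory/STATE/learning-cache.sh"
  fi

  # Key behavioral rules — hardcoded essentials
  echo "Rules: do not restart completed work; confirm Algorithm phase before acting."
  echo "</system-reminder>"
}

# Claude Code injects whatever the hook writes to stdout.
build_reminder "$@"
```

Everything is a local file read, so this stays well under the 50ms budget; jq is the only non-builtin dependency, and even that branch is skipped when the cache files are absent.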
Output is a <system-reminder> block written to stdout, which Claude Code injects into the conversation.
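Registration could mirror Claude Code's existing hook events in .claude/settings.json — note that PostCompact is the event name this issue proposes, not one that exists today, and the script path is illustrative:

```json
{
  "hooks": {
    "PostCompact": [
      {
        "hooks": [
          { "type": "command", "command": "bash .claude/hooks/post-compact.sh" }
        ]
      }
    ]
  }
}
```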