Show HN: Context-compact – Summarize agent context instead of truncating it
EmptyDrum Saturday, March 07, 2026

context-compact is real summarization for agents. Not a custom truncation strategy, not a sliding window, not dropping old messages and hoping for the best.

The failure mode is well understood: your context window fills up, you truncate from the top, and the agent loses the thread. It forgets the task it was working on, the file path it just wrote to, the UUID it needs to reference. The conversation breaks.

The problem is that everyone keeps solving it by throwing away information. Truncation is fast to implement and quietly wrong: the agent appears to work until it doesn't, and debugging context loss in a long-running session is painful.

context-compact summarizes old messages via your LLM of choice and replaces them with a compact summary. It fires automatically at 85% context utilization. It preserves UUIDs, file paths, and URLs verbatim, so identifiers survive compaction. It handles histories longer than the summarization model's own context window by chunking sequentially, carrying a running summary forward. Works with Anthropic, OpenAI, or any SDK. Zero dependencies.
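To make the chunked-summarization idea concrete, here is a minimal sketch of the core loop: fold old messages into a running summary one chunk at a time, and re-append any UUIDs, paths, or URLs the model dropped. This is not context-compact's actual API; the function names, regexes, and `summarize` callable are all illustrative assumptions.

```python
import re

# Illustrative patterns for identifiers that must survive compaction
# verbatim (UUIDs, URLs, file paths) -- not the library's actual ones.
UUID_RE = re.compile(
    r"\b[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}\b", re.I
)
URL_RE = re.compile(r"https?://\S+")
PATH_RE = re.compile(r"(?:/[\w.-]+){2,}")


def extract_identifiers(text):
    """Collect identifiers the summary must carry verbatim."""
    found = []
    for pattern in (UUID_RE, URL_RE, PATH_RE):
        found.extend(pattern.findall(text))
    return found


def compact(messages, summarize, chunk_chars=4000):
    """Fold old messages into one running summary, chunk by chunk,
    so histories longer than the summarizer's own window still fit.

    `summarize(prev_summary, chunk_text)` is a caller-supplied hook
    that wraps whatever LLM you like.
    """
    running, chunk, size, preserved = "", [], 0, []
    for msg in messages:
        preserved.extend(extract_identifiers(msg))
        chunk.append(msg)
        size += len(msg)
        if size >= chunk_chars:
            running = summarize(running, "\n".join(chunk))
            chunk, size = [], 0
    if chunk:
        running = summarize(running, "\n".join(chunk))
    # Safety net: re-append any identifiers the model dropped,
    # deduplicated while preserving order.
    missing = [i for i in dict.fromkeys(preserved) if i not in running]
    if missing:
        running += "\nIdentifiers: " + ", ".join(missing)
    return running
```

In a real agent loop you would only call something like this once estimated token usage crosses the threshold (the post says 85% of the context window), replace the summarized prefix of the history with the single summary message, and keep the recent tail intact.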