context-gatekeeper
✓ Verified · Scanned 2/17/2026
This skill compacts conversation history and writes a briefing to context/current-summary.md to reduce tokens. It runs local scripts (e.g., python skills/context-gatekeeper/scripts/context_gatekeeper.py) via subprocess.run(...), performing local file reads and writes with no external network or secret access.
From clawhub.ai · v9fe1c8c · 16.3 KB
Scanned from 0.1.1 at 9fe1c8c
$ vett add clawhub.ai/davienzomq/context-gatekeeper
Context Gatekeeper — Full documentation
Purpose
Context Gatekeeper keeps OpenClaw sessions token-efficient by compressing the active conversation. It summarizes the tail of the thread, spots open tasks, and keeps a short log of the latest turns so you only send what matters to the model.
Repository layout
skills/context-gatekeeper/
├── SKILL.md                       # Meta description (triggers and usage)
├── README.md                      # This reference for ClawHub and contributors
├── scripts/
│   ├── context_gatekeeper.py      # Builds the compact summary, highlights, and recent-turn log
│   ├── auto_monitor.py            # Watches history.txt and regenerates the summary automatically
│   └── ensure_context_monitor.sh  # Starts (or restarts) the monitor safely
└── context/
    ├── history.txt                # Rolling log (ROLE: message)
    ├── current-summary.md         # Briefing used before every reply
    └── sample-history.txt         # Test data for quick verification
Scripts overview
context_gatekeeper.py
- Reads the history file (one `ROLE: message` per line) and slices the text into sentences.
- Picks up pending tasks by keyword (`todo`, `task`, `follow-up`, `next step`, etc.) and keeps up to the last four turns.
- Outputs `current-summary.md` containing timestamp, "Compact summary", "Pendencies and next steps", and "Last turns" sections.
- You can tweak output length via `--max-summary-sents`, `--max-pendings`, and `--max-recent-turns`.
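The parsing and pendency-detection steps above can be sketched as follows. This is an illustrative reconstruction, not the script's actual code: the function names (`parse_history`, `find_pendings`, `last_turns`) and the exact keyword list are assumptions based on the description.

```python
# Keywords the README says flag pending work; the exact list is illustrative.
PENDING_KEYWORDS = ("todo", "task", "follow-up", "next step")

def parse_history(text):
    """Split a 'ROLE: message' log into (role, message) turns."""
    turns = []
    for line in text.splitlines():
        if ":" in line:
            role, _, msg = line.partition(":")
            turns.append((role.strip(), msg.strip()))
    return turns

def find_pendings(turns, max_pendings=5):
    """Collect messages containing a pending-task keyword, capped at max_pendings."""
    hits = [msg for _, msg in turns
            if any(k in msg.lower() for k in PENDING_KEYWORDS)]
    return hits[:max_pendings]

def last_turns(turns, n=4):
    """Keep up to the last n turns for the 'Last turns' section."""
    return turns[-n:]
```

The `--max-pendings` and `--max-recent-turns` flags would map onto the `max_pendings` and `n` parameters here.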
auto_monitor.py
- Runs in a loop and watches `context/history.txt` for modification-time changes.
- When new content appears, it executes `context_gatekeeper.py` and logs the event to `/tmp/context-gatekeeper-monitor.log`.
- Ensures the summary is fresh before each answer, supporting 24/7 operation.
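The mtime-watch loop described above can be sketched as a single poll step; `regenerate` stands in for the call to `context_gatekeeper.py`, and the function name is illustrative rather than the monitor's real API.

```python
import os

def watch_once(path, last_mtime, regenerate):
    """Check the history file's mtime; call regenerate() if it changed.

    Returns the mtime to carry into the next poll. A real monitor wraps
    this in a loop with a short sleep between polls, e.g.:
        last = None
        while True:
            last = watch_once("context/history.txt", last, rebuild_summary)
            time.sleep(5)
    """
    mtime = os.path.getmtime(path)
    if mtime != last_mtime:
        regenerate()  # e.g. subprocess.run(["python3", "scripts/context_gatekeeper.py", ...])
    return mtime
```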
ensure_context_monitor.sh
- Checks for an existing `auto_monitor.py` process (`pgrep -f auto_monitor.py`). If none exists, it launches the monitor with `nohup` and saves logs.
- Designed to be called at startup (`STARTUP.md`) so the monitor automatically restarts after `/reset`, `/new`, or any reboot.
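The actual script is shell, but the same check-then-launch logic can be rendered in Python for clarity. This is an illustrative equivalent, not the script itself; the default paths match the layout above.

```python
import subprocess

def monitor_running(pattern="auto_monitor.py"):
    """True if a process matching the pattern exists (same as pgrep -f)."""
    return subprocess.run(["pgrep", "-f", pattern],
                          capture_output=True).returncode == 0

def ensure_monitor(script="skills/context-gatekeeper/scripts/auto_monitor.py",
                   log="/tmp/context-gatekeeper-monitor.log"):
    """Launch the monitor detached if it is not already running."""
    if monitor_running():
        return False
    with open(log, "a") as out:
        # start_new_session detaches from the parent, like nohup does
        subprocess.Popen(["python3", script], stdout=out, stderr=out,
                         start_new_session=True)
    return True
```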
Daily workflow
- Append every incoming and outgoing message as `USER: ...` / `ASSISTANT: ...` in `context/history.txt`.
- The monitor detects the change and rewrites `context/current-summary.md` with the current briefing.
- Before calling the model, load the summary (and the last few turns if necessary) instead of the entire history.
- Run `/session_status` or the equivalent command to capture token consumption before responding.
- Always append a footer line `tokens in <qty> / tokens out <qty>` to every reply so the usage is auditable.
- When you want to force a summary update manually, run:

  python3 scripts/context_gatekeeper.py \
      --history context/history.txt \
      --summary context/current-summary.md
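The append step of the workflow can be sketched as a small helper; the function name is hypothetical, but the `ROLE: message` line format matches what the monitor expects.

```python
def append_turn(history_path, role, message):
    """Append one turn in the 'ROLE: message' format the monitor watches for."""
    with open(history_path, "a", encoding="utf-8") as f:
        f.write(f"{role.upper()}: {message.strip()}\n")
```

Calling this after every exchange keeps `history.txt` current, which in turn triggers the monitor's summary rebuild.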
Publication details
- Published as `context-gatekeeper@0.1.1` (latest release). The version bundle includes this README plus all scripts and context templates, so any user can install via `clawhub install context-gatekeeper`.
- Author: Davi Marques. Repository slug: `context-gatekeeper`.
Installation recipe
- Ensure `skills/context-gatekeeper` exists under `<workspace>/skills`.
- Run `./scripts/ensure_context_monitor.sh` or rely on the STARTUP runner so the monitor stays alive.
- Keep `context/history.txt` updated and inspect `context/current-summary.md` before talking to the model.
- Keep `/session_status` output saved together with each response (the README explains why the token line is mandatory).
Recommendations
- Limit the history file to the essential recent exchanges that drive the next turn.
- Watch `/tmp/context-gatekeeper-monitor.log` for monitor errors or long pauses.
- Update `memory/daily/YYYY-MM-DD.md` and `TOOLS.md` whenever the workflow changes so the rest of the team stays aligned.