Entity: CaptureLoop
Brief: The core knowledge-accumulation loop. Raw material is captured into a queue, synthesized by an LLM into structured wiki entries, and the queue is then cleared.
Overview
The capture-synthesize loop is the central operational pattern of Ourobor OS. It defines how knowledge moves from the codebase into the persistent wiki without interrupting the development flow.
The Loop
[Code / Architecture Notes]
|
| python ouro/scripts/capture.py --crawl
| python ouro/scripts/capture.py path/to/file.py
| python ouro/scripts/capture.py "raw note"
v
ouro/wiki/capture-queue.md (staging area)
|
| LLM reads queue, synthesizes entries
v
ouro/wiki/entities/ (module docs)
ouro/wiki/patterns/ (reusable patterns)
ouro/wiki/decisions/ (ADRs)
|
| python ouro/scripts/capture.py --pop
v
Entry removed from queue
|
| Update ouro/wiki/index.md
v
Wiki catalog stays current
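The capture step above can be sketched in a few lines of Python. This is a minimal illustration, not the actual `capture.py` implementation: the entry heading format (`## capture <timestamp> (<source>)`) and the `capture()` function signature are assumptions.

```python
from datetime import datetime, timezone
from pathlib import Path

QUEUE = Path("ouro/wiki/capture-queue.md")

def capture(raw: str, queue: Path = QUEUE, source: str = "note") -> None:
    """Append one raw capture to the staging queue (entry format is assumed)."""
    queue.parent.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now(timezone.utc).isoformat(timespec="seconds")
    with queue.open("a", encoding="utf-8") as f:
        # One markdown heading per entry keeps the queue human-readable
        # and gives the synthesizing LLM clear entry boundaries.
        f.write(f"\n## capture {stamp} ({source})\n\n{raw}\n")
```

Appending rather than rewriting keeps captures cheap enough to trigger mid-session without breaking flow.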
Roles
| Actor | Responsibility |
|---|---|
| Developer | Triggers captures; curates what gets staged |
| `capture.py` | Moves raw content into the queue |
| LLM agent | Reads queue, synthesizes structured docs, pops processed entries |
| `index.md` | Maintained as the always-current catalog |
When to Trigger a Capture
- After writing or significantly refactoring a module
- When making an architectural decision worth preserving
- When identifying a reusable pattern
- At the start of a session: use `--crawl --git` to catch drift on recently touched files; use bare `--crawl` only for initial wiki population
- Before a release (verify parity between code and wiki)
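The session-start drift check can be sketched as a filter over git's changed-file list. The function name `files_to_recrawl` and the filtering rules (Python sources only, wiki output excluded) are assumptions about what a `--crawl --git` pass might scan, not the actual implementation:

```python
def files_to_recrawl(git_diff_output: str) -> list[str]:
    """Filter `git diff --name-only` output down to sources worth re-capturing.

    A sketch: keeps Python files, skips the wiki itself so the crawl
    never re-captures its own output.
    """
    changed = [line.strip() for line in git_diff_output.splitlines() if line.strip()]
    return sorted(
        path for path in changed
        if path.endswith(".py") and not path.startswith("ouro/wiki/")
    )
```

In practice the input would come from something like `git diff --name-only HEAD~5`, so only files touched in recent commits are re-crawled.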
Synthesis Guidelines
When the LLM processes a queue entry:
- Read the full capture to understand context.
- Check `index.md` and existing entities; don't create duplicates.
- Determine the correct destination: `entities/`, `patterns/`, or `decisions/`.
- Apply required tags: `@entity` and `@brief` are mandatory.
- Extract critical code logic using `@snippet` blocks.
- Run `capture.py --pop` to remove the processed entry.
- Update `index.md` with a link to the new file.
Note: Synthesis is not automated — it requires an LLM agent in an active session. The queue is a buffer and staging area, not an autonomous pipeline.
Warning: Avoid letting the queue grow stale. Long queues lose context fidelity and become expensive for the LLM to process in a single session. Process incrementally, not in bulk.
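One way to act on the staleness warning is a simple depth check before starting a session. As above, the `## ` heading-per-entry convention and the threshold of 10 are assumptions for illustration:

```python
from pathlib import Path

def queue_depth(queue: Path, delimiter: str = "## ") -> int:
    """Count staged entries, assuming one `## ` heading per capture."""
    if not queue.exists():
        return 0
    return sum(
        1 for line in queue.read_text(encoding="utf-8").splitlines()
        if line.startswith(delimiter)
    )

def is_stale(queue: Path, limit: int = 10) -> bool:
    """True when the queue has likely outgrown single-session processing."""
    return queue_depth(queue) > limit
```

A check like this could run at session start, prompting incremental processing before the queue loses context fidelity.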