From RAG to reproducible agents: five practical reads
We replaced RAG with a virtual filesystem for our AI documentation assistant. The team swapped RAG for a virtual filesystem that lets agents grep, ls, and cat docs directly, cutting boot time to ~100ms and retrieval cost to zero. Outcome engineers get a concrete interface pattern for fast, debuggable context access that simplifies Map and Tech Island work.
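The interface pattern is easy to picture as code. A minimal sketch of the idea, assuming an in-memory store keyed by virtual paths; the class and method names here are illustrative, not the article's actual API:

```python
import re

class VirtualDocFS:
    """In-memory doc store exposing ls/cat/grep, loosely modeled on the
    article's virtual-filesystem idea (names are illustrative)."""

    def __init__(self, docs: dict[str, str]):
        # docs maps virtual paths to file contents, loaded once at boot
        self.docs = docs

    def ls(self, prefix: str = "") -> list[str]:
        # list virtual paths under a prefix, like `ls` on a directory
        return sorted(p for p in self.docs if p.startswith(prefix))

    def cat(self, path: str) -> str:
        # return the full contents of one doc
        return self.docs[path]

    def grep(self, pattern: str) -> list[tuple[str, int, str]]:
        # return (path, line number, matching line) triples
        rx = re.compile(pattern)
        hits = []
        for path, text in self.docs.items():
            for i, line in enumerate(text.splitlines(), 1):
                if rx.search(line):
                    hits.append((path, i, line))
        return hits

fs = VirtualDocFS({
    "guides/setup.md": "Install the CLI.\nRun `init` once.",
    "guides/deploy.md": "Run `deploy` after tests pass.",
})
print(fs.ls("guides/"))
print(fs.grep(r"deploy"))
```

Because everything is an in-process dictionary lookup or regex scan, there is no index to build or vector store to query, which is where the near-zero boot time and cost come from.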
Karpathy shares an ‘LLM Knowledge Base’ architecture that bypasses RAG with an evolving Markdown library maintained by AI. Karpathy proposes an LLM-maintained Markdown knowledge base that is compiled, linted, and cross-linked, replacing RAG for mid-sized datasets. This offers a practical docs-as-artifact approach you can maintain, audit, and version to keep context predictable.
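Treating the knowledge base as a lintable artifact means ordinary static checks apply. A small sketch of one such check, assuming relative Markdown links between files (the regex and function name are mine, not from the proposal):

```python
import re
from pathlib import Path

# matches [text](target) and captures the target path, ignoring #anchors
LINK_RX = re.compile(r"\[[^\]]*\]\(([^)#]+)[^)]*\)")

def lint_links(root: Path) -> list[str]:
    """Report relative Markdown links that point at missing files."""
    problems = []
    for md in sorted(root.rglob("*.md")):
        for target in LINK_RX.findall(md.read_text()):
            if target.startswith(("http://", "https://")):
                continue  # external links are out of scope here
            if not (md.parent / target).exists():
                problems.append(f"{md.name}: broken link -> {target}")
    return problems
```

Run in CI, a check like this gives the "compiles and lints" property: the LLM can rewrite the docs freely, but a broken cross-reference fails the build before it pollutes the agent's context.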
Components of a Coding Agent. Sebastian Raschka breaks coding agents into six essential components, including context, tools, memory, and the harness, and shows how they fit together into a usable pipeline. Use it as a checklist when assembling agent stacks so you don't retrofit missing pieces during production rollouts.
OpenAI superapp plan: Interviews with Codex lead Alexander Embiricos and OpenClaw’s Peter Steinberger. OpenAI plans to merge ChatGPT with Codex, turning code-backed conversation into a first-class platform interface for tooling and orchestration. If agents become the UI, outcome engineers must redesign tool access, capability gating, and orchestration primitives accordingly.
Async Python Is Secretly Deterministic. DBOS makes async Python workflows replayable by deterministically assigning step IDs before the first await, enabling reliable checkpointed recovery. That deterministic execution model gives you a practical foundation for durable, observable agent workflows and safe restart/retry semantics.