PATTERN
Read-only curated example filesystem¶
Pattern¶
Instead of leaving the LLM to rely on its parametric knowledge of a library — or to fetch examples via web-search RAG (exposing the pipeline to the telephone game) — co-maintain with the library vendor a read-only filesystem of hand-curated code samples optimised for LLM consumption. Expose the filesystem to the agent; let the agent search it at generation time.
Canonical Vercel framing¶
"In addition to text injection, we worked with the AI SDK team to provide examples in the v0 agent's read-only filesystem. These are hand-curated directories with code samples designed for LLM consumption. When v0 decides to use the SDK, it can search these directories for relevant patterns such as image generation, routing, or integrating web search tools."
(Source: sources/2026-01-08-vercel-how-we-made-v0-an-effective-coding-agent)
Why "designed for LLM consumption" is a distinct artifact¶
Human documentation and LLM documentation have different optimality criteria:
- Humans prefer narrative, prose explanations, inline commentary, and learning arcs; examples can be truncated because humans fill in gaps.
- LLMs prefer complete, runnable, self-contained code snippets with minimal narrative, canonical imports, and consistent naming; the model doesn't "fill gaps" — it pattern-matches.
Maintaining two copies (one for each audience) is explicit in Vercel's description: "code samples designed for LLM consumption" is contrasted with the public-facing human docs.
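One way to make the second set of criteria operational — a hypothetical curation-time check, not anything Vercel describes — is to lint each sample for self-containedness and narrative ratio:

```python
# Hypothetical curation-time lint for "designed for LLM consumption":
# samples (assumed TypeScript, hence // comments) must carry their own
# imports and keep narrative commentary below the amount of actual code.
def is_llm_ready(sample: str) -> bool:
    lines = [l for l in sample.splitlines() if l.strip()]
    comment_lines = [l for l in lines if l.strip().startswith("//")]
    code_lines = [l for l in lines if not l.strip().startswith("//")]
    has_imports = any(l.lstrip().startswith("import ") for l in code_lines)
    mostly_code = len(comment_lines) <= len(code_lines)  # minimal narrative
    return has_imports and mostly_code
```

A human-docs excerpt with a learning-arc preamble and elided imports fails both checks; a complete runnable snippet passes.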
The co-maintenance property¶
The pattern depends on the library vendor participating:
- The vendor knows which patterns are canonical, which are deprecated, and which are emerging.
- The vendor has incentive (agent adoption → library adoption) to keep the examples current.
- The agent team has incentive (reliability → success rate) to keep the examples discoverable.
Vercel's specific formulation: "we worked with the AI SDK team" — the systems/ai-sdk team is the collaborator, not an external contributor to a scraped documentation corpus.
Why read-only¶
- Reproducibility. The agent sees the same samples on every invocation (modulo version bumps). This makes generations more deterministic.
- Safety. Agent can't corrupt the knowledge base mid-session or across sessions.
- Auditing. The samples are a fixed artifact, reviewed and signed off; no sneaky runtime modifications.
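One common way to get the read-only property at the infrastructure level — a config fragment with illustrative paths and image name, not Vercel's setup — is a read-only bind mount into the agent's sandbox:

```shell
# Curated samples live outside the sandbox and are bind-mounted read-only:
# the agent can list and read /examples, but any write fails at the fs layer.
docker run --rm \
  --mount type=bind,source=/srv/curated-examples,target=/examples,readonly \
  agent-sandbox:latest
```

Enforcing immutability at the mount rather than in agent instructions means no prompt injection or tool misuse can alter the knowledge base.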
Categorisation inside the fs¶
Vercel's examples are organised by pattern category: "relevant patterns such as image generation, routing, or integrating web search tools." This is a directory-per-pattern layout — the agent searches directories, not a flat namespace. The categorisation matches the intent-class partitioning used by patterns/dynamic-knowledge-injection-prompt, so the agent is pointed at the right category when the intent is detected.
Complementary to, not a replacement for, direct injection¶
The read-only-fs pattern is complementary to patterns/dynamic-knowledge-injection-prompt, not a replacement:
- Direct injection carries high-value, compact text knowledge (API surface, version pins, common pitfalls) into the prompt — cheap in tokens, prompt-cache-friendly.
- Curated example fs carries concrete, full-program code samples that would be too large to inject wholesale; the agent searches them on demand when the generation task requires a concrete pattern.
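The division of labour between the two patterns can be sketched as follows. The injected notes and tool schema are hypothetical (though `streamText` and `generateObject` are real AI SDK function names): compact facts ride in the prompt on every request, while bulky full-program samples stay on disk behind a tool call.

```python
# Compact, stable knowledge is injected up front: cheap in tokens and
# identical across requests, so it stays prompt-cache-friendly.
INJECTED_NOTES = (
    "AI SDK notes: use streamText for streaming chat; "
    "generateObject requires a schema."
)

def build_system_prompt(base: str) -> str:
    return f"{base}\n\n{INJECTED_NOTES}"

# Bulky full-program samples are NOT injected; the model sees only a tool
# it can invoke on demand (schema names here are illustrative).
SEARCH_TOOL = {
    "name": "search_examples",
    "description": "Search the read-only curated examples filesystem.",
    "parameters": {"query": {"type": "string"}},
}
```

The cost asymmetry drives the split: injected text is paid for on every request, filesystem search only when the task needs a concrete pattern.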
Trade-offs¶
- Curation cost is real. Examples must be written, reviewed, kept current across releases. The agent team alone can't do this; vendor collaboration is load-bearing.
- Coverage gap. Libraries without a friendly vendor partnership don't fit this pattern; the agent falls back to direct injection + parametric knowledge.
- Search-quality dependency. The agent must be able to find the relevant example; naming / indexing / keyword quality in the fs matters.
- Generalisation limit. This works for the "happy path" patterns the vendor curates; the long tail of combinations / edge cases still exercises the model's parametric knowledge.
Boundary: when NOT to curate¶
- Generic / open-domain knowledge. You can't curate "programming in general"; use this for bounded per-library surfaces.
- Libraries with no dedicated agent consumer. Maintenance cost isn't justified without a clear consumer.
- Fast-moving research / pre-release libraries. The curation lag dominates.
Seen in¶
- sources/2026-01-08-vercel-how-we-made-v0-an-effective-coding-agent — canonical pattern; v0 × AI SDK co-maintained read-only directory of LLM-consumption-optimised samples; "image generation, routing, web search" as example categories.
- sources/2026-04-21-vercel-build-knowledge-agents-without-embeddings — enterprise-corpus-altitude generalisation. Vercel's Knowledge Agent Template extends the same architectural class — an agent searching a curated filesystem as its knowledge surface — from the library-API-examples altitude (v0's canonical instance, with tight co-maintenance by a specific vendor) to the arbitrary-enterprise-corpus altitude (docs, code, transcripts, APIs via snapshot-repo sync). The commonality: the agent's knowledge view is a read-only filesystem. The difference: the vendor co-maintenance property is replaced by a Workflow-driven sync from an admin-configured source list. Both sit inside the same wiki pattern class but answer different knowledge-scope questions.
Related¶
- systems/vercel-v0 — canonical consumer at library-API altitude.
- systems/ai-sdk — canonical collaborator (the library whose team co-maintains the examples).
- systems/vercel-knowledge-agent-template — enterprise-corpus-altitude sibling instance.
- patterns/dynamic-knowledge-injection-prompt — the complementary prompt-text pattern.
- patterns/bash-in-sandbox-as-retrieval-tool — the enterprise-altitude sibling pattern that consumes the curated filesystem via bash tools.
- concepts/training-cutoff-dynamism-gap — the underlying failure mode both patterns address.
- concepts/web-search-telephone-game — the alternative-via-web-search failure mode this pattern avoids by curating directly.
- concepts/filesystem-as-retrieval-substrate — parent concept.
- concepts/snapshot-repository-as-agent-corpus — the enterprise-altitude sibling's producer substrate.
- concepts/context-engineering — broader discipline of shaping LLM context.