Karpathy Wiki
@AndrejKarpathy
Active Memory OS: A layered symbiosis integrating onde with the Karpathy Wiki pattern. Human-readable markdown meets agent-precise typed graphs and TOON density.

❯ onde init --template karpathy-wiki
Global Installs: 21.4k · Nodes / Edges: 4 / 3

Architecture Manifesto

Integrating onde with the Karpathy LLM Wiki pattern isn't a replacement; it's a layered symbiosis. onde becomes the deterministic, typed, journal-backed kernel, while the Karpathy skills become the human-facing, semantic, plugin-driven UI layer.

Here’s exactly how onde would accommodate, extend, and operationalize the Karpathy wiki skill pattern in production.


🧱 1. Architectural Mapping: Karpathy → onde

| Karpathy Component | onde Equivalent / Adapter | Role in Hybrid System |
| --- | --- | --- |
| SCHEMA.md | schema.md + CLAUDE.md | schema.md enforces typed graph rules; CLAUDE.md holds human-facing conventions |
| wiki/pages/*.md | Materialized views from CRDT journal | Human-readable cache; auto-rebuilt via onde materialize |
| wiki/index.md | TOON-dense catalog + graph index | Agent reads first; routes queries to onde or LLM context |
| wiki/log.md | onde journal --tail + Git commit history | Append-only event log; versioned via HLC |
| skills/*.md | onde CLI wrappers + skill adapters | Translate natural-language triggers into deterministic onde pipes |
| raw/ sources | `onde ingest <file\|url>` → journal append | Immutable originals; ingestion writes typed events to the journal |

Core Principle: onde owns the Single Source of Truth (journal). Karpathy owns the human-facing projection (markdown). Skills become adapters between the two.
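This split can be sketched in a few lines. The event shape below is hypothetical (onde's real journal format isn't shown on this page); the point is the direction of data flow: writes append to the journal, and markdown pages are a pure projection of it.

```python
# Hypothetical sketch: journal = single source of truth, markdown = projection.
journal = []

def journal_append(event_type, payload):
    """Append a typed event to the append-only journal."""
    event = {"type": event_type, "payload": payload, "seq": len(journal)}
    journal.append(event)
    return event

def materialize(journal):
    """Project journal events into human-readable markdown pages (the Karpathy layer)."""
    pages = {}
    for ev in journal:
        if ev["type"] == "node.create":
            p = ev["payload"]
            pages[p["id"] + ".md"] = f"## {p['title']}\ntype:: {p['node_type']}\n"
    return pages

journal_append("node.create",
               {"id": "concept-moe", "title": "Mixture of Experts", "node_type": "concept"})
pages = materialize(journal)
```

Because `materialize` is a pure function of the journal, the markdown layer can always be deleted and rebuilt.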


πŸ”Œ 2. Skill-by-Skill Accommodation

🟦 wiki-init → onde init -t wiki

onde init --template karpathy-wiki

What it does:

  1. Creates raw/, wiki/pages/, wiki/assets/
  2. Generates a default schema.md with Karpathy conventions:

     ## Node Types
     ### concept
       title: string @required
       type: enum(concept, entity, summary, comparison)
       sources: edge(source) @many
       related: edge(concept) @many
       confidence: enum(high, medium, low) @default(medium)

  3. Generates CLAUDE.md with Karpathy workflow prompts
  4. Initializes the CRDT journal (journal/) + Git repo
  5. Sets up skill adapters in .claude/skills/
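The scaffolding steps above amount to directory creation plus a default schema file. A minimal sketch, assuming the layout listed here (the real `onde init` may create more):

```python
import os
import tempfile

# Default schema fragment taken from the conventions shown above.
DEFAULT_SCHEMA = """## Node Types
### concept
  title: string @required
  confidence: enum(high, medium, low) @default(medium)
"""

def wiki_init(root):
    """Sketch of wiki-init: create the folder layout and a default schema.md."""
    for d in ("raw", "wiki/pages", "wiki/assets", "journal", ".claude/skills"):
        os.makedirs(os.path.join(root, d), exist_ok=True)
    with open(os.path.join(root, "schema.md"), "w") as f:
        f.write(DEFAULT_SCHEMA)

root = tempfile.mkdtemp()
wiki_init(root)
```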

🟦 wiki-ingest → onde ingest + materialize

Karpathy: Agent reads source → discusses → writes wiki/pages/ → updates index.md → appends log.md

onde Accommodation:

onde ingest raw/papers/transformer.pdf --extract --discuss
  1. Parse: onde extracts claims, entities, citations β†’ writes to journal as typed events
  2. Discuss: Pipe to LLM for human validation: onde journal --last | ollama run llama3 "Summarize & flag contradictions"
  3. Materialize: onde materialize --target wiki/pages/ generates flat markdown from journal
  4. Backlink: onde lint --backlinks auto-adds [[links]] to index.md and related pages
  5. Log: onde journal append --type ingest --source transformer.pdf

Key Gain: Ingest becomes idempotent & replayable. Delete wiki/? onde materialize rebuilds it perfectly.
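Replayability falls out of the journal design: materialization is a deterministic function of the event log, so rebuilding after deleting wiki/ yields byte-identical pages. A toy sketch (event fields are illustrative, not onde's real schema):

```python
# One hypothetical ingest event, as a plain dict.
journal = [
    {"type": "ingest", "source": "transformer.pdf",
     "claims": ["Attention is all you need"]},
]

def materialize(journal):
    """Deterministically project ingest events into markdown pages."""
    pages = {}
    for ev in journal:
        if ev["type"] == "ingest":
            name = ev["source"].rsplit(".", 1)[0] + ".md"
            pages[name] = "\n".join("- " + c for c in ev["claims"])
    return pages

first = materialize(journal)
# Simulate "delete wiki/, rebuild": replaying the same journal gives the same pages.
rebuilt = materialize(journal)
assert first == rebuilt
```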


🟦 wiki-query β†’ Hybrid Query Router

Karpathy relies on LLM context for all queries. onde adds deterministic routing:

/wiki-query "What routing strategies are used in MoE models?"

onde Router Logic:

  1. Check index.md (TOON format) for keywords
  2. If query contains logical operators (blocks, depends, status, type) β†’ route to onde find
  3. If query is semantic/synthetic β†’ route to LLM context
  4. Return unified answer with citations
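The router steps above can be sketched as a small dispatch function. The operator keyword set comes from step 2; everything else (tokenization, return labels) is an assumption for illustration:

```python
import re

# Keywords from step 2 of the router logic; presence of any of these
# marks the query as a deterministic graph query.
LOGICAL_OPERATORS = {"blocks", "depends", "status", "type"}

def route(query):
    """Hypothetical router: deterministic `onde find` vs. LLM synthesis."""
    tokens = set(re.findall(r"[a-z]+", query.lower()))
    if tokens & LOGICAL_OPERATORS:
        return "onde find"
    return "llm"

route("status of nodes that depends on attention")        # -> "onde find"
route("What routing strategies are used in MoE models?")  # -> "llm"
```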

Examples:

# Logical β†’ onde handles directly
onde find concept --related "mixture-of-experts" --format toon | grep routing

# Semantic β†’ onde exports context β†’ LLM reasons
onde export concept:mixture-of-experts | claude "Compare routing strategies"

Token Savings: onde filters 90% of irrelevant nodes before LLM sees anything. Context window used only for synthesis.

Vectors fail at hard reasoning. With this topology, relations become deterministic traversal rules rather than fuzzy distance metrics. Keep onde agent --sync running when modifying nodes manually so the CRDT journal stays perfectly aligned.
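What "deterministic traversal rules" means concretely: following only edges of a declared type yields an exact answer set, not a similarity ranking. A minimal sketch with made-up node ids matching the example entity above:

```python
# Illustrative typed adjacency map (node ids are examples, not a real onde graph).
edges = {
    "concept-moe": {"related": ["concept-attention"], "sources": ["source-moe-paper"]},
    "concept-attention": {"related": [], "sources": []},
}

def traverse(start, edge_type):
    """Follow only edges of one declared type; the result is exact, not a similarity guess."""
    seen, frontier = [], [start]
    while frontier:
        node = frontier.pop()
        for target in edges.get(node, {}).get(edge_type, []):
            if target not in seen:
                seen.append(target)
                frontier.append(target)
    return seen

traverse("concept-moe", "related")  # -> ["concept-attention"]
```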

Optimal Use Cases

  • ❯ Building an active memory OS
  • ❯ Merging human-editable Markdown with rigid AI topologies
  • ❯ Requiring TOON-dense indexes to save LLM context window
OS Folder Mapping
Concept: Folders = Types, Markdown = Nodes
[agent-os] ~ /karpathy-world
├── schema.md
├── concepts/
│   ├── concept-01.md
│   └── concept-02.md
│
├── sources/
│   ├── source-01.md
│   └── source-02.md
│
├── summaries/
│   ├── summary-01.md
│   └── summary-02.md
│
├── comparisons/
│   ├── comparison-01.md
│   └── comparison-02.md
Inference Engine Maps:
/concepts/ → @type(concept)
/sources/ → @type(source)
/summaries/ → @type(summary)
/comparisons/ → @type(comparison)
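The "Folders = Types" rule is just a lookup on the parent directory. A minimal sketch, assuming the folder names shown in the tree above:

```python
# Folder-name to node-type mapping, mirroring the inference table above.
FOLDER_TYPE = {
    "concepts": "concept",
    "sources": "source",
    "summaries": "summary",
    "comparisons": "comparison",
}

def infer_type(path):
    """Infer a node's type from its top-level folder; None if unmapped."""
    folder = path.split("/")[0]
    return FOLDER_TYPE.get(folder)

infer_type("concepts/concept-01.md")  # -> "concept"
```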
schema.md
## Node Types
### concept
id: string @id @default(uuid())
title: string @required
type: enum(concept, entity, summary, comparison)
sources: edge(source) @many
related: edge(concept) @many
confidence: enum(high, medium, low) @default(medium)
### source
id: string @id
url: string
Entity_Example.md
## Concept: Mixture of Experts
type:: concept
confidence:: high
The routing layer relies on sparse activation.
edges:
  sources: [[source-moe-paper]]
  related: [[concept-attention]]
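A node file in this format can be parsed mechanically: double-colon lines are typed properties, and lines with [[wikilinks]] are edges. A sketch parser under those two assumptions (the real onde parser is not shown here):

```python
import re

# The example node from above, verbatim.
doc = """## Concept: Mixture of Experts
type:: concept
confidence:: high
The routing layer relies on sparse activation.
edges:
sources: [[source-moe-paper]]
related: [[concept-attention]]
"""

def parse_node(text):
    """Split a node file into `key:: value` properties and `key: [[target]]` edges."""
    node = {"props": {}, "edges": {}}
    for line in text.splitlines():
        if "::" in line:
            key, val = line.split("::", 1)
            node["props"][key.strip()] = val.strip()
        elif "[[" in line and ":" in line:
            key, rest = line.split(":", 1)
            node["edges"][key.strip()] = re.findall(r"\[\[(.+?)\]\]", rest)
    return node

node = parse_node(doc)
node["props"]["type"]     # -> "concept"
node["edges"]["sources"]  # -> ["source-moe-paper"]
```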

Typed Graph Topology

N0: concept · N1: source · N2: summary · N3: comparison

Kernel Parameters

  • Compatibility: onde Core v1.4+
  • Language Bindings: Bash, TS, Python

Similar Topologies

Memory Palace
Chat Memory: Passive diary & structured memory spanning temporal events. Ideal for conversational agents that need to recall past interactions accurately.
Topology: wing ▶ room ▶ drawer
Installs: 14.2k
❯ onde init -t mempalace