████████╗██╗    ██╗██╗███╗   ██╗
╚══██╔══╝██║    ██║██║████╗  ██║
   ██║   ██║ █╗ ██║██║██╔██╗ ██║
   ██║   ██║███╗██║██║██║╚██╗██║
   ██║   ╚███╔███╔╝██║██║ ╚████║
   ╚═╝    ╚══╝╚══╝ ╚═╝╚═╝  ╚═══╝
# a single source of truth every agent can read and write.
# hosted api · many surfaces · local-first llms · graph-native. write once, surface everywhere.
// 01 the graph
every document is a node. every extracted fact is an edge. every person, project, and class has exactly one canonical entity, so the same name across every source resolves to the same graph node; an entity resolver does that work automatically.
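a minimal sketch of that graph model, assuming in-memory dicts and an alias index; the ids, field names, and matching logic here are illustrative, not the real resolver.

```python
# toy graph model: documents are nodes, facts are edges, and every
# mention resolves through an alias index to one canonical entity.
from dataclasses import dataclass, field

@dataclass
class Entity:
    id: str                        # canonical id, e.g. "person:maya-patel" (assumed format)
    name: str
    aliases: set = field(default_factory=set)

class Graph:
    def __init__(self):
        self.entities = {}         # canonical id -> Entity
        self.alias_index = {}      # lowercased alias -> canonical id
        self.edges = []            # (subject_id, predicate, object_id) fact triples

    def add_entity(self, entity):
        self.entities[entity.id] = entity
        for alias in {entity.name, *entity.aliases}:
            self.alias_index[alias.lower()] = entity.id

    def resolve(self, mention):
        """every mention of the same person lands on one canonical node."""
        return self.alias_index.get(mention.lower())

g = Graph()
g.add_entity(Entity("person:maya-patel", "Maya Patel", {"Maya"}))
assert g.resolve("maya") == g.resolve("Maya Patel")  # one canonical node
```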
// 02 philosophy
most memory systems die because the maintenance cost is higher than the capture cost. twin flips that: you drop anything in from any surface; auto-extractors and an entity resolver do the organizing. every mention of a person, project, or class collapses into one canonical node. retrieval lands in any agent client that speaks mcp, no lock-in to any single host.
1. ingest → raw text, notes, future connectors, zero friction
2. organize → resolve entities, extract facts, audit every decision
3. retrieve → hybrid vector + graph walk via mcp / web chat / cli
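step 3 can be sketched as: vector similarity picks the seed hits, then a 1-hop graph walk pulls in connected facts. the 3-d embeddings and node ids below are toy stand-ins (the real system uses openai 3-large vectors in pgvector).

```python
# hybrid retrieval sketch: vector hits first, then a graph walk.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

docs = {  # node id -> toy embedding (illustrative values)
    "doc:standup":   [0.9, 0.1, 0.0],
    "doc:groceries": [0.0, 0.2, 0.9],
}
edges = [("doc:standup", "mentions", "person:sarah")]  # fact edges

def recall(query_vec, limit=1):
    # 1. vector hits: rank docs by cosine similarity
    hits = sorted(docs, key=lambda d: -cosine(docs[d], query_vec))[:limit]
    # 2. graph walk: 1-hop neighbors of every hit via fact edges
    neighbors = [o for s, _, o in edges if s in hits]
    return hits + neighbors

assert recall([1.0, 0.0, 0.0]) == ["doc:standup", "person:sarah"]
```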
// 03 architecture
claude code (mcp) [live]
web chat (twin.arshsingh.net) [live]
cli [live]
// queued: ios app, browser ext, telegram
POST /ingest/raw       raw content in
POST /ingest/note      structured note
GET  /recall           hybrid vector search
GET  /walk             n-hop entity neighbors
GET  /who_is           canonical entity + facts
GET  /graph/snapshot   full graph export
POST /mcp              model context protocol
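a hedged example of calling the hosted api with only the stdlib. the base url, auth scheme, and request body shape are assumptions; only the endpoint paths come from the list above.

```python
# build (but don't automatically send) a POST /ingest/raw request.
import json
import urllib.request

BASE = "https://twin.arshsingh.net/api"  # assumed base url

def ingest_raw_request(content: str) -> urllib.request.Request:
    """package raw content for POST /ingest/raw; body shape is assumed."""
    return urllib.request.Request(
        f"{BASE}/ingest/raw",
        data=json.dumps({"content": content}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = ingest_raw_request("met Maya for coffee, she's joining Helix in March")
assert req.get_method() == "POST"
assert req.full_url.endswith("/ingest/raw")
# urllib.request.urlopen(req)  # actually send it
```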
extract-facts    claude sonnet → triples
resolve-entity   id + alias + embed + prefix
embed            openai 3-large → vector
audit-log        every auto-decision logged
postgres + pgvector   docs, facts, history
audit log             every action, undoable
r2 (attachments)      [planned]
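the "every action, undoable" guarantee can be sketched as an audit log that stores an inverse action alongside each auto-decision; the ids and entry shape here are illustrative, not the real schema.

```python
# undoable audit log sketch: record every auto-decision with an undo fn.
import itertools

class AuditLog:
    def __init__(self):
        self._seq = itertools.count(1)
        self.entries = {}  # audit_id -> (description, undo_fn)

    def record(self, description, undo_fn):
        audit_id = next(self._seq)
        self.entries[audit_id] = (description, undo_fn)
        return audit_id

    def undo(self, audit_id):
        description, undo_fn = self.entries.pop(audit_id)
        undo_fn()          # run the stored inverse action
        return description

aliases = {"maya": "person:maya-patel"}  # an auto-merge we might regret
log = AuditLog()
aid = log.record(
    'resolve-entity: merged "Maya" -> [[Maya Patel]]',
    undo_fn=lambda: aliases.pop("maya"),
)
log.undo(aid)                  # like the planned entity.undo <audit_id>
assert "maya" not in aliases   # merge reverted
```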
claude sonnet    extract + chat
openai 3-large   embeddings
zero-retention   flag on every call
pii redaction    [queued for p2]
<queued> gmail, gcal
<queued> imessage, notion, browser ext
<queued> discord, slack, github, rss
<queued> voice (whisper), screenshots
// 04 surface
# five mcp tools, live. any agent client that speaks mcp can call them.
| tool | synopsis | purpose |
|---|---|---|
| twin_recall | query [limit] | vector hits across the personal graph |
| twin_who_is | name_or_id | canonical entity + recent mentions |
| twin_remember | text [occurred_at] | manual ingest, auto-extracted to graph |
| twin_walk | node_id [depth] | n-hop neighbors via fact edges |
| twin_inject_context | query [limit] | markdown context blob for any llm |
| entity.merge | src dst | <planned> collapse duplicate canonical nodes |
| entity.undo | audit_id | <planned> revert any auto-action from audit log |
| connector.add | kind [filters] | <planned> authorize gmail, gcal, notion, etc. |
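twin_inject_context can be sketched as: take recall hits, flatten them into one markdown blob any llm accepts as context. the hit shape and heading format below are assumptions, not the tool's actual output.

```python
# assemble a markdown context blob from recall hits.
def inject_context(query, hits):
    lines = [f"## context for: {query}", ""]
    for hit in hits:  # each hit: {"entity": ..., "fact": ...} (assumed shape)
        lines.append(f"- **{hit['entity']}**: {hit['fact']}")
    return "\n".join(lines)

blob = inject_context("Sarah", [
    {"entity": "Sarah", "fact": "leads the Helix launch"},
    {"entity": "Sarah", "fact": "prefers async standups"},
])
assert blob.startswith("## context for: Sarah")
assert blob.count("- **Sarah**") == 2
```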
// 05 phases
// 06 join waitlist
# twin is in closed beta. friends first, prosumers next. drop your email.
// 07 field log
$ tail -n 20 /var/log/twin/ingest.log
[just now]   mcp/twin_recall "Sarah" → 4 hits
[2 min ago]  resolve-entity: merged "Maya" → [[Maya Patel]]
[6 min ago]  extract-facts: 11 triples → graph (claude sonnet)
[12 min ago] mcp/twin_remember → raw_item created, extraction queued
[15 min ago] graph-walk 2 hops → 12 nodes, 22 edges
[22 min ago] audit-log: 3 auto-decisions logged, undoable
[1 hr ago]   mcp/twin_inject_context → markdown blob assembled
[4 hr ago]   mcp/twin_who_is "Helix" → canonical entity resolved
[today]      graph density 3.4 edges/node [OK]