Open source · MIT · Node ≥20 · Local-first

Your AI coding history, indexed across every tool

xtctx ingests conversations from Claude Code, Cursor, Copilot, Codex, and Gemini, indexes them locally with hybrid search (BM25 + embeddings), and exposes recall and writeback over MCP. Switch tools mid-project without re-briefing the model.

npm install -g xtctx && xtctx init && xtctx serve

What it actually does

Five tools, one local index, one continuity policy. Full README →

Cross-tool recall via MCP

xtctx serve runs an MCP server over stdio. Your assistant calls xtctx_search and xtctx_project_knowledge at session start to recall what you already decided, debugged, and shipped — regardless of which tool you used last week.

// session opener
xtctx_search("auth error after last deploy")
xtctx_project_knowledge({ type: "all" })

// after coding
xtctx_save_decision({ title, rationale, alternatives_considered })
xtctx_save_error_solution({ error, solution, context })

Hybrid search (BM25 + vector)

Conversations and structured knowledge land in LanceDB. Queries fuse full-text and embedding similarity via Reciprocal Rank Fusion, with hybrid, semantic, and keyword modes selectable per call. The same pipeline backs the MCP recall tools and the runtime web UI search bar.

$ curl -s 'localhost:3232/api/search?q=lancedb+routing&mode=hybrid' | jq '.[0]'
{
  "score": 0.91,
  "source": "claude-code",
  "type": "decision",
  "title": "Route LanceDB writes through transaction guard",
  "snippet": "...prevents partial-write corruption on..."
}
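As a rough illustration of how Reciprocal Rank Fusion merges the BM25 and embedding rankings, here is a minimal sketch. The function name, the k constant, and the document ids are illustrative assumptions, not xtctx's actual internals:

```typescript
// Sketch of Reciprocal Rank Fusion (RRF): each ranked list contributes
// 1 / (k + rank) per document; scores are summed across lists.
type Ranked = string[]; // document ids in descending relevance order

function rrfFuse(lists: Ranked[], k = 60): Map<string, number> {
  const scores = new Map<string, number>();
  for (const list of lists) {
    list.forEach((id, rank) => {
      // k dampens the advantage of being at the very top of one list.
      scores.set(id, (scores.get(id) ?? 0) + 1 / (k + rank + 1));
    });
  }
  return scores;
}

// Fuse a keyword (BM25) ranking with a vector-similarity ranking.
const fused = rrfFuse([
  ["doc-a", "doc-b", "doc-c"], // BM25 order
  ["doc-b", "doc-d", "doc-a"], // embedding order
]);
// A doc ranked well in both lists beats one that tops only a single list.
const top = [...fused.entries()].sort((a, b) => b[1] - a[1])[0][0];
```

The appeal of RRF here is that it needs only ranks, not comparable scores, so BM25 and cosine similarity can be fused without score normalization.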

Continuity policy across five tools

One shared.yaml declares the context feed, skills, commands, agents, MCP servers, slash commands, and whitelist policy you want present in every tool. xtctx sync renders that into Claude Code, Cursor, Copilot, Codex, and Gemini in their native formats. xtctx serve auto-reconciles drift on a timer.

# .xtctx/tool-config/shared.yaml
scope: project
context_feed:
  session_opener: [xtctx_search, xtctx_project_knowledge]
  writeback_tools: [xtctx_save_decision, xtctx_save_error_solution]
mcp_servers: [xtctx]
whitelist_policy:
  advisory_level: warn

Local-first, no SaaS

Everything lives under .xtctx/ in your repo and ~/.xtctx/ for the global baseline. Embeddings run in-process via @xenova/transformers; the index is LanceDB on disk. No telemetry, no cloud calls, no account. Your conversation history never leaves your machine.

$ ls .xtctx/
config.yaml  knowledge/  lancedb/  tool-config/
$ xtctx serve
MCP  stdio
API  http://127.0.0.1:3232/api
UI   http://127.0.0.1:3232/
No outbound network calls.

Get started

Install, init, then serve. Quick start →

npm (global)
npm install -g xtctx
Requires Node ≥20. Installs the xtctx CLI on your PATH.
npx (no install)
npx xtctx init && npx xtctx serve
Try it without committing to a global install. Same binary, fetched on demand.
Bootstrap a project
xtctx init && xtctx sync && xtctx serve
Scaffolds .xtctx/, renders continuity blocks into your AI tools, then starts the MCP + API + runtime UI.
Full re-index
xtctx ingest --full
Rebuilds the LanceDB index from every conversation file the scrapers can find. Use after upgrading or after a long offline stretch.
From source
git clone https://github.com/fstubner/xtctx && cd xtctx && npm ci && npm run build
Useful for development. Then node dist/src/cli/index.js serve.

Or read the full README for hooks, policy merging, and MCP client config.

Try it

$ xtctx init
$ xtctx sync
$ xtctx serve
$ xtctx ingest --full
$ xtctx --help

FAQ

The questions people actually ask before installing.

Does it call out to any cloud service?
No. xtctx is local-first. Embeddings are computed in-process via @xenova/transformers, the search index is LanceDB on disk, and conversation history is read from the AI tools' own local storage. There is no telemetry, no account, and no outbound network calls during normal operation. The only network access is the GitHub stars/downloads counter on this landing page itself.
Which tools are supported?
Five: Claude Code, Cursor, GitHub Copilot, Codex, and Gemini CLI. Each has a scraper that reads the tool's native conversation storage, plus a sync target that renders the shared continuity policy into the tool's native config format (CLAUDE.md, .cursor/rules, MCP server config, etc.).
How does it handle drift in tool storage formats?
Two layers. A mutation suite under tests/drift/ snapshots the parser output against fixtures from each tool, so a format change shows up as a failing test on the next CI run. A nightly canary workflow runs the live CLIs (Claude Code, Codex, Gemini) end-to-end against current versions to catch breakage that fixtures alone would miss. Cursor and Copilot are GUI tools, so the canary covers three of the five; the mutation suite covers all five.
What about Cursor and Copilot — those are GUI tools, not CLIs?
Right, which is why the live CLI canary only covers three of the five tools. For the GUI tools we rely on the mutation suite — fixtures captured from real Cursor and Copilot installs, replayed against the parser on every CI run. If Cursor changes its conversation storage format, fixtures stop matching and the test fails before it ships.
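The mutation-suite idea can be sketched self-contained. Everything below is illustrative: parseSession, the Turn shape, and the fixture format are assumptions standing in for xtctx's real parser and the tools' real storage formats:

```typescript
// A parsed conversation turn, as the indexer might represent it.
interface Turn {
  role: "user" | "assistant";
  text: string;
}

// Hypothetical parser: extracts turns from a tool's raw conversation record.
function parseSession(raw: { messages: { author: string; body: string }[] }): Turn[] {
  return raw.messages.map((m) => ({
    role: m.author === "human" ? "user" : "assistant",
    text: m.body,
  }));
}

// Fixture captured from a real install, plus the parser output snapshotted
// at capture time.
const fixture = { messages: [{ author: "human", body: "fix the auth bug" }] };
const snapshot: Turn[] = [{ role: "user", text: "fix the auth bug" }];

// The drift check: re-parse the fixture and compare against the snapshot.
// If the tool's storage format (and thus the fixture) changes, this fails in CI.
const drifted = JSON.stringify(parseSession(fixture)) !== JSON.stringify(snapshot);
if (drifted) throw new Error("conversation format drifted from snapshot");
```

The key property is that the check needs no live GUI: a captured fixture plus a frozen snapshot is enough to detect a format change on every CI run.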
How is xtctx different from just having a long context window?
A long context window is per-session and per-tool. xtctx is persistent and cross-tool. When you start a Cursor session tomorrow morning, the recall tools surface decisions you made in Claude Code last week, error solutions you saved from Codex two months ago, and the project conventions you wrote up once and never want to re-derive. The context window holds the current conversation; xtctx holds project memory.
What does the MCP integration look like in practice?
You add xtctx to your assistant's MCP server config (mcpServers.xtctx with command: "xtctx" and args: ["serve"]), and the assistant gets recall and writeback tools: xtctx_search, xtctx_project_knowledge, xtctx_recent_sessions for reading; xtctx_save_decision, xtctx_save_error_solution, xtctx_save_faq for writing. xtctx sync also generates SessionStart hooks for Claude Code so recall fires automatically at the top of every session.
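Concretely, the client config described above is a fragment along these lines (the surrounding file name and exact key layout vary by assistant; this shape is a common MCP client convention, not a verbatim excerpt from xtctx's docs):

```json
{
  "mcpServers": {
    "xtctx": {
      "command": "xtctx",
      "args": ["serve"]
    }
  }
}
```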
Is it open source?
Yes. xtctx is MIT-licensed. Source, issue tracker, and releases are at github.com/fstubner/xtctx. Releases are published to npm via OIDC trusted publishing on every GitHub Release.