# Passoff

Pass the baton between AI coding agents — without losing the thread.

CI · License: MIT · Node 20+ · Status: alpha

Memory tools store everything. Passoff marks the moments that matter. One command to hand off context between Claude Code, Cursor, Codex, Windsurf, and any MCP client — with full provenance of which AI sent what to whom.

<!-- SCREENSHOT: drop a hero gif or screenshot at docs/screenshots/hero.png and uncomment. passoff in action -->

## Why

- **Local-first.** SQLite at `~/.passoff/db.sqlite`. No cloud, no accounts, no API keys, no telemetry.
- **Provenance built in.** Every handoff records `from_client`, `from_model`, `to_client`, and timestamps. The MCP protocol stamps the client name — the AI can't lie about which tool wrote it.
- **Lineage built in.** Handoffs reference parents, forming a traceable chain across multiple AIs and sessions.
- **One job.** Handoffs. Not memory, not orchestration, not a dashboard.

## 30-second demo

In Claude Code, finishing a feature:

You: `/passoff` — switching to Cursor for the UI work

Claude writes a structured markdown handoff and returns `{ id: "ph_aB3x9K" }`.

Open Cursor, start a new chat:

You: `/passoff-load`

Cursor calls `passoff_load({ latest: true })`, gets the markdown, and resumes mid-thought.

Trace the chain later:

```text
$ passoff thread ph_GuqtDKDO
o ph_Mo12sReh  Refactor auth middleware
|  claude-code -> cursor  [loaded]
|
* ph_GuqtDKDO  Wired auth into login
|  cursor -> codex  [loaded]
|
o ph_T0s2ZQvD  Fixed login tests
   codex  [open]
```
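A handoff's lineage is just a chain of parent references. A minimal sketch of the walk that `passoff thread` performs — the row shape here is an assumption, not the server's actual schema:

```typescript
// Hypothetical row shape: each handoff optionally points at its parent.
interface Handoff {
  id: string;
  title: string;
  parent: string | null;
}

// Walk the ancestry by following parent ids, returning oldest-first
// (the order the CLI prints the chain in).
function thread(byId: Map<string, Handoff>, id: string): Handoff[] {
  const chain: Handoff[] = [];
  for (
    let cur = byId.get(id);
    cur;
    cur = cur.parent ? byId.get(cur.parent) : undefined
  ) {
    chain.unshift(cur);
  }
  return chain;
}
```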

<!-- SCREENSHOT: docs/screenshots/thread.png — terminal showing passoff thread <id> output. SCREENSHOT: docs/screenshots/load.png — AI client reading a loaded handoff. -->

## Install

```sh
npm i -g passoff
```

Requires Node 20+.

Or run from source:

```sh
git clone https://github.com/TheMrGU/passoff && cd passoff
npm install && npm run build
node dist/index.js --help
```

## One-shot setup

If the repo is checked out locally, this wires Passoff into every supported client in one step:

```sh
npm run setup
```

It runs `npm install` + `npm run build`, registers the MCP server with Claude Code (via `claude mcp add` if installed), writes or merges `~/.cursor/mcp.json`, `~/.codeium/windsurf/mcp_config.json`, and `~/.codex/config.toml`, and installs the slash-command templates. Restart each AI client afterward.

## Wire it into your MCP client manually

### Claude Code

```sh
claude mcp add passoff -- npx -y passoff serve
```

### Cursor — `~/.cursor/mcp.json` (or `.cursor/mcp.json` in a project)

```json
{
  "mcpServers": {
    "passoff": { "command": "npx", "args": ["-y", "passoff", "serve"] }
  }
}
```

### Codex — `~/.codex/config.toml`

```toml
[mcp_servers.passoff]
command = "npx"
args = ["-y", "passoff", "serve"]
```

Install Codex-native slash commands:

```sh
passoff install --client codex
```

This registers `/passoff:create`, `/passoff:load`, `/passoff:list`, and `/passoff:search` as a local Codex plugin.

### Windsurf, others

Same stdio command pair — see `docs/clients.md`.
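For example, Windsurf reads `~/.codeium/windsurf/mcp_config.json` (the path the setup script writes). Assuming it uses the same `mcpServers` schema as Cursor, the entry would look like:

```json
{
  "mcpServers": {
    "passoff": { "command": "npx", "args": ["-y", "passoff", "serve"] }
  }
}
```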

## Slash commands

After `passoff install`, every supported client gets the same four commands (Codex uses `/passoff:create` etc.; Claude Code and Cursor use `/passoff`, `/passoff-load`, `/passoff-list`, `/passoff-search`).

| Command | What it does |
| --- | --- |
| `/passoff` (create) | Outgoing AI writes a structured handoff and returns its id |
| `/passoff-load` | Incoming AI loads the most recent active handoff (or one by id) |
| `/passoff-list` | Browse recent handoffs in the current project |
| `/passoff-search` | Full-text search across handoffs |

## MCP tools

| Tool | Purpose |
| --- | --- |
| `passoff_create` | Outgoing AI writes a structured handoff |
| `passoff_load` | Incoming AI loads by id or `latest: true` |
| `passoff_list` | Browse handoffs in a project (metadata only) |
| `passoff_search` | Full-text search across handoffs (FTS5) |
| `passoff_thread` | Show the lineage chain of a handoff |
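Under the hood, each tool invocation is a standard MCP `tools/call` JSON-RPC request over stdio. A sketch of the payloads a client would send — the tool names come from the table above, but the argument fields are illustrative assumptions, not the server's documented schema:

```typescript
// Hypothetical tools/call payload for creating a handoff.
// Field names (title, body, parent) are assumptions for illustration.
const createCall = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "passoff_create",
    arguments: {
      title: "Refactor auth middleware", // short summary line
      body: "## Context\n...",           // the structured markdown handoff
      parent: null,                      // or a prior id to extend a chain
    },
  },
};

// Loading the most recent active handoff, as in the demo.
const loadCall = {
  jsonrpc: "2.0",
  id: 2,
  method: "tools/call",
  params: { name: "passoff_load", arguments: { latest: true } },
};
```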

## Operator CLI

```sh
passoff serve                   # run MCP server (stdio) — invoked by clients
passoff list [--project X]      # recent handoffs table
passoff show <id>               # full markdown + metadata
passoff thread <id>             # lineage tree
passoff archive <id>            # mark archived
passoff delete <id>             # hard delete (confirms)
passoff clear [--project X]     # archive all open in a project (confirms)
passoff doctor                  # DB path, version, row counts, MCP config hints
passoff install                 # install Claude/Cursor/Codex command templates
passoff uninstall               # remove installed command templates
```

## Storage & scoping

- **DB location:** `~/.passoff/db.sqlite` (override with `PASSOFF_DB_PATH`).
- **Project scoping:** derived from `PASSOFF_PROJECT_ROOT`, else the nearest `.git` directory, else the current working directory. The slug is `<dir>-<6char hash>`, so two `api/` directories never collide.
- **No telemetry.** No network calls. Open the DB in any SQLite browser.
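The `<dir>-<6char hash>` scheme can be sketched as follows — the hash algorithm here is an assumption (the real implementation may hash differently), but it shows why two `api/` directories in different locations never share a slug:

```typescript
import { createHash } from "node:crypto";
import { basename } from "node:path";

// Sketch of the <dir>-<6char hash> slug described above: the directory
// name keeps slugs readable, the hash of the full path keeps them unique.
function projectSlug(projectRoot: string): string {
  const hash = createHash("sha256")
    .update(projectRoot)
    .digest("hex")
    .slice(0, 6);
  return `${basename(projectRoot)}-${hash}`;
}

// Same directory name, different roots: distinct slugs, no collision.
projectSlug("/home/a/api");
projectSlug("/home/b/api");
```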

## Passoff vs shared-memory tools

Both have a place. They solve different problems.

| | Passoff | Shared-memory tool |
| --- | --- | --- |
| Primitive | Discrete, deliberate handoff event | Always-on memory pool |
| Trigger | You type `/passoff` | Implicit, every prompt |
| What the next AI sees | Exactly what the previous AI chose to pass | Whatever retrieval surfaced |
| Provenance | Built into the row (client, model, parent) | Bolt-on metadata |
| Cleanup | Archive or delete a handoff atom | Chase down embeddings |
| Best for | Switching tools mid-thought; tracing decisions | Recalling "what did we do six weeks ago" |

Memory tools answer "what might be relevant?". Passoff answers "here is exactly what to load next."

Full positioning: docs/design.md.

## FAQ

**Where does my data live?** A single SQLite file at `~/.passoff/db.sqlite`. Open it with any SQLite browser. There is no cloud component.

**Does Passoff send anything over the network?** No. The MCP server is stdio-only. The CLI does not phone home. The setup script writes local config files only.

**Can two projects share the same DB?** Yes — every handoff is scoped to a project slug. The DB is global; the views are per-project by default.

**Can two AIs collide on the same handoff?** No. Loading flips the status open → loaded and stamps `to_client`, but the row stays re-loadable across sessions until you archive it.
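The lifecycle behind that answer can be sketched as a tiny transition table. The statuses come from this README; the exact transition rules are an assumption:

```typescript
type Status = "open" | "loaded" | "archived";

// Transitions implied above: loading flips open -> loaded, a loaded row
// can be re-loaded across sessions, and archiving ends the lifecycle.
const transitions: Record<Status, Status[]> = {
  open: ["loaded", "archived"],
  loaded: ["loaded", "archived"], // re-loadable until archived
  archived: [],                   // terminal
};

function canTransition(from: Status, to: Status): boolean {
  return transitions[from].includes(to);
}
```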

**Can I use it with [some other MCP client]?** If the client speaks MCP over stdio, yes — wire it the same way. See `docs/clients.md`.

**Why not just paste the conversation?** Three reasons: (1) the receiving AI gets exactly what you chose to pass, not the noise; (2) the row is auditable — `from_client`, `from_model`, timestamp; (3) lineage threads across multi-step workflows.

## Status

Alpha. APIs and the storage shape may shift before 1.0. The DB migrates itself additively, but expect occasional churn. Issues and PRs are welcome — see below.

## Contributing

See CONTRIBUTING.md. TL;DR: `npm install && npm test`, keep PRs focused, and don't bundle a memory layer or an orchestrator into Passoff.

## License

MIT — see LICENSE.
