# Todo MCP
A persistent todo list accessible from any AI assistant that supports MCP. One server, every client.
Works with Claude (chat, Code, Cowork), Codex, and any MCP-compatible tool. ChatGPT support is possible too, with one current auth caveat noted below.
## Quick start (local)

```sh
git clone https://github.com/YOUR_USER/todo-mcp.git
cd todo-mcp
cp .env.example .env
docker compose up -d
```

MCP endpoint: `http://localhost:8000/mcp`
## Deploy on a remote server

```sh
# 1. Copy the folder to your server
scp -r ~/Downloads/todo-mcp user@YOUR_SERVER:~/todo-mcp

# 2. SSH in and configure
ssh user@YOUR_SERVER
cd ~/todo-mcp
cp .env.example .env

# 3. Generate an auth token and add it to .env
python3 -c "import secrets; print(secrets.token_urlsafe(32))"
# → paste the output as AUTH_TOKEN=... in .env

# 4. Start
docker compose up -d
```

HTTP endpoint: `http://YOUR_SERVER:8000/mcp`
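Step 3 above can also be scripted end to end. A minimal sketch, not part of this repo, that generates the token and appends the `AUTH_TOKEN=` line the server expects (here written to a throwaway file for demonstration; point it at your real `.env` on the server):

```python
import secrets
import tempfile

def write_auth_token(env_path: str) -> str:
    """Generate a URL-safe token and append it as AUTH_TOKEN=... to an env file."""
    token = secrets.token_urlsafe(32)  # 32 random bytes -> 43 URL-safe characters
    with open(env_path, "a") as f:
        f.write(f"AUTH_TOKEN={token}\n")
    return token

# Demo against a throwaway file instead of a real .env
demo_env = tempfile.NamedTemporaryFile(suffix=".env", delete=False).name
token = write_auth_token(demo_env)
print(f"AUTH_TOKEN={token}")
```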
## Add HTTPS (recommended for remote servers)
Self-signed certs are blocked by many corporate firewalls (e.g. FortiGuard). Use Let's Encrypt for a free, trusted certificate.
1. Point a DNS A record at your server:

| Type | Name | Value |
|---|---|---|
| A | todo | YOUR_SERVER_IP |
2. Get a certificate (requires port 80 free temporarily):

```sh
certbot certonly --standalone -d todo.yourdomain.com
```

If port 80 is occupied: stop whatever is using it, run certbot, then restart it.
3. Edit `nginx.conf`: replace `yourdomain.com` with your actual domain in `server_name`, `ssl_certificate`, and `ssl_certificate_key`.
4. Start with HTTPS:

```sh
docker compose --profile https up -d
```

If port 443 is already taken, set `HTTPS_PORT=8443` in `.env`. Your endpoint becomes `https://todo.yourdomain.com:8443/mcp`.
5. Open the firewall if needed:

```sh
ufw allow 443/tcp   # or 8443/tcp if using a custom port
```
Note: The nginx container mounts all of `/etc/letsencrypt` (not just `live/`), because Let's Encrypt certs are symlinks into `archive/`. Mounting only `live/` breaks symlink resolution inside the container.

HTTPS endpoint: `https://todo.yourdomain.com/mcp`
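The symlink note above is easy to demonstrate locally. A small sketch that mimics Let's Encrypt's layout (real files in `archive/`, symlinks in `live/`; the domain and filenames are made up) and shows the link dangling once `archive/` disappears, which is what mounting only `live/` amounts to:

```python
import os
import tempfile

# Mimic /etc/letsencrypt: real certs live in archive/, live/ holds symlinks.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "archive/todo.example.com"))
os.makedirs(os.path.join(root, "live/todo.example.com"))

real = os.path.join(root, "archive/todo.example.com/fullchain1.pem")
link = os.path.join(root, "live/todo.example.com/fullchain.pem")
with open(real, "w") as f:
    f.write("FAKE CERT")
os.symlink("../../archive/todo.example.com/fullchain1.pem", link)

# With the whole tree present, the symlink resolves.
content = open(link).read()
print(content)  # FAKE CERT

# Remove archive/ content: the symlink still exists but now dangles,
# the same failure a container sees when only live/ is mounted.
os.remove(real)
print(os.path.exists(link))  # False
```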
## Connect your AI clients

Use the full MCP endpoint URL here: `https://todo.yourdomain.com/mcp`. For the companion slash commands / skill in `skills/`, use the base server URL without `/mcp`, for example `https://todo.yourdomain.com`.
### Claude (claude.ai)

Settings → Integrations → Add custom integration

- URL: `https://todo.yourdomain.com/mcp`
- If auth is enabled, add header: `Authorization: Bearer YOUR_TOKEN`
### Claude Code

Add to `~/.claude.json`:

```json
{
  "mcpServers": {
    "todo": {
      "type": "http",
      "url": "https://todo.yourdomain.com/mcp",
      "headers": {
        "Authorization": "Bearer YOUR_TOKEN"
      }
    }
  }
}
```
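Edits like this can also be scripted. A sketch of a hypothetical helper (not part of this repo) that merges one server entry into a `~/.claude.json`-style document without clobbering servers that are already configured:

```python
import json

def add_mcp_server(config_text: str, name: str, entry: dict) -> str:
    """Merge one entry under mcpServers, preserving anything already there."""
    config = json.loads(config_text) if config_text.strip() else {}
    config.setdefault("mcpServers", {})[name] = entry
    return json.dumps(config, indent=2)

# Example: add the todo server from the snippet above.
updated = add_mcp_server(
    "{}",
    "todo",
    {
        "url": "https://todo.yourdomain.com/mcp",
        "headers": {"Authorization": "Bearer YOUR_TOKEN"},
    },
)
print(updated)
```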
### Codex (CLI, app, IDE extension)

Codex uses `~/.codex/config.toml` (or project-local `.codex/config.toml`). The Codex CLI and app share this config.
No auth:

```toml
[mcp_servers.todo]
url = "https://todo.yourdomain.com/mcp"
```
Bearer token auth:

```toml
[mcp_servers.todo]
url = "https://todo.yourdomain.com/mcp"
bearer_token_env_var = "TODO_MCP_TOKEN"
```

Then export the token before starting Codex:

```sh
export TODO_MCP_TOKEN="YOUR_TOKEN"
```

You can also use static headers with `http_headers` or environment-backed headers with `env_http_headers`.
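What `bearer_token_env_var` amounts to can be sketched in a few lines. This is an illustration of the mechanism, not Codex's actual implementation:

```python
import os

def bearer_header_from_env(var_name: str) -> dict:
    """Build the Authorization header from an environment-backed token."""
    token = os.environ.get(var_name)
    if not token:
        raise RuntimeError(f"{var_name} is not set; export it before starting the client")
    return {"Authorization": f"Bearer {token}"}

os.environ["TODO_MCP_TOKEN"] = "YOUR_TOKEN"  # normally done in your shell
print(bearer_header_from_env("TODO_MCP_TOKEN"))  # {'Authorization': 'Bearer YOUR_TOKEN'}
```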
### ChatGPT

As of March 26, 2026, OpenAI's ChatGPT developer mode supports remote MCP servers over streaming HTTP, but the documented auth modes are OAuth, no authentication, and mixed auth. This server currently supports no auth or bearer token auth.

That means:

- `AUTH_TOKEN` empty: easiest path for ChatGPT testing
- `AUTH_TOKEN` set: works in Claude / Codex, but ChatGPT may not connect unless you add OAuth support

This is an inference from OpenAI's current docs: bearer-only remote MCP servers do not appear to be the smooth path in ChatGPT today.

If you want ChatGPT support right now, leave `AUTH_TOKEN` empty during testing or place the server behind another trusted access layer. If you want secure internet-facing ChatGPT access, the next step is adding OAuth.
### Any other MCP client

Use the streamable HTTP transport with the URL `https://todo.yourdomain.com/mcp`. Include the header `Authorization: Bearer YOUR_TOKEN` if auth is enabled.
## Slash commands and skill triggers

Install the companion skill from the `skills/` folder:

| Command | Also triggers on |
|---|---|
| `/setup-todo <url>` | "setup todo", "configure todo", "set todo url" |
| `/todo <text>` | "add todo", "remind me to", "note to self", "don't let me forget" |
| `/next-todo` | "what's next", "what should I work on", "most urgent", "what's overdue" |
| `/list-todo [filter]` | "show my todos", "what's on my plate", "todo stats", "show ASF todos" |
```sh
cd skills && chmod +x install.sh && ./install.sh
```

Then, in your AI client:

```
/setup-todo https://todo.yourdomain.com
```
For Codex, see `codex-config.toml.example` and `skills/AGENTS.md.example`.
## MCP tools

| Tool | What it does |
|---|---|
| `todo_list` | List todos with short `#1`, `#2`, `#3` references for follow-up actions |
| `todo_add` | Add a todo with optional tag, priority, and due date |
| `todo_update` | Edit text, toggle done, change priority / tag / due using a short reference or internal ID |
| `todo_complete` | Mark a todo done using a short reference or internal ID |
| `todo_delete` | Permanently delete a todo using a short reference or internal ID |
| `todo_stats` | Summary counts by status, tag, priority, and overdue |
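The `overdue` count in `todo_stats` presumably comes from comparing each open todo's due date with today. A sketch of that check; the field names (`done`, `due`) are assumptions about the storage format, not confirmed from this repo's code:

```python
from datetime import date

def overdue_count(todos: list[dict], today: date) -> int:
    """Count open todos whose due date is strictly before today."""
    count = 0
    for t in todos:
        if t.get("done") or not t.get("due"):
            continue  # done todos and undated todos are never overdue
        if date.fromisoformat(t["due"]) < today:
            count += 1
    return count

todos = [
    {"text": "Buy groceries", "due": "2026-03-27", "done": False},
    {"text": "Read MCP spec", "due": None, "done": False},
    {"text": "Old task", "due": "2026-01-01", "done": True},
]
print(overdue_count(todos, date(2026, 4, 1)))  # 1
```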
Example flow:

```
Found 3 todos — 3 open, 0 done, 0 overdue
Use the `#` number to refer to an item in follow-up commands.

#1 ○ Buy groceries
     priority:high due:2026-03-27
#2 ○ Review pages and letter of HubX
#3 ○ Read MCP spec (streamable HTTP section)
     priority:med tag:dev
```

This lets the assistant accept commands like `complete 1` or `delete 2` without surfacing the long internal storage IDs in the normal presentation.
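The short-reference trick is just a positional map built at list time. A toy sketch (the internal IDs here are made up for illustration):

```python
def build_refs(todos: list[dict]) -> dict[int, str]:
    """Map #1, #2, ... in list order to internal storage IDs."""
    return {i + 1: t["id"] for i, t in enumerate(todos)}

todos = [
    {"id": "a81f3c", "text": "Buy groceries"},
    {"id": "9bd027", "text": "Review pages and letter of HubX"},
]
refs = build_refs(todos)
print(refs[2])  # 9bd027 -- "complete 2" resolves to this ID
```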
## Authentication

Set `AUTH_TOKEN` in `.env` to require a bearer token on all requests. Leave it empty to disable auth (fine for local use, not recommended for remote).

Compatibility summary:

- Claude / Claude Code: works with bearer auth
- Codex: works with bearer auth
- ChatGPT: easiest with no auth today; OAuth would be needed for the cleanest authenticated setup
```sh
# Generate a token
python3 -c "import secrets; print(secrets.token_urlsafe(32))"
```

The /health endpoint is always public (used by Docker healthchecks).
## Architecture

- Protocol: MCP over streamable HTTP
- Runtime: Python 3.12 + FastMCP
- Storage: JSON file in a Docker named volume (`/data/todos.json`)
- Auth: Optional bearer token middleware
- TLS: Optional nginx reverse proxy (`--profile https`)
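A JSON-file store like the one described is usually written atomically so a crash mid-write cannot corrupt `todos.json`. A sketch of the write-then-rename pattern under that assumption (the file layout is inferred, not taken from this repo's code):

```python
import json
import os
import tempfile

def save_todos(path: str, todos: list[dict]) -> None:
    """Write JSON to a temp file, then atomically rename over the real file."""
    directory = os.path.dirname(path) or "."
    fd, tmp_path = tempfile.mkstemp(dir=directory, suffix=".tmp")
    try:
        with os.fdopen(fd, "w") as f:
            json.dump(todos, f, indent=2)
        os.replace(tmp_path, path)  # atomic on POSIX
    except BaseException:
        os.unlink(tmp_path)
        raise

def load_todos(path: str) -> list[dict]:
    """Read the store; a missing file just means no todos yet."""
    if not os.path.exists(path):
        return []
    with open(path) as f:
        return json.load(f)
```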
## Notes for OpenAI clients

I verified the current OpenAI docs before writing this section:

- Codex supports streamable HTTP MCP servers, bearer tokens, `http_headers`, and `env_http_headers`
- ChatGPT developer mode supports streaming HTTP MCP servers, but its docs currently describe OAuth / no auth / mixed auth rather than arbitrary bearer header entry in the UI
## Backup & restore

```sh
# Backup
docker cp $(docker compose ps -q todo-mcp):/data/todos.json ./backup.json

# Restore
docker cp ./backup.json $(docker compose ps -q todo-mcp):/data/todos.json
```
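Before relying on a backup, it is worth checking that it parses. A small sketch, assuming the store is a JSON list of todo objects (inferred, not confirmed from this repo):

```python
import json

def validate_backup(path: str) -> int:
    """Parse the backup and return how many todos it holds; raises if corrupt."""
    with open(path) as f:
        todos = json.load(f)
    if not isinstance(todos, list):
        raise ValueError("expected a JSON list of todos")
    return len(todos)
```

Usage: `print(validate_backup("backup.json"))` right after the `docker cp` backup step.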
## License

MIT