# MCP Todo Server

A distributed task management server built on the Model Context Protocol (MCP). Written in TypeScript, it uses Redis for shared state across multiple nodes and OpenAI for AI-powered task prioritization.
## Features
- MCP protocol implementation with 6 custom tools
- Multi-node setup with load balancing via Caddy
- Redis for distributed state (works across nodes)
- AI task prioritization using OpenAI
- Docker Compose for easy deployment
- Health monitoring and graceful shutdown
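The graceful-shutdown behavior listed above typically means draining in-flight work before exiting: stop accepting HTTP connections first, then close the Redis connection. A sketch of that flow (the closer names are hypothetical, not this repo's actual API):

```typescript
// Run each cleanup step in order and report what was closed.
// Illustration only; main.ts in this repo may structure this differently.
type Closer = { name: string; close: () => Promise<void> };

async function shutdown(closers: Closer[]): Promise<string[]> {
  const closed: string[] = [];
  for (const c of closers) {
    await c.close(); // wait for this resource to finish draining
    closed.push(c.name);
  }
  return closed;
}

// Wiring it up would look roughly like:
// process.on("SIGTERM", async () => {
//   await shutdown([httpCloser, redisCloser]);
//   process.exit(0);
// });
```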
## Available Tools

- `todo_add` - Add new tasks with priority levels
- `todo_list` - List todos with status filtering
- `todo_remove` - Remove specific tasks
- `todo_clear` - Clear all tasks
- `todo_mark_done` - Mark tasks as completed
- `todo_analyze` - Get AI-powered task prioritization
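MCP tools are invoked over JSON-RPC 2.0 using the `tools/call` method. As a sketch, this is roughly the request an MCP client sends to invoke `todo_add` (the method and envelope follow the MCP spec; the argument fields `text` and `priority` are assumptions about this server's schema):

```typescript
// Build a JSON-RPC 2.0 `tools/call` request as used by MCP clients.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

function buildToolCall(
  id: number,
  name: string,
  args: Record<string, unknown>
): ToolCallRequest {
  return { jsonrpc: "2.0", id, method: "tools/call", params: { name, arguments: args } };
}

// Example: add a high-priority task (argument names are assumed).
const req = buildToolCall(1, "todo_add", { text: "Write the README", priority: "high" });
console.log(JSON.stringify(req));
```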
## Architecture

```
                ┌─────────────┐
                │    Caddy    │  Load Balancer
                │  Port 3000  │
                └──────┬──────┘
                       │
         ┌─────────────┼─────────────┐
         ▼             ▼             ▼
   ┌──────────┐  ┌──────────┐  ┌──────────┐
   │  Node 1  │  │  Node 2  │  │  Node N  │
   │ Port 3001│  │ Port 3002│  │ Port 300N│
   └────┬─────┘  └────┬─────┘  └────┬─────┘
        │             │             │
        └─────────────┼─────────────┘
                      ▼
               ┌────────────┐
               │   Redis    │  Shared State
               │ Port 6379  │
               └────────────┘
```
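The load balancer above could be configured with a Caddyfile along these lines. This is a minimal sketch, not the repo's actual config; the upstream names `node1`/`node2` assume a typical Docker Compose service naming:

```
:3000 {
    reverse_proxy node1:3001 node2:3002 {
        lb_policy round_robin
        health_uri /health
        health_interval 10s
    }
}
```

`health_uri` makes Caddy probe each node's `/health` endpoint so traffic is only routed to healthy instances.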
## Getting Started

### Prerequisites

- Docker and Docker Compose
- Node.js 18+ (for local development)
- OpenAI API key

### Installation

1. Clone the repository

   ```bash
   git clone https://github.com/yourusername/mcp-todo-server.git
   cd mcp-todo-server
   ```

2. Set up environment variables

   ```bash
   cp .env.example .env
   # Edit .env and add your OPENAI_API_KEY
   ```

3. Start with Docker Compose

   ```bash
   docker compose up --build
   ```

   This starts:
   - Redis on port 6379
   - MCP Server Node 1 on port 3001
   - MCP Server Node 2 on port 3002
   - Caddy load balancer on port 3000

4. Verify the deployment

   ```bash
   curl http://localhost:3000/health
   ```
## API Endpoints

### Health Check

```
GET http://localhost:3000/health
```

### MCP Endpoint

```
GET/POST http://localhost:3000/mcp
```

Uses Server-Sent Events (SSE) for communication.
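Over SSE, each message from the server arrives as a frame whose `data:` lines carry a JSON-RPC payload. A minimal sketch of decoding one frame (the `data:` field handling follows the SSE spec; the payload shape in the example is an assumption about this server's messages):

```typescript
// Extract and parse the JSON-RPC payload from one SSE frame.
function parseSseData(frame: string): unknown {
  const data = frame
    .split("\n")
    .filter((line) => line.startsWith("data:"))
    .map((line) => line.slice("data:".length).trim())
    .join("\n");
  return JSON.parse(data);
}

// Example frame as the /mcp endpoint might emit it (payload shape assumed).
const msg = parseSseData('event: message\ndata: {"jsonrpc":"2.0","id":1,"result":{}}');
console.log(msg);
```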
## Development

### Running Locally (without Docker)

```bash
# Install dependencies
npm install

# Start Redis
docker run -d -p 6379:6379 redis:7-alpine

# Set environment variables
export REDIS_URL=redis://localhost:6379
export OPENAI_API_KEY=your_key_here
export NODE_ID=dev-node

# Run the development server
npm run dev
```

### Build for Production

```bash
npm run build
npm start
```
## Testing

Run the test suite:

```bash
./test.sh
```

Or try the example client:

```bash
npm run test:client
```
## Using with MCP Clients

### VS Code / Cursor

Add to your MCP config:

```json
{
  "mcpServers": {
    "todo": {
      "url": "http://localhost:3000/mcp",
      "transport": "sse"
    }
  }
}
```

### Claude Desktop

Add to `~/Library/Application Support/Claude/claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "todo": {
      "command": "node",
      "args": ["/path/to/mcp-todo-server/dist/main.js"],
      "env": {
        "REDIS_URL": "redis://localhost:6379",
        "OPENAI_API_KEY": "your-key-here"
      }
    }
  }
}
```
## Project Structure

```
.
├── src/
│   ├── main.ts           # Express server & MCP transport
│   ├── mcp-tools.ts      # MCP tool implementations
│   ├── redis-client.ts   # State management
│   └── ai-service.ts     # OpenAI integration
├── docker-compose.yml    # Multi-node orchestration
├── Dockerfile            # Container definition
├── package.json          # Dependencies
├── tsconfig.json         # TypeScript config
├── test.sh               # Automated testing
└── example-client.js     # MCP client example
```
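One plausible shape for the state layer in `redis-client.ts` is a single Redis hash whose fields are todo IDs and whose values are JSON blobs, so every node reads and writes the same data. The `Todo` fields and the `todos` key below are assumptions, not necessarily this repo's actual schema:

```typescript
// Hypothetical todo shape stored in Redis.
interface Todo {
  id: string;
  text: string;
  priority: "low" | "medium" | "high";
  done: boolean;
}

// Serialize for HSET ("todos" hash, field = id) and back for HGETALL values.
function toHashField(todo: Todo): [field: string, value: string] {
  return [todo.id, JSON.stringify(todo)];
}

function fromHashValue(value: string): Todo {
  return JSON.parse(value) as Todo;
}

// Usage with node-redis would be roughly:
//   await client.hSet("todos", ...toHashField(todo));
//   const all = Object.values(await client.hGetAll("todos")).map(fromHashValue);
```

A hash keeps per-task reads and deletes O(1) (`HGET`/`HDEL`) while still allowing a full listing with one `HGETALL`.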
## Security Notes

This server ships without access controls. For production use, add:

- Authentication
- HTTPS
- A Redis password
- Input validation
- Rate limiting
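As a sketch of the first item, a minimal bearer-token gate that could sit in front of the `/mcp` route. The `Bearer` header scheme is standard; nothing like this exists in the repo yet, and the env-var name in the comment is hypothetical:

```typescript
// Accept the request only if the Authorization header carries the expected token.
function isAuthorized(header: string | undefined, expectedToken: string): boolean {
  if (!header || !header.startsWith("Bearer ")) return false;
  return header.slice("Bearer ".length) === expectedToken;
}

// As Express middleware it would be wired roughly like:
// app.use("/mcp", (req, res, next) =>
//   isAuthorized(req.headers.authorization, process.env.MCP_TOKEN ?? "")
//     ? next()
//     : res.status(401).end()
// );
```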
## License

MIT