MCP Todo Server

A distributed task management server built on the Model Context Protocol (MCP). Uses TypeScript, Redis for shared state, and OpenAI for smart task analysis.

Features

  • MCP protocol implementation with 6 custom tools
  • Multi-node setup with load balancing via Caddy
  • Redis for distributed state (works across nodes)
  • AI task prioritization using OpenAI
  • Docker Compose for easy deployment
  • Health monitoring and graceful shutdown

Available Tools

  • todo_add - Add new tasks with priority levels
  • todo_list - List todos with status filtering
  • todo_remove - Remove specific tasks
  • todo_clear - Clear all tasks
  • todo_mark_done - Mark tasks as completed
  • todo_analyze - Get AI-powered task prioritization
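
Clients invoke these through MCP's standard `tools/call` method. For example, a `todo_add` call might look like the following; the argument names (`title`, `priority`) are illustrative and should be checked against the schemas the server advertises via `tools/list`:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "todo_add",
    "arguments": { "title": "Write release notes", "priority": "high" }
  }
}
```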

Architecture

┌─────────────┐
│   Caddy     │  Load Balancer
│  Port 3000  │
└──────┬──────┘
       │
       ├────────────┬────────────┐
       ▼            ▼            ▼
┌──────────┐ ┌──────────┐ ┌──────────┐
│  Node 1  │ │  Node 2  │ │  Node N  │
│ Port 3001│ │ Port 3002│ │ Port 300N│
└────┬─────┘ └────┬─────┘ └────┬─────┘
     │            │            │
     └────────────┴────────────┘
                  │
            ┌─────▼──────┐
            │   Redis    │  Shared State
            │  Port 6379 │
            └────────────┘
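
The load-balancing tier above can be expressed in a few lines of Caddyfile. This is a sketch, assuming Compose service names `node1`/`node2`, not the repo's actual config:

```
:3000 {
    reverse_proxy node1:3001 node2:3002 {
        lb_policy round_robin
        health_uri /health
    }
}
```

Using `health_uri` lets Caddy take a node out of rotation when its `/health` endpoint stops responding.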

Getting Started

Prerequisites

  • Docker and Docker Compose
  • Node.js 18+ (for local dev)
  • OpenAI API key

Installation

  1. Clone the repository

    git clone https://github.com/yourusername/mcp-todo-server.git
    cd mcp-todo-server
    
  2. Set up environment variables

    cp .env.example .env
    # Edit .env and add your OPENAI_API_KEY
    
  3. Start with Docker Compose

    docker compose up --build
    

    This starts:

    • Redis on port 6379
    • MCP Server Node 1 on port 3001
    • MCP Server Node 2 on port 3002
    • Caddy load balancer on port 3000
  4. Verify the deployment

    curl http://localhost:3000/health
    

API Endpoints

Health Check

GET http://localhost:3000/health

MCP Endpoint

GET/POST http://localhost:3000/mcp

Uses Server-Sent Events (SSE) for communication.
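
Messages exchanged over this endpoint are JSON-RPC 2.0. For instance, listing the server's tools uses the standard MCP `tools/list` method, with the result streamed back over the SSE connection:

```json
{ "jsonrpc": "2.0", "id": 1, "method": "tools/list" }
```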

Development

Running Locally (without Docker)

# Install dependencies
npm install

# Start Redis
docker run -d -p 6379:6379 redis:7-alpine

# Set environment variables
export REDIS_URL=redis://localhost:6379
export OPENAI_API_KEY=your_key_here
export NODE_ID=dev-node

# Run development server
npm run dev

Build for Production

npm run build
npm start

Testing

Run the test suite:

./test.sh

Or try the example client:

npm run test:client

Using with MCP Clients

VS Code / Cursor

Add to your MCP config:

{
  "mcpServers": {
    "todo": {
      "url": "http://localhost:3000/mcp",
      "transport": "sse"
    }
  }
}

Claude Desktop

Add to ~/Library/Application Support/Claude/claude_desktop_config.json (macOS path; the location differs on Windows):

{
  "mcpServers": {
    "todo": {
      "command": "node",
      "args": ["/path/to/mcp-todo-server/dist/main.js"],
      "env": {
        "REDIS_URL": "redis://localhost:6379",
        "OPENAI_API_KEY": "your-key-here"
      }
    }
  }
}

Project Structure

.
├── src/
│   ├── main.ts           # Express server & MCP transport
│   ├── mcp-tools.ts      # MCP tool implementations
│   ├── redis-client.ts   # State management
│   └── ai-service.ts     # OpenAI integration
├── docker-compose.yml    # Multi-node orchestration
├── Dockerfile            # Container definition
├── package.json          # Dependencies
├── tsconfig.json         # TypeScript config
├── test.sh               # Automated testing
└── example-client.js     # MCP client example
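
To illustrate the kind of state handling `redis-client.ts` is responsible for, here is a hedged sketch of flattening a todo into the string-only fields a Redis hash requires (the `Todo` shape and field names are assumptions, not the repo's actual schema):

```typescript
// Illustrative Todo shape; the repo's actual model may differ.
interface Todo {
  id: string;
  title: string;
  priority: "low" | "medium" | "high";
  done: boolean;
  createdAt: number;
}

// Redis hash fields are strings, so booleans and numbers are stringified
// before a HSET and parsed back after a HGETALL.
function toHash(todo: Todo): Record<string, string> {
  return {
    id: todo.id,
    title: todo.title,
    priority: todo.priority,
    done: todo.done ? "1" : "0",
    createdAt: String(todo.createdAt),
  };
}

function fromHash(h: Record<string, string>): Todo {
  return {
    id: h.id,
    title: h.title,
    priority: h.priority as Todo["priority"],
    done: h.done === "1",
    createdAt: Number(h.createdAt),
  };
}
```

Because every node reads and writes the same keys, state survives load-balancer routing to any node.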

Security Notes

For production, add:

  • Authentication
  • HTTPS
  • Redis password
  • Input validation
  • Rate limiting
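
As a starting point for the authentication item, a minimal bearer-token check could guard the `/mcp` route before the MCP transport sees the request. This is an illustrative sketch, not code from this repo:

```typescript
// Illustrative bearer-token check; in an Express app this logic would run
// in middleware ahead of the /mcp handler.
function isAuthorized(authHeader: string | undefined, expectedToken: string): boolean {
  // Expect "Authorization: Bearer <token>"
  if (!authHeader || !authHeader.startsWith("Bearer ")) return false;
  const token = authHeader.slice("Bearer ".length);
  return token.length > 0 && token === expectedToken;
}
```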

License

MIT
