
# Task Manager MCP Server

A template implementation of a Model Context Protocol (MCP) server for managing tasks and projects. It provides a comprehensive task management system with support for project organization, task tracking, and automatic parsing of PRDs into actionable items.
## Overview

This project demonstrates how to build an MCP server that enables AI agents to manage tasks, track project progress, and break down Product Requirements Documents (PRDs) into actionable tasks. It serves as a practical template for creating your own MCP servers with task management capabilities.

The implementation follows Anthropic's best practices for building MCP servers, allowing seamless integration with any MCP-compatible client.
## Features

The server provides several essential task management tools:

### Task Management

- `create_task_file`: Create new project task files
- `add_task`: Add tasks to projects with descriptions and subtasks
- `update_task_status`: Update the status of tasks and subtasks
- `get_next_task`: Get the next uncompleted task from a project

### Project Planning

- `parse_prd`: Convert PRDs into structured tasks automatically
- `expand_task`: Break down tasks into smaller, manageable subtasks
- `estimate_task_complexity`: Estimate task complexity and time requirements
- `get_task_dependencies`: Track task dependencies

### Development Support

- `generate_task_file`: Generate file templates based on task descriptions
- `suggest_next_actions`: Get AI-powered suggestions for next steps
## Prerequisites

- Python 3.12+
- API keys for your chosen LLM provider (OpenAI, OpenRouter, or Ollama)
- Docker, if running the MCP server as a container (recommended)
## Installation

### Using uv

1. Install uv if you don't have it:

   ```bash
   pip install uv
   ```

2. Clone this repository:

   ```bash
   git clone https://github.com/coleam00/mcp-mem0.git
   cd mcp-mem0
   ```

3. Install dependencies:

   ```bash
   uv pip install -e .
   ```

4. Create a `.env` file based on `.env.example`:

   ```bash
   cp .env.example .env
   ```

5. Configure your environment variables in the `.env` file (see the Configuration section).
### Using Docker (Recommended)

1. Build the Docker image:

   ```bash
   docker build -t task-manager-mcp --build-arg PORT=8050 .
   ```

2. Create a `.env` file based on `.env.example` and configure your environment variables.
## Configuration

The following environment variables can be configured in your `.env` file:

| Variable | Description | Example |
|---|---|---|
| `TRANSPORT` | Transport protocol (`sse` or `stdio`) | `sse` |
| `HOST` | Host to bind to when using SSE transport | `0.0.0.0` |
| `PORT` | Port to listen on when using SSE transport | `8050` |
| `LLM_PROVIDER` | LLM provider (`openai`, `openrouter`, or `ollama`) | `openai` |
| `LLM_BASE_URL` | Base URL for the LLM API | `https://api.openai.com/v1` |
| `LLM_API_KEY` | API key for the LLM provider | `sk-...` |
| `LLM_CHOICE` | LLM model to use for task analysis | `gpt-4` |
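Putting these together, a minimal `.env` for the SSE + OpenAI combination might look like the following (all values are placeholders; substitute your own key and preferred model):

```env
TRANSPORT=sse
HOST=0.0.0.0
PORT=8050
LLM_PROVIDER=openai
LLM_BASE_URL=https://api.openai.com/v1
LLM_API_KEY=sk-your-key-here
LLM_CHOICE=gpt-4
```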
## Running the Server

### Using Python 3

```bash
# Set TRANSPORT=sse in .env, then:
python3 src/main.py
```

The server will start on the configured host and port (default: `http://0.0.0.0:8050`).

### Using Docker

```bash
docker build -t task-manager-mcp .
docker run --env-file .env -p 8050:8050 task-manager-mcp
```
## Using the Task Manager

### Creating a New Project

1. Create a task file for your project:

   ```python
   await mcp.create_task_file(project_name="my-project")
   ```

2. Add tasks to your project:

   ```python
   await mcp.add_task(
       project_name="my-project",
       title="Setup Development Environment",
       description="Configure the development environment with required tools",
       subtasks=[
           "Install dependencies",
           "Configure linters",
           "Set up testing framework"
       ]
   )
   ```

3. Parse a PRD to create tasks automatically:

   ```python
   await mcp.parse_prd(
       project_name="my-project",
       prd_content="# Your PRD content..."
   )
   ```
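These tools operate on a simple project → task → subtask hierarchy. As a rough sketch of the state they manipulate (the server's actual storage format is not shown here; the class and field names below are illustrative assumptions), it can be modeled like this:

```python
from dataclasses import dataclass, field

@dataclass
class Subtask:
    title: str
    status: str = "pending"  # "pending" | "done"

@dataclass
class Task:
    title: str
    description: str = ""
    subtasks: list = field(default_factory=list)

    def is_complete(self) -> bool:
        # For simplicity, only subtask completion is tracked here;
        # a task with no subtasks counts as complete.
        return all(s.status == "done" for s in self.subtasks)

@dataclass
class Project:
    name: str
    tasks: list = field(default_factory=list)

    def next_task(self):
        # Mirrors get_next_task: first task with work remaining.
        return next((t for t in self.tasks if not t.is_complete()), None)

project = Project(name="my-project")
project.tasks.append(Task(
    title="Setup Development Environment",
    description="Configure the development environment with required tools",
    subtasks=[Subtask("Install dependencies"), Subtask("Configure linters")],
))
project.tasks[0].subtasks[0].status = "done"
print(project.next_task().title)  # Setup Development Environment (one subtask pending)
```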
### Managing Tasks

1. Update a task's status:

   ```python
   await mcp.update_task_status(
       project_name="my-project",
       task_title="Setup Development Environment",
       subtask_title="Install dependencies",
       status="done"
   )
   ```

2. Get the next task to work on:

   ```python
   next_task = await mcp.get_next_task(project_name="my-project")
   ```

3. Expand a task into subtasks:

   ```python
   await mcp.expand_task(
       project_name="my-project",
       task_title="Implement Authentication"
   )
   ```
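These calls compose naturally into an agent loop: fetch the next task, do the work, mark it done. A minimal sketch, using a stand-in client with the same method names (the dict-shaped return values are an assumption, not the server's actual schema):

```python
import asyncio

class FakeTaskClient:
    """Stand-in for the MCP client; returns dict-shaped tasks (an assumption)."""
    def __init__(self, tasks):
        self._tasks = tasks  # list of {"title": ..., "status": ...}

    async def get_next_task(self, project_name):
        return next((t for t in self._tasks if t["status"] != "done"), None)

    async def update_task_status(self, project_name, task_title, status):
        for t in self._tasks:
            if t["title"] == task_title:
                t["status"] = status

async def work_through(mcp, project_name):
    completed = []
    while (task := await mcp.get_next_task(project_name=project_name)) is not None:
        # ... do the actual work for `task` here ...
        await mcp.update_task_status(project_name=project_name,
                                     task_title=task["title"], status="done")
        completed.append(task["title"])
    return completed

client = FakeTaskClient([{"title": "Setup", "status": "pending"},
                         {"title": "Auth", "status": "pending"}])
done = asyncio.run(work_through(client, "my-project"))
print(done)  # ['Setup', 'Auth']
```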
### Development Workflow

1. Generate a file template for a task:

   ```python
   await mcp.generate_task_file(
       project_name="my-project",
       task_title="User Authentication"
   )
   ```

2. Get a task complexity estimate:

   ```python
   complexity = await mcp.estimate_task_complexity(
       project_name="my-project",
       task_title="User Authentication"
   )
   ```

3. Get suggestions for next actions:

   ```python
   suggestions = await mcp.suggest_next_actions(
       project_name="my-project",
       task_title="User Authentication"
   )
   ```
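How `estimate_task_complexity` scores a task is internal to the server (it uses the configured LLM), but a toy heuristic based on subtask count and description length conveys the general idea. This is purely illustrative, not the server's actual algorithm:

```python
def estimate_complexity(description: str, subtasks: list) -> str:
    """Toy heuristic: more subtasks and longer descriptions -> higher complexity."""
    score = len(subtasks) * 2 + len(description.split()) // 10
    if score < 3:
        return "low"
    if score < 8:
        return "medium"
    return "high"

print(estimate_complexity("Add a logout button", []))  # low
print(estimate_complexity("Implement OAuth2 login with refresh tokens, "
                          "session storage, and MFA",
                          ["Token flow", "Session store", "MFA"]))  # medium
```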
## Integration with MCP Clients

### SSE Configuration

To connect to the server using SSE transport, use this configuration:

```json
{
  "mcpServers": {
    "task-manager": {
      "transport": "sse",
      "url": "http://localhost:8050/sse"
    }
  }
}
```
### Stdio Configuration

For stdio transport, use this configuration:

```json
{
  "mcpServers": {
    "task-manager": {
      "command": "python3",
      "args": ["src/main.py"],
      "env": {
        "TRANSPORT": "stdio",
        "LLM_PROVIDER": "openai",
        "LLM_API_KEY": "YOUR-API-KEY",
        "LLM_CHOICE": "gpt-4"
      }
    }
  }
}
```
## Building Your Own Server

This template provides a foundation for building more complex task management MCP servers. To extend it:

- Add new task management tools using the `@mcp.tool()` decorator
- Implement custom task analysis and automation features
- Add project-specific task templates and workflows
- Integrate with your existing development tools and processes
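The `@mcp.tool()` pattern is a decorator that registers a function in the server's tool table under its function name. A self-contained sketch of that registration mechanism (using a stand-in `ToolRegistry`, not the actual MCP SDK API) shows the shape a new tool takes:

```python
class ToolRegistry:
    """Minimal stand-in for the MCP server's tool decorator."""
    def __init__(self):
        self.tools = {}

    def tool(self):
        def register(fn):
            self.tools[fn.__name__] = fn  # exposed under the function's name
            return fn
        return register

mcp = ToolRegistry()

@mcp.tool()
def archive_completed_tasks(project_name: str) -> str:
    """Example custom tool: move finished tasks out of the active list."""
    return f"Archived completed tasks in {project_name}"

print(sorted(mcp.tools))  # ['archive_completed_tasks']
print(mcp.tools["archive_completed_tasks"]("my-project"))
```

In the real server, the SDK's decorator additionally derives the tool's schema from the function signature and docstring, which is why type hints and docstrings matter when adding tools.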