<div align="center"> <img src="https://github.com/crewAIInc/crewAI/blob/main/docs/crewai_logo.png" alt="CrewAI Logo" /> </div>
# MCP Crew AI Server
MCP Crew AI Server is a lightweight Python-based server designed to run, manage and create CrewAI workflows. This project leverages the Model Context Protocol (MCP) to communicate with Large Language Models (LLMs) and tools such as Claude Desktop or Cursor IDE, allowing you to orchestrate multi-agent workflows with ease.
## Features

- Automatic Configuration: Automatically loads agent and task configurations from two YAML files (`agents.yml` and `tasks.yml`), so you don't need to write custom code for basic setups.
- Command Line Flexibility: Pass custom paths to your configuration files via command line arguments (`--agents` and `--tasks`).
- Seamless Workflow Execution: Easily run pre-configured workflows through the MCP `run_workflow` tool (see the client sketch after this list).
- Local Development: Run the server locally in STDIO mode, making it ideal for development and testing.
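Because `run_workflow` is exposed as a standard MCP tool, any MCP-capable client can invoke it. The snippet below is a minimal sketch using the MCP Python SDK, assuming the server is launched over STDIO with the example configuration files; the argument payload for `run_workflow` is only a placeholder, since the tool's exact schema is not documented here.

```python
# Minimal sketch: start the server over STDIO and call its run_workflow tool.
# Assumes the `mcp` Python SDK is installed and the example YAML files exist.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    server = StdioServerParameters(
        command="mcp-crew-ai",
        args=["--agents", "examples/agents.yml", "--tasks", "examples/tasks.yml"],
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools the server exposes (run_workflow should be listed).
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Trigger the pre-configured workflow; adjust the arguments to the
            # server's actual schema (an empty payload is shown as a placeholder).
            result = await session.call_tool("run_workflow", arguments={})
            print(result)


asyncio.run(main())
```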
## Installation

There are several ways to install the MCP Crew AI server:

### Option 1: Install from PyPI (Recommended)

```bash
pip install mcp-crew-ai
```

### Option 2: Install from GitHub

```bash
pip install git+https://github.com/adam-paterson/mcp-crew-ai.git
```

### Option 3: Clone and Install

```bash
git clone https://github.com/adam-paterson/mcp-crew-ai.git
cd mcp-crew-ai
pip install -e .
```
## Requirements
- Python 3.11+
- MCP SDK
- CrewAI
- PyYAML
## Configuration

- `agents.yml`: Define your agents with roles, goals, and backstories.
- `tasks.yml`: Define tasks with descriptions, expected outputs, and assign them to agents.
Example `agents.yml`:

```yaml
zookeeper:
  role: Zookeeper
  goal: Manage zoo operations
  backstory: >
    You are a seasoned zookeeper with a passion for wildlife conservation...
```
Example `tasks.yml`:

```yaml
write_stories:
  description: >
    Write an engaging zoo update capturing the day's highlights.
  expected_output: 5 engaging stories
  agent: zookeeper
  output_file: zoo_report.md
```
## Usage
Once installed, you can run the MCP CrewAI server using either of these methods:
### Standard Python Command

```bash
mcp-crew-ai --agents path/to/agents.yml --tasks path/to/tasks.yml
```

### Using UV Execution (uvx)

For a more streamlined experience, you can use the UV execution command:

```bash
uvx mcp-crew-ai --agents path/to/agents.yml --tasks path/to/tasks.yml
```

Or run just the server directly:

```bash
uvx mcp-crew-ai-server
```
This will start the server using default configuration from environment variables.
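To have an MCP client such as Claude Desktop launch the server for you, you can register it in the client's MCP configuration. The snippet below is a sketch for Claude Desktop's `claude_desktop_config.json`; the entry name and file paths are illustrative, so adjust them to your setup.

```json
{
  "mcpServers": {
    "mcp-crew-ai": {
      "command": "uvx",
      "args": [
        "mcp-crew-ai",
        "--agents", "/absolute/path/to/agents.yml",
        "--tasks", "/absolute/path/to/tasks.yml"
      ]
    }
  }
}
```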
## Command Line Options

- `--agents`: Path to the agents YAML file (required)
- `--tasks`: Path to the tasks YAML file (required)
- `--topic`: The main topic for the crew to work on (default: "Artificial Intelligence")
- `--process`: Process type to use (choices: "sequential" or "hierarchical", default: "sequential")
- `--verbose`: Enable verbose output
- `--variables`: JSON string or path to a JSON file with additional variables to replace in the YAML files
- `--version`: Show version information and exit
## Advanced Usage
You can also provide additional variables to be used in your YAML templates:
```bash
mcp-crew-ai --agents examples/agents.yml --tasks examples/tasks.yml --topic "Machine Learning" --variables '{"year": 2025, "focus": "deep learning"}'
```
These variables will replace placeholders in your YAML files. For example, `{topic}` will be replaced with "Machine Learning" and `{year}` with "2025".
## Contributing
Contributions are welcome! Please open issues or submit pull requests with improvements, bug fixes, or new features.
## Licence
This project is licensed under the MIT Licence. See the LICENSE file for details.
Happy workflow orchestration!