auto-mcp
Automatically convert functions, tools and agents to MCP servers.
🧩 Installing auto-mcp
You can install the SDK from PyPI or from source.
Install within an existing project
If you want to install auto-mcp as part of an existing project (e.g. to automatically convert existing agents or tools to MCP servers), it is good practice to do so within a dedicated virtual environment.
1. (Optional) Create a new virtual environment
If you don't already have a virtual environment, create a new one using uv:
uv venv
source .venv/bin/activate
2. Add auto-mcp to your dependencies
Then add auto-mcp. With uv, this looks like:
uv add auto-mcp
If you are not using uv or poetry, you can also use pip:
pip install auto-mcp
3. Initialize automcp
Use the CLI to run the following command:
automcp init
This will create an automcp.py file at the root of your project.
4. Modify the automcp.py file
Modify the automcp.py file to:
- Import the agents or tools you would like to convert from your existing project
- Import the adapters from automcp that correspond to the agent framework you are using
- Define an input schema for the agent or tool
A simple example automcp.py for a CrewAI agent might look like:
from pydantic import BaseModel
from mcp.server.fastmcp import FastMCP  # FastMCP from the MCP Python SDK

from marketing_posts.crew import MarketingPostsCrew
from automcp import crewai_adapter, serve_stdio  # import location of serve_stdio assumed; check your automcp version

class InputSchema(BaseModel):
    customer_domain: str
    project_description: str

mcp = FastMCP("my MCP Server")
name = "Marketing Crew"
description = "A crew that creates marketing posts"
input_schema = InputSchema
tool = crewai_adapter(
    crewai_class=MarketingPostsCrew,
    name=name,
    description=description,
    input_schema=input_schema,
)

mcp.add_tool(
    tool,
    name=name,
    description=description,
)

if __name__ == "__main__":
    serve_stdio(mcp)  # Launch the MCP server
5. Configure your .env file
Add any required environment variables:
OPENAI_API_KEY=<your_openai_api_key>
SERPER_API_KEY=<your_serper_api_key>
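If your agent framework does not read the .env file automatically, you can load it yourself at the top of automcp.py. A minimal sketch, assuming python-dotenv is installed (automcp or your framework may already do this for you):

import os

from dotenv import load_dotenv  # assumes python-dotenv is installed

# Populate os.environ from the .env file before the crew/agent is constructed.
load_dotenv()

assert os.getenv("OPENAI_API_KEY"), "OPENAI_API_KEY is missing from the environment"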
6. Start the Server(s)
Using STDIO:
uv run serve_stdio
Using SSE:
uv run serve_sse
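If you want an SSE entry point in automcp.py analogous to the STDIO one above, a hedged sketch follows; it assumes automcp exposes serve_sse alongside serve_stdio, and the exact signature (e.g. host/port arguments) may differ between versions:

# Hypothetical SSE launcher -- mirrors the serve_stdio example above.
# The import location and signature of serve_sse are assumptions; check your automcp version.
from automcp import serve_sse

if __name__ == "__main__":
    serve_sse(mcp)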
7. Testing and Integration
Cursor
Here is an example configuration of mcp.json that runs the MCP server using STDIO:
{
  "mcpServers": {
    "Marketing Crew": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/K-Mistele/example-mcp",
        "serve_stdio"
      ],
      "env": {
        "OPENAI_API_KEY": "...",
        "SERPER_API_KEY": "..."
      }
    }
  }
}
With SSE:
{
  "mcpServers": {
    "Marketing Crew": {
      "url": "http://localhost:8000/sse"
    }
  }
}
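You can also smoke-test the server outside of an IDE with the MCP Python SDK client. A minimal sketch, assuming the mcp package is installed and the serve_stdio script from step 6 is available; the tool name and argument keys match the InputSchema defined above:

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the server over STDIO using the project script from step 6.
params = StdioServerParameters(command="uv", args=["run", "serve_stdio"])

async def main():
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([t.name for t in tools.tools])
            # Call the converted crew; argument names come from InputSchema above.
            result = await session.call_tool(
                "Marketing Crew",
                {"customer_domain": "example.com", "project_description": "Launch post"},
            )
            print(result)

if __name__ == "__main__":
    asyncio.run(main())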
8. Publish
Coming soon!
Install from source
If you are a developer contributing to AutoMCP, you will want to install from source using:
git clone https://github.com/NapthaAI/auto-mcp.git
cd auto-mcp
uv venv
source .venv/bin/activate
uv pip install .