# MCPify
Point it at API docs, get an MCP server.
MCPify scrapes API documentation, uses Gemini to figure out the endpoints/auth/params, and spits out a config file. The runtime reads that config and runs an MCP server that Claude (or any MCP client) can use to call the API.
## Installation

```bash
git clone https://github.com/yourusername/mcpify.git
cd mcpify
pip install -e .
```

Needs Python 3.11+ and a Gemini API key.
## Usage

```bash
export GEMINI_API_KEY="your-api-key"

# Parse docs into a config file
mcpify parse https://api.example.com/docs -o my-api.json

# Run the server
mcpify serve my-api.json --auth "your-api-token"

# Or do both at once
mcpify quickstart https://api.example.com/docs --auth "token"
```
To use with Claude Desktop, add to `~/Library/Application Support/Claude/claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "my-api": {
      "command": "mcpify",
      "args": ["serve", "/path/to/my-api.json", "--auth", "your-token"]
    }
  }
}
```
## CLI

### `mcpify parse <url>` - Scrape docs and generate config

```bash
mcpify parse https://docs.github.com/en/rest -o github.json --max-pages 15
```

| Option | Description |
|---|---|
| `-o, --output` | Output file (default: `mcpify-config.json`) |
| `-m, --max-pages` | Max pages to scrape (default: 10) |
| `--no-follow` | Don't follow links |
| `-k, --api-key` | Gemini API key (or use env var) |
| `--model` | Gemini model (default: `gemini-2.0-flash`) |
### `mcpify serve <config>` - Run MCP server

```bash
mcpify serve my-api.json --auth "Bearer token" --transport stdio
```

| Option | Description |
|---|---|
| `-a, --auth` | Auth token for API calls |
| `-t, --transport` | `stdio`, `sse`, or `http` (default: `stdio`) |
### `mcpify show <config>` - Print config as table
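For example, to review what was extracted before serving the config generated above:

```bash
mcpify show my-api.json
```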
### `mcpify quickstart <url>` - Parse and serve in one shot
## Config Format

The generated JSON looks like this:

```json
{
  "name": "my-api",
  "description": "Description of the API",
  "base_url": "https://api.example.com/v1",
  "version": "1.0.0",
  "auth": {
    "type": "bearer",
    "header_name": "Authorization",
    "prefix": "Bearer "
  },
  "tools": [
    {
      "name": "get_users",
      "description": "Retrieve a list of users",
      "method": "GET",
      "path": "/users",
      "parameters": [
        {
          "name": "limit",
          "type": "integer",
          "description": "Max results to return",
          "required": false,
          "location": "query",
          "default": 10
        }
      ],
      "response": {
        "description": "Array of user objects"
      },
      "tags": ["users"]
    }
  ]
}
```

- Auth types: `none`, `api_key`, `bearer`, `oauth2`
- Parameter locations: `query`, `path`, `header`, `body`
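As an illustrative sketch (not output from a real parse), an API that expects a key in an `X-API-Key` header and takes an ID in the URL path might be described like this, assuming `api_key` auth reuses the same `header_name` field and path parameters are substituted into `path` by name:

```json
{
  "auth": {
    "type": "api_key",
    "header_name": "X-API-Key",
    "prefix": ""
  },
  "tools": [
    {
      "name": "get_user",
      "description": "Retrieve a single user by ID",
      "method": "GET",
      "path": "/users/{user_id}",
      "parameters": [
        {
          "name": "user_id",
          "type": "string",
          "description": "ID of the user to fetch",
          "required": true,
          "location": "path"
        }
      ]
    }
  ]
}
```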
## Python API

The scraper and parser are async, so wrap them in an event loop (for example with `asyncio.run`):

```python
import asyncio

from mcpify.scraper import scrape_documentation
from mcpify.parser import parse_documentation
from mcpify.runtime import create_mcp_server


async def build_config():
    # Scrape docs
    docs = await scrape_documentation("https://api.example.com/docs")

    # Parse with Gemini
    config = await parse_documentation(docs)

    # Save config
    with open("my-api.json", "w") as f:
        f.write(config.to_json())

    return config


config = asyncio.run(build_config())

# Create and run server
server = create_mcp_server(config, auth_token="your-token")
server.run()
```
## Examples

```bash
# GitHub
mcpify parse https://docs.github.com/en/rest/users -o github.json
mcpify serve github.json --auth "ghp_your_token"

# Stripe
mcpify parse https://stripe.com/docs/api -o stripe.json
mcpify serve stripe.json --auth "sk_test_your_key"
```
## License

MIT