FastMCP OpenAPI
A FastMCP wrapper that dynamically generates MCP (Model Context Protocol) tools from OpenAPI specifications.
Quick Start
Prerequisites
- Python 3.8+ with pip
- Node.js 16+ (for MCP Inspector)
- OpenAI API key (for LangChain demos)
Installation
pip install fastmcp-openapi
Basic Usage
# Generate MCP tools from any OpenAPI spec
fastmcp-openapi --spec https://petstore.swagger.io/v2/swagger.json
# With authentication
fastmcp-openapi --spec https://api.example.com/openapi.json --auth-header "Bearer your-token"
# Multiple APIs
fastmcp-openapi --spec api1.json --spec api2.json --spec api3.json
Test with MCP Inspector
# Install MCP Inspector
npm install -g @modelcontextprotocol/inspector
# Test your OpenAPI tools
npx @modelcontextprotocol/inspector fastmcp-openapi --spec examples/simple_api.json
# Test Petstore API with Inspector
npx @modelcontextprotocol/inspector fastmcp-openapi --spec https://petstore.swagger.io/v2/swagger.json --base-url https://petstore.swagger.io/v2
Claude Desktop Integration
Add to your Claude Desktop config:
{
  "mcpServers": {
    "openapi-server": {
      "command": "fastmcp-openapi",
      "args": ["--spec", "https://api.example.com/openapi.json", "--auth-header", "Bearer your-token"]
    }
  }
}
Features
- ✅ Dynamic Tool Generation: Converts OpenAPI operations to MCP tools automatically
- ✅ Type Safety: Full parameter validation using OpenAPI schemas
- ✅ Authentication: Bearer tokens, API keys, Basic auth
- ✅ Multiple APIs: Load multiple OpenAPI specs in one server
- ✅ Real-time: Add/remove APIs without restart
Command Line Options
fastmcp-openapi --help
Options:
--spec TEXT OpenAPI specification URL or file path (can be used multiple times)
--name TEXT Server name (default: "OpenAPI Server")
--auth-header TEXT Authorization header (e.g., 'Bearer token123'). Must match order of --spec options.
--base-url TEXT Override base URL for API calls. Must match order of --spec options.
--config TEXT JSON config file with API specifications
--transport TEXT Transport: stdio, streamable-http, sse (default: stdio)
--port INTEGER Port for HTTP/SSE transport (default: 8000)
--debug Enable debug logging
Programmatic Usage
import asyncio
from fastmcp_openapi import OpenAPIServer

async def setup() -> OpenAPIServer:
    # Create server
    server = OpenAPIServer("My API Server")
    # Add OpenAPI specs (add_openapi_spec is async, so it runs inside a coroutine)
    await server.add_openapi_spec(
        name="petstore",
        spec_url="https://petstore.swagger.io/v2/swagger.json",
        auth_header="Bearer your-token"
    )
    return server

# Run server
server = asyncio.run(setup())
server.run()
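The same calls can register several specs in one server. The sketch below is illustrative only: it reuses the OpenAPIServer and add_openapi_spec API shown above, and the spec names and URLs are examples rather than required values.

import asyncio
from fastmcp_openapi import OpenAPIServer

# Example spec list; swap in your own names, URLs, and auth headers.
APIS = [
    {"name": "petstore", "spec_url": "https://petstore.swagger.io/v2/swagger.json"},
    {"name": "simple_api", "spec_url": "examples/simple_api.json"},
]

async def build_server() -> OpenAPIServer:
    server = OpenAPIServer("Multi API Server")
    for api in APIS:
        # Each spec is added before the server starts serving tools
        await server.add_openapi_spec(name=api["name"], spec_url=api["spec_url"])
    return server

if __name__ == "__main__":
    server = asyncio.run(build_server())
    server.run()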
Examples
Multiple APIs with Different Auth
# Multiple APIs with different base URLs and auth
fastmcp-openapi \
--spec https://petstore.swagger.io/v2/swagger.json \
--spec https://api.github.com/openapi.yaml \
--spec ./local-api.json \
--base-url https://petstore.swagger.io/v2 \
--base-url https://api.github.com \
--base-url http://localhost:3000 \
--auth-header "Bearer petstore-token" \
--auth-header "Bearer github-token" \
--auth-header "Basic local-auth"
# Each API gets its own tools with prefixes:
# - api_1_getPetById (Petstore)
# - api_2_getUser (GitHub)
# - api_3_createItem (Local API)
Mixed API Sources
# Combine remote and local APIs
fastmcp-openapi \
--spec https://petstore.swagger.io/v2/swagger.json \
--spec examples/simple_api.json \
--spec https://jsonplaceholder.typicode.com/openapi.json \
--base-url https://petstore.swagger.io/v2 \
--base-url http://localhost:8080 \
--base-url https://jsonplaceholder.typicode.com
# Creates unified MCP server with tools from all APIs
Authenticated API
fastmcp-openapi \
--spec https://api.example.com/openapi.json \
--auth-header "Bearer your-oauth-token" \
--base-url "https://api.example.com/v1"
Development Mode
# HTTP mode for web testing
fastmcp-openapi \
--spec examples/simple_api.json \
--transport streamable-http \
--port 8080 \
--debug
# SSE mode for MCP Inspector
fastmcp-openapi \
--spec https://petstore.swagger.io/v2/swagger.json \
--base-url https://petstore.swagger.io/v2 \
--transport sse \
--port 8081 \
--debug
LangChain Integration
# Install required dependencies
pip install langchain-openai langchain-mcp-adapters langgraph
# Set OpenAI API key
export OPENAI_API_KEY="your-openai-api-key"
# Start FastMCP server with HTTP transport
fastmcp-openapi \
--spec https://petstore.swagger.io/v2/swagger.json \
--base-url https://petstore.swagger.io/v2 \
--transport streamable-http \
--port 8081
# Run LangChain test (in another terminal)
python test_mcp_langchain.py
The LangChain integration allows AI agents to use the generated MCP tools for natural language interaction with APIs.
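For reference, a minimal LangChain/LangGraph client along these lines can exercise the generated tools. This is a sketch rather than the contents of test_mcp_langchain.py; it assumes the server started above is reachable at http://localhost:8081/mcp (the /mcp path is an assumption, adjust it to match your server) and that langchain-mcp-adapters 0.1+ is installed.

# Sketch of a LangChain/LangGraph client for the running FastMCP server.
# Requires langchain-openai, langchain-mcp-adapters (0.1+), langgraph, and OPENAI_API_KEY.
import asyncio

from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent

async def main():
    client = MultiServerMCPClient(
        {
            "petstore": {
                "url": "http://localhost:8081/mcp",  # assumed endpoint path
                "transport": "streamable_http",
            }
        }
    )
    tools = await client.get_tools()  # MCP tools exposed as LangChain tools
    agent = create_react_agent(ChatOpenAI(model="gpt-4o-mini"), tools)
    result = await agent.ainvoke(
        {"messages": [{"role": "user", "content": "Find the pet with ID 1."}]}
    )
    print(result["messages"][-1].content)

if __name__ == "__main__":
    asyncio.run(main())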
How It Works
- Load OpenAPI Spec: Fetches and parses OpenAPI/Swagger specifications
- Generate Tools: Creates MCP tools for each API operation with proper schemas (see the sketch after this list)
- Handle Requests: Validates parameters and makes authenticated HTTP requests
- Return Results: Formats API responses for AI consumption
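As a rough illustration of the "Generate Tools" and "Handle Requests" steps (not the package's actual internals, which may differ), a single OpenAPI operation can be reduced to a tool name, an input schema, and a callable that performs the authenticated request:

# Illustrative only: mapping one OpenAPI operation to a tool-like triple.
from typing import Optional

import httpx

def operation_to_tool(path: str, method: str, operation: dict,
                      base_url: str, auth_header: Optional[str] = None):
    """Return (tool_name, input_schema, callable) for a single operation."""
    name = operation.get("operationId") or f"{method}_{path.strip('/').replace('/', '_')}"

    # Build a JSON schema from the declared parameters for validation.
    schema = {"type": "object", "properties": {}, "required": []}
    for param in operation.get("parameters", []):
        schema["properties"][param["name"]] = param.get("schema", {"type": "string"})
        if param.get("required"):
            schema["required"].append(param["name"])

    def call(**kwargs):
        # Path parameters fill the {placeholders}; the rest go to the query string.
        url = base_url + path.format(**kwargs)
        query = {k: v for k, v in kwargs.items() if "{" + k + "}" not in path}
        headers = {"Authorization": auth_header} if auth_header else {}
        response = httpx.request(method.upper(), url, params=query, headers=headers)
        response.raise_for_status()
        return response.json()

    return name, schema, call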
Supported Features
- ✅ OpenAPI 3.0.x, 3.1.x, Swagger 2.0
- ✅ Path/query parameters, headers, request bodies
- ✅ Authentication (Bearer, API Key, Basic)
- ✅ Parameter validation and type checking
- ✅ Multiple APIs in one server
- ✅ Multiple transports: stdio, streamable-http, sse
- ✅ LangChain integration for AI agents
- ✅ MCP Inspector support for interactive testing
Testing & Examples
Quick Test with Petstore API
# 1. Start server with SSE transport
fastmcp-openapi --spec https://petstore.swagger.io/v2/swagger.json --base-url https://petstore.swagger.io/v2 --transport sse --port 8081
# 2. Test with MCP Inspector (in another terminal)
npx @modelcontextprotocol/inspector fastmcp-openapi --spec https://petstore.swagger.io/v2/swagger.json --base-url https://petstore.swagger.io/v2
# 3. Test with LangChain (requires OPENAI_API_KEY)
python test_mcp_langchain.py
Available Transport Modes
- stdio: Standard input/output (default, for Claude Desktop)
- streamable-http: HTTP-based transport (for LangChain integration)
- sse: Server-Sent Events transport (for MCP Inspector)
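To sanity-check the HTTP transport without LangChain, a bare client built on the official mcp Python SDK can list the generated tools. This is a sketch; it assumes the mcp package is installed and that the development-mode server shown earlier is serving streamable HTTP on port 8080 with its endpoint at /mcp (adjust the URL for your setup).

# Minimal MCP client sketch over streamable HTTP (requires the `mcp` package).
# The /mcp endpoint path and port are assumptions; match them to your server.
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def main():
    async with streamablehttp_client("http://localhost:8080/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

if __name__ == "__main__":
    asyncio.run(main())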
Multiple API Management
FastMCP OpenAPI supports combining multiple OpenAPI specifications into a single MCP server, each with its own base URL and authentication.
Configuration Methods
Method 1: JSON Configuration (Recommended)
Create a JSON config file to clearly define each API:
{
  "apis": [
    {
      "name": "petstore",
      "spec": "https://petstore.swagger.io/v2/swagger.json",
      "base_url": "https://petstore.swagger.io/v2",
      "auth": "Bearer petstore-api-key"
    },
    {
      "name": "simple_api",
      "spec": "examples/simple_api.json",
      "base_url": "http://localhost:8080"
    }
  ]
}
# Use the config file
fastmcp-openapi --config examples/multi_api_config.json --transport sse --port 8081
Method 2: Command Line Arguments (Positional Matching)
⚠️ Important: Arguments must be in the same order - each --base-url and --auth-header matches the corresponding --spec by position.
# Order matters: spec[0]→base_url[0]→auth[0], spec[1]→base_url[1]→auth[1], etc.
# Position 0: Petstore, Position 1: simple_api (empty --auth-header = no auth), Position 2: GitHub
fastmcp-openapi \
--spec https://petstore.swagger.io/v2/swagger.json \
--spec examples/simple_api.json \
--spec https://api.github.com/openapi.yaml \
--base-url https://petstore.swagger.io/v2 \
--base-url http://localhost:8080 \
--base-url https://api.github.com \
--auth-header "Bearer petstore-key" \
--auth-header "" \
--auth-header "Bearer github-key"
Benefits
- Unified Interface: Access multiple APIs through one MCP server
- Individual Configuration: Each API can have its own base URL and auth
- Tool Namespacing: Tools are automatically prefixed to avoid conflicts
- Mixed Sources: Combine remote APIs, local services, and files
Tool Naming Convention
When multiple APIs are loaded, tools are automatically prefixed:
- Single API: operationId → getPetById
- Multiple APIs: api_name_operationId → petstore_getPetById, github_getUser
Development
git clone <repository>
cd fastmcp-openapi
pip install -e ".[dev]"
pytest
License
MIT License