OpenAPI to MCP
Standalone proxy that turns any OpenAPI/Swagger-described HTTP API into an MCP (Model Context Protocol) server. It loads the spec at startup, filters operations by include/exclude, and registers one MCP tool per API operation. Tool calls are executed as HTTP requests to the backend API.
Useful when you already have (or want) a REST API with an OpenAPI/Swagger spec: the same spec drives both human-readable API docs and MCP tools for AI clients.
How it works
```mermaid
flowchart LR
  subgraph startup["Startup"]
    A[OpenAPI spec<br/>URL or file] --> B[Load and filter<br/>include or exclude]
    B --> C[N MCP tools<br/>one per operation]
  end
  subgraph runtime["Runtime"]
    D[MCP client] <-->|Streamable HTTP<br/>POST/GET /mcp| E[openapi-to-mcp]
    E <-->|HTTP| F[Backend API]
  end
  C -.->|registered in| E
```
- Load the OpenAPI spec from `MCP_OPENAPI_SPEC_URL` (preferred) or `MCP_OPENAPI_SPEC_FILE`.
- Collect operations (method + path). Filter: if `MCP_INCLUDE_ENDPOINTS` is set, keep only those; otherwise drop any listed in `MCP_EXCLUDE_ENDPOINTS`. Include has priority over exclude.
- For each operation create an MCP tool: name = `MCP_TOOL_PREFIX` + path segment (e.g. `api_` + `messages` = `api_messages`). If the same path segment is used by more than one method (e.g. GET and PUT on `/pet/{id}`), the tool name is made unique by appending the method (e.g. `pet_get`, `pet_put`), as sketched below. The input schema is built from the parameters and requestBody (Zod); the handler performs the HTTP call against `MCP_API_BASE_URL`.
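A minimal sketch of the naming rule, assuming "path segment" means the first non-parameter segment of the path; the function and type names here are illustrative, not the project's actual code:

```typescript
// Illustrative sketch of the tool-naming rule, not the project's actual code.
// Assumes "path segment" means the first non-parameter segment of the path.
interface Operation {
  method: string; // e.g. "get"
  path: string;   // e.g. "/pet/{id}"
}

function segmentOf(path: string): string {
  return path.split("/").filter((s) => s && !s.startsWith("{"))[0] ?? "root";
}

function toolNames(ops: Operation[], prefix: string): Map<Operation, string> {
  // Count how many operations share each path segment.
  const counts = new Map<string, number>();
  for (const op of ops) {
    const seg = segmentOf(op.path);
    counts.set(seg, (counts.get(seg) ?? 0) + 1);
  }
  // Append the method only when a segment is shared by several operations.
  const names = new Map<Operation, string>();
  for (const op of ops) {
    const seg = segmentOf(op.path);
    const base = `${prefix}${seg}`;
    names.set(op, (counts.get(seg) ?? 0) > 1 ? `${base}_${op.method}` : base);
  }
  return names;
}

// Example: /messages stays "api_messages"; GET/PUT on /pet/{id} get method suffixes.
const ops: Operation[] = [
  { method: "get", path: "/messages" },
  { method: "get", path: "/pet/{id}" },
  { method: "put", path: "/pet/{id}" },
];
console.log([...toolNames(ops, "api_").values()]); // ["api_messages", "api_pet_get", "api_pet_put"]
```

Appending the method only on collisions keeps most tool names short while still guaranteeing uniqueness.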
Transport: Streamable HTTP. Endpoint: POST /mcp and GET /mcp.
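For reference, a Streamable HTTP session typically starts with a JSON-RPC `initialize` request POSTed to `/mcp`. A hedged curl sketch (the exact `protocolVersion` and response framing depend on the client and server versions):

```bash
curl -s http://localhost:3100/mcp \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  -d '{
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
      "protocolVersion": "2025-03-26",
      "capabilities": {},
      "clientInfo": { "name": "curl-demo", "version": "0.0.0" }
    }
  }'
```

In practice an MCP client library handles this handshake for you.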
Environment variables (MCP_ prefix)
| Variable | Description | Default |
|---|---|---|
| `MCP_API_BASE_URL` | Base URL for API requests | `http://127.0.0.1:3000` |
| `MCP_OPENAPI_SPEC_URL` | URL of the OpenAPI spec (e.g. `http://api:3000/openapi.json`). Takes precedence over the file. | - |
| `MCP_OPENAPI_SPEC_FILE` | Path to an OpenAPI JSON file (used if the URL is not set) | - |
| `MCP_INCLUDE_ENDPOINTS` | Comma-separated `method:path` entries (e.g. `get:/messages,get:/channels`). If set, only these become tools. | - |
| `MCP_EXCLUDE_ENDPOINTS` | Comma-separated `method:path` entries to exclude. Ignored for endpoints listed in include. | - |
| `MCP_TOOL_PREFIX` | Prefix for tool names (e.g. `api_` -> `api_messages`, `api_channels`) | (empty) |
| `MCP_SERVER_NAME` | Server name reported to MCP clients | `openapi-to-mcp` |
| `MCP_PORT` | Port for the Streamable HTTP server | `3100` |
| `MCP_HOST` | Bind host | `0.0.0.0` |
At least one of MCP_OPENAPI_SPEC_URL or MCP_OPENAPI_SPEC_FILE must be set.
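For example, a minimal `.env` for an API running locally might look like this (values are illustrative):

```ini
# Spec source and backend API (at least one spec source is required)
MCP_OPENAPI_SPEC_URL=http://127.0.0.1:3000/openapi.json
MCP_API_BASE_URL=http://127.0.0.1:3000

# Optional tuning
MCP_TOOL_PREFIX=api_
MCP_PORT=3100
```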
Run with npm (local)
- Copy `.env.example` to `.env` and set at least the OpenAPI spec source and the API base URL:
  ```bash
  cp .env.example .env
  # Edit .env: MCP_OPENAPI_SPEC_URL or MCP_OPENAPI_SPEC_FILE, MCP_API_BASE_URL
  ```
- Install, build, and start:
  ```bash
  npm ci
  npm run build
  npm run start
  ```
- The server listens on `http://<MCP_HOST>:<MCP_PORT>` (default `http://0.0.0.0:3100`). Connect MCP clients to POST/GET `http://localhost:3100/mcp` (Streamable HTTP); a client sketch follows the note below.
Ensure the backend API is reachable at MCP_API_BASE_URL and that the OpenAPI spec URL (or file) returns a valid OpenAPI 3.x JSON.
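To smoke-test the endpoint from code, something along these lines should work, assuming the official TypeScript SDK (`@modelcontextprotocol/sdk`); import paths and minor API details may differ between SDK versions:

```typescript
// Minimal sketch: connect to the proxy and list the generated tools.
// Assumes @modelcontextprotocol/sdk is installed; import paths may vary by SDK version.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

async function main() {
  const client = new Client({ name: "smoke-test", version: "0.0.1" });
  const transport = new StreamableHTTPClientTransport(new URL("http://localhost:3100/mcp"));
  await client.connect(transport);

  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name)); // e.g. [ "api_messages", "api_channels", ... ]

  await client.close();
}

main().catch(console.error);
```

The tool names printed should match the operations kept after include/exclude filtering.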
Run with Docker
Image on Docker Hub: evilfreelancer/openapi-to-mcp. Use tag latest or a version tag (e.g. v1.0.0).
- Pull and run with env vars (example: spec from a URL, API on the host):
  ```bash
  docker run --rm -p 3100:3100 \
    -e MCP_OPENAPI_SPEC_URL=http://host.docker.internal:3000/openapi.json \
    -e MCP_API_BASE_URL=http://host.docker.internal:3000 \
    evilfreelancer/openapi-to-mcp:latest
  ```
  On Linux you may need `--add-host=host.docker.internal:host-gateway` or the host network.
- Alternatively, pass a file path and mount the spec:
  ```bash
  docker run --rm -p 3100:3100 \
    -v $(pwd)/openapi.json:/app/openapi.json:ro \
    -e MCP_OPENAPI_SPEC_FILE=/app/openapi.json \
    -e MCP_API_BASE_URL=http://host.docker.internal:3000 \
    evilfreelancer/openapi-to-mcp:latest
  ```
- To build the image locally instead: `docker build -t openapi-to-mcp .` and use `openapi-to-mcp` as the image name in the commands above.
Run with Docker Compose
A minimal docker-compose.yaml is included so you can run the MCP server and optionally point it at an existing API. It uses the image from Docker Hub (evilfreelancer/openapi-to-mcp).
- Copy `.env.example` to `.env` and set:
  - `MCP_OPENAPI_SPEC_URL` (e.g. your API's `/openapi.json` URL)
  - `MCP_API_BASE_URL` (e.g. `http://api:3000` if the API runs in another container)
- From the project root:
  ```bash
  docker compose up -d
  ```
- The MCP server will be available at `http://localhost:3100/mcp` (Streamable HTTP).
To use a local OpenAPI file instead of a URL, set MCP_OPENAPI_SPEC_FILE and mount the file into the container (see docker-compose.yaml comments if present).
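As a rough sketch (service name, paths, and values here are examples, not necessarily what the bundled docker-compose.yaml uses), the file-based variant could look like:

```yaml
# Example only: mount a local spec and point the proxy at it.
services:
  openapi-to-mcp:
    image: evilfreelancer/openapi-to-mcp:latest
    ports:
      - "3100:3100"
    environment:
      MCP_OPENAPI_SPEC_FILE: /app/openapi.json
      MCP_API_BASE_URL: http://api:3000
    volumes:
      - ./openapi.json:/app/openapi.json:ro
```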
Tests
```bash
npm test
```
Tests cover: config (env vars, include/exclude, defaults), the OpenAPI loader (URL and file, URL taking precedence over file, error when both are unset), and openapi-to-tools (filtering, prefix, handler calling the API with success and error responses). HTTP is mocked with axios-mock-adapter.
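For orientation, axios-mock-adapter stubs HTTP at the axios level; a generic illustration (not one of this repo's test cases) looks like:

```typescript
// Generic illustration of axios-mock-adapter usage, not a test from this repo.
import axios from "axios";
import MockAdapter from "axios-mock-adapter";

async function demo() {
  const mock = new MockAdapter(axios);
  // Succeeding endpoint: a tool handler hitting this URL gets the canned payload.
  mock.onGet("http://127.0.0.1:3000/messages").reply(200, [{ id: 1, text: "hi" }]);
  // Failing endpoint: lets tests assert the error path.
  mock.onGet("http://127.0.0.1:3000/broken").reply(500);

  const ok = await axios.get("http://127.0.0.1:3000/messages");
  console.log(ok.status, ok.data); // 200 [ { id: 1, text: 'hi' } ]

  mock.restore();
}

demo().catch(console.error);
```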
Dockerfile
The project includes a Dockerfile (Node 20 Alpine): it installs dependencies, builds the TypeScript sources, prunes to production dependencies, and runs `node dist/index.js`. No dev dependencies or tests end up in the image. Pre-built images are published to Docker Hub. To build locally:
```bash
docker build -t openapi-to-mcp .
```
CI - Docker image on Docker Hub
A GitHub Actions workflow (.github/workflows/docker-publish.yml) runs tests, then builds the image and pushes it to Docker Hub.
- Triggers: manually (Actions → "Docker build and push" → Run workflow) or on push of any git tag (see the example below).
- Version: on a tag push the image tag equals the git tag (e.g. `v1.0.0`); on a manual run you can set a version (default `latest`).
- Main only: when triggered by a tag, the workflow checks that the tag points to a commit on `main`; otherwise the run fails.
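For example, publishing a `v1.0.0` image via the tag trigger boils down to standard git commands (the tag must point at a commit on `main`):

```bash
git tag v1.0.0
git push origin v1.0.0
```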
Required repository secrets (Settings → Secrets and variables → Actions):
| Secret | Description |
|---|---|
| `DOCKERHUB_USERNAME` | Docker Hub username (image will be `DOCKERHUB_USERNAME/openapi-to-mcp`) |
| `DOCKERHUB_TOKEN` | Docker Hub access token (recommended) or password |
Similar projects
- mcp-openapi-proxy (Python) – MCP server that exposes REST APIs from OpenAPI specs as MCP tools. Low-level mode (one tool per endpoint) or FastMCP mode. Auth and endpoint filtering. Install: `uvx mcp-openapi-proxy`.
- openapi-mcp-proxy (TypeScript) – CLI that turns an OpenAPI service into an MCP server; middleware between OpenAPI and MCP clients.
- openapi-mcp-generator (TypeScript) – Generates a full MCP server project from OpenAPI 3.0+ (stdio, SSE, Streamable HTTP), with Zod validation and auth. Install: `npm install -g openapi-mcp-generator`.
- FastMCP + OpenAPI (Python) – OpenAPI integration for FastMCP: auth, route mapping, parameter handling.
- openapi-mcp-codegen – Code generator from OpenAPI to MCP server (Apache 2.0).
- Swagger MCP (Vizioz) – AI-driven MCP server generation from Swagger/OpenAPI; stores specs locally.
- liblab – Cloud service: generate and deploy MCP server from OpenAPI or Postman collection.