MCP Adapter
Automatically converts OpenAPI specifications into Model Context Protocol applications, enabling HTTP APIs to be managed as MCP services. It features a dynamic architecture that monitors file systems or Kubernetes ConfigMaps to update MCP tools in real time.
Introduction
MCP Adapter is a tool designed to automatically convert OpenAPI specifications (v2/v3) into MCP (Model Context Protocol) applications. It enables seamless transformation of HTTP APIs into MCP APIs, allowing legacy or new HTTP services to be exposed and managed via the MCP protocol with minimal manual intervention.
Thoughts and Architecture
The project is built around an event-driven, decoupled architecture. The core workflow is:
- Resource Watcher: Monitors file changes or Kubernetes ConfigMaps for OpenAPI specs.
- OpenAPI Loader: Parses and validates OpenAPI documents, extracts API routes, and prepares them for MCP conversion (see the loader/server sketch after this list).
- MCP Server: Dynamically creates and manages MCP servers and tools based on the parsed OpenAPI specs.
- HTTP Server: Provides health checks and introspection endpoints.
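A minimal sketch of how the loader and server stages could fit together: prance resolves the OpenAPI document, and fastmcp builds an MCP server whose tools proxy the spec's operations through an httpx client. The spec path and base URL are placeholders, and `FastMCP.from_openapi` is used as documented in recent fastmcp releases; adjust to your pinned version.

```python
import httpx
from fastmcp import FastMCP
from prance import ResolvingParser

# Parse the spec and resolve $ref pointers (path is a placeholder);
# prance validates the document via its backend while parsing.
parser = ResolvingParser("specs/petstore.yaml")
spec = parser.specification  # plain dict with all references resolved

# Async HTTP client that the generated MCP tools will call through.
client = httpx.AsyncClient(base_url="https://petstore.example.com/api")

# Build an MCP server whose tools mirror the spec's operations.
mcp = FastMCP.from_openapi(openapi_spec=spec, client=client, name="petstore")

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default
```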
Each stage communicates via asynchronous channels (queues), ensuring loose coupling and scalability. The system supports both file-based and Kubernetes-based resource watching, making it flexible for different deployment scenarios.
High-level flow:
Resource Watcher (File or Kubernetes)
  │
  └─(channel 1: watcher → openapi)─▶ OpenAPILoader (parses OpenAPI spec, builds HTTP client definitions)
        │
        └─(channel 2: openapi → server)─▶ MCPServer (manages MCP instances and tools)
              │
              └─▶ MCPInstance / FastMCP (runs the actual MCP protocol server)
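The two channels above can be expressed as anyio memory object streams. The sketch below shows the wiring only; the stage coroutines (resource_watcher, openapi_loader, mcp_server) are illustrative stand-ins for the actual modules, not their real implementations.

```python
import anyio

async def resource_watcher(send) -> None:
    # Illustrative: emit one changed spec path, then close channel 1.
    async with send:
        await send.send("./specs/petstore.yaml")

async def openapi_loader(recv, send) -> None:
    # Illustrative: turn each spec path into a parsed "document" on channel 2.
    async with recv, send:
        async for path in recv:
            await send.send({"path": path, "routes": []})

async def mcp_server(recv) -> None:
    # Illustrative: rebuild MCP tools whenever a new document arrives.
    async with recv:
        async for doc in recv:
            print("would (re)register MCP tools for", doc["path"])

async def main() -> None:
    # channel 1: watcher → openapi loader; channel 2: openapi loader → mcp server
    watch_send, watch_recv = anyio.create_memory_object_stream(max_buffer_size=16)
    spec_send, spec_recv = anyio.create_memory_object_stream(max_buffer_size=16)

    async with anyio.create_task_group() as tg:
        # Each stage owns only its channel ends, so the stages stay decoupled.
        tg.start_soon(resource_watcher, watch_send)
        tg.start_soon(openapi_loader, watch_recv, spec_send)
        tg.start_soon(mcp_server, spec_recv)

if __name__ == "__main__":
    anyio.run(main)
```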
Key Design Points:
- Clear module boundaries: Each responsibility (watching, parsing, serving) is isolated for maintainability.
- Event-driven with channels: Asynchronous message passing decouples components.
- Extensible: Easily add new resource sources or protocols.
- Graceful shutdown: Listens for SIGINT/SIGTERM and cleans up all tasks (a sketch of this follows the list).
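A minimal sketch of the shutdown pattern using anyio's signal receiver; `serve_forever` is a placeholder for the real HTTP/MCP server tasks, not the adapter's actual code.

```python
import signal

import anyio

async def serve_forever() -> None:
    # Placeholder for the real HTTP and MCP server tasks.
    while True:
        await anyio.sleep(1)

async def main() -> None:
    async with anyio.create_task_group() as tg:
        tg.start_soon(serve_forever)
        # Wait for SIGINT/SIGTERM, then cancel every task in the group.
        with anyio.open_signal_receiver(signal.SIGINT, signal.SIGTERM) as signals:
            async for signum in signals:
                print(f"received signal {signum}, shutting down")
                tg.cancel_scope.cancel()
                break

if __name__ == "__main__":
    anyio.run(main)
```

Cancelling the task group's scope propagates cancellation to every stage, which is what makes the channel-based design easy to tear down cleanly.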
Dependencies
- Python >= 3.14 (recommend adjusting to 3.11/3.12 for broader compatibility)
- anyio — async concurrency
- argparse — CLI parsing
- fastmcp — MCP server framework
- httpx — async HTTP client
- kopf — Kubernetes operator framework
- openapi-spec-validator — OpenAPI validation
- prance — OpenAPI parsing and conversion
TODO
- Refactor CLI parsing: Move CLI argument parsing out of module top-level to avoid side effects on import.
- Unify async runtime: Standardize on either anyio or asyncio for all concurrency primitives.
- Graceful server lifecycle: Ensure HTTP and MCP servers can be started and stopped cleanly.
- Define strict message types: Use dataclasses for channel messages to improve type safety.
- Complete OpenAPI loader logic: Finalize route extraction, diffing, and error handling.
- Enhance error handling and retries: Add robust exception management and retry strategies.
- Lower Python version requirement: Update pyproject.toml for compatibility with mainstream Python versions.
- Add unit tests and CI: Cover core logic with automated tests and continuous integration.
- Improve configuration management: Consider using Pydantic's BaseSettings for unified config via env/CLI/file (see the sketch below).
- Add structured logging and metrics: Support JSON logs and Prometheus metrics for observability.
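One way the configuration item could look, using the pydantic-settings package (where BaseSettings lives in Pydantic v2). The field names and the MCP_ADAPTER_ prefix are hypothetical, not the adapter's actual options.

```python
from pydantic_settings import BaseSettings, SettingsConfigDict

class AdapterSettings(BaseSettings):
    """Unified configuration, overridable via MCP_ADAPTER_* environment variables."""

    model_config = SettingsConfigDict(env_prefix="MCP_ADAPTER_")

    spec_path: str = "./specs"   # directory or ConfigMap mount to watch (hypothetical)
    watch_mode: str = "file"     # "file" or "kubernetes" (hypothetical)
    http_port: int = 8080        # health/introspection endpoint port (hypothetical)
    log_level: str = "INFO"

# e.g. MCP_ADAPTER_HTTP_PORT=9090 overrides the default at startup
settings = AdapterSettings()
```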
For more details, see the source code and sample OpenAPI specs.