
@reapi/mcp-openapi
A Model Context Protocol (MCP) server that loads and serves multiple OpenAPI specifications to enable LLM-powered IDE integrations. This server acts as a bridge between your OpenAPI specifications and LLM-powered development tools like Cursor and other code editors.
Features
- Loads multiple OpenAPI specifications from a directory
- Exposes API operations and schemas through MCP protocol
- Enables LLMs to understand and work with your APIs directly in your IDE
- Supports dereferenced schemas for complete API context
- Maintains a catalog of all available APIs
Powered by ReAPI
This open-source MCP server is sponsored by ReAPI, a next-generation API platform that simplifies API design and testing. While this server provides local OpenAPI integration for development, ReAPI offers two powerful modules:
🎨 API CMS
- Design APIs using an intuitive no-code editor
- Generate and publish OpenAPI specifications automatically
- Collaborate with team members in real-time
- Version control and change management
🧪 API Testing
- The most developer-friendly no-code API testing solution
- Create and manage test cases with an intuitive interface
- Powerful assertion and validation capabilities
- Serverless cloud test executor
- Perfect for both QA teams and developers
- CI/CD integration ready
Try ReAPI for free at reapi.com and experience the future of API development.
Cursor Configuration
To integrate the MCP OpenAPI server with Cursor IDE, you have two options for configuration locations:
Option 1: Project-specific Configuration (Recommended)
Create a `.cursor/mcp.json` file in your project directory. This option is recommended because it allows you to maintain different sets of specs for different projects:
```json
{
  "mcpServers": {
    "@reapi/mcp-openapi": {
      "command": "npx",
      "args": ["-y", "@reapi/mcp-openapi@latest", "--dir", "./specs"],
      "env": {}
    }
  }
}
```
Tip: Using a relative path like `./specs` makes the configuration portable and easier to share across team members.

Note: We recommend using the `@latest` tag, as we frequently update the server with new features and improvements.

Important: Project-specific configuration helps manage LLM context limits. When all specifications are placed in a single folder, the combined metadata could exceed the LLM's context window, leading to errors. Organizing specs by project keeps the context size manageable.
Option 2: Global Configuration
Create or edit `~/.cursor/mcp.json` in your home directory to make the server available across all projects:
```json
{
  "mcpServers": {
    "@reapi/mcp-openapi": {
      "command": "npx",
      "args": ["-y", "@reapi/mcp-openapi@latest", "--dir", "/path/to/your/specs"],
      "env": {}
    }
  }
}
```
Enable in Cursor Settings
After adding the configuration:
- Open Cursor IDE
- Go to Settings > Cursor Settings > MCP
- Enable the @reapi/mcp-openapi server
- Click the refresh icon next to the server to apply changes
Note: By default, Cursor requires confirmation for each MCP tool execution. If you want to allow automatic execution without confirmation, you can enable Yolo mode in Cursor settings.
The server is now ready to use. When you add new OpenAPI specifications to your directory, you can refresh the catalog by:
- Opening Cursor's chat panel
- Typing one of these prompts:
  - "Please refresh the API catalog"
  - "Reload the OpenAPI specifications"
OpenAPI Specification Requirements
- Place your OpenAPI 3.x specifications in the target directory:
  - Supports both JSON and YAML formats
  - Files should have `.json`, `.yaml`, or `.yml` extensions
  - The scanner will automatically discover and process all specification files
- Specification ID Configuration:
  - By default, the filename (without extension) is used as the specification ID
  - To specify a custom ID, add `x-spec-id` in the OpenAPI info object:

```yaml
openapi: 3.0.0
info:
  title: My API
  version: 1.0.0
  x-spec-id: my-custom-api-id # Custom specification ID
```

Important: Setting a custom `x-spec-id` is crucial when working with multiple specifications that have:

- Similar or identical endpoint paths
- Same schema names
- Overlapping operation IDs

The spec ID helps distinguish between these similar resources and prevents naming conflicts. For example:

```yaml
# user-service.yaml
info:
  x-spec-id: user-service
paths:
  /users:
    get: ...

# admin-service.yaml
info:
  x-spec-id: admin-service
paths:
  /users:
    get: ...
```

Now you can reference these endpoints specifically as `user-service/users` and `admin-service/users`.
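For example, a specs directory holding these two services might look like the following (a hypothetical layout; the file names are your choice, and the filename only becomes the spec ID when `x-spec-id` is not set):

```
specs/
├── user-service.yaml
├── admin-service.yaml
└── petstore.json
```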
How It Works
- The server scans the specified directory for OpenAPI specification files
- It processes and dereferences the specifications for complete context
- Creates and maintains a catalog of all API operations and schemas
- Exposes this information through the MCP protocol
- IDE integrations can then use this information to:
- Provide API context to LLMs
- Enable intelligent code completion
- Assist in API integration
- Generate API-aware code snippets
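Although Cursor launches the server for you, you can run the same command from the configuration above by hand as a quick sanity check that it starts and can read your specs directory (the server communicates over stdio, so it is normally driven by the IDE rather than used interactively):

```bash
# Same command Cursor runs per the config above; --dir points at your local specs folder.
npx -y @reapi/mcp-openapi@latest --dir ./specs
```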
Tools
- `refresh-api-catalog`
  - Refresh the API catalog
  - Returns: Success message when the catalog is refreshed
- `get-api-catalog`
  - Get the API catalog; the catalog contains metadata about all OpenAPI specifications, their operations, and schemas
  - Returns: Complete API catalog with all specifications, operations, and schemas
- `search-api-operations`
  - Search for operations across specifications
  - Inputs:
    - `query` (string): Search query
    - `specId` (optional string): Specific API specification ID to search within
  - Returns: Matching operations from the API catalog
- `search-api-schemas`
  - Search for schemas across specifications
  - Inputs:
    - `query` (string): Search query
    - `specId` (optional string): Specific API specification ID to search within
  - Returns: Matching schemas from the API catalog
- `load-api-operation-by-operationId`
  - Load an operation by operationId
  - Inputs:
    - `specId` (string): API specification ID
    - `operationId` (string): Operation ID to load
  - Returns: Complete operation details
- `load-api-operation-by-path-and-method`
  - Load an operation by path and method
  - Inputs:
    - `specId` (string): API specification ID
    - `path` (string): API endpoint path
    - `method` (string): HTTP method
  - Returns: Complete operation details
- `load-api-schema-by-schemaName`
  - Load a schema by schemaName
  - Inputs:
    - `specId` (string): API specification ID
    - `schemaName` (string): Name of the schema to load
  - Returns: Complete schema details
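Under the hood, an MCP client such as Cursor invokes these tools through JSON-RPC `tools/call` requests. You never write these by hand, but as an illustration, a `search-api-operations` call (with hypothetical argument values) looks roughly like this:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search-api-operations",
    "arguments": {
      "query": "users",
      "specId": "user-service"
    }
  }
}
```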
Roadmap
- Semantic Search
  - Enable natural language queries for API operations and schemas
  - Improve search accuracy with semantic understanding
- Remote Specs Sync
  - Support syncing OpenAPI specifications from remote sources
- Code Templates
  - Expose code templates through MCP protocol
  - Provide reference patterns for LLM code generation
- Community Contributions
  - Submit feature requests and bug reports
  - Contribute to improve the server
Example Prompts in Cursor
Here are some example prompts you can use in Cursor IDE to interact with your APIs:
- Explore Available APIs
  - "Show me all available APIs in the catalog with their operations"
  - "List all API specifications and their endpoints"
- API Operation Details
  - "Show me the details of the create pet API endpoint"
  - "What are the required parameters for creating a new pet?"
  - "Explain the response schema for the pet creation endpoint"
- Schema and Mock Data
  - "Generate mock data for the Pet schema"
  - "Create a valid request payload for the create pet endpoint"
  - "Show me examples of valid pet objects based on the schema"
- Code Generation
  - "Generate an Axios client for the create pet API"
  - "Create a TypeScript interface for the Pet schema"
  - "Write a React hook that calls the create pet endpoint"
- API Integration Assistance
  - "Help me implement error handling for the pet API endpoints"
  - "Generate unit tests for the pet API client"
  - "Create a service class that encapsulates all pet-related API calls"
- Documentation and Usage
  - "Show me example usage of the pet API with curl"
  - "Generate JSDoc comments for the pet API client methods"
  - "Create a README section explaining the pet API integration"
- Validation and Types
  - "Generate Zod validation schema for the Pet model"
  - "Create TypeScript types for all pet-related API responses"
  - "Help me implement request payload validation for the pet endpoints"
- API Search and Discovery
  - "Find all endpoints related to pet management"
  - "Show me all APIs that accept file uploads"
  - "List all endpoints that return paginated responses"
These prompts demonstrate how to leverage the MCP server's capabilities for API development. Feel free to adapt them to your specific needs or combine them for more complex tasks.
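As a concrete illustration of the Code Generation prompts above, asking for "an Axios client for the create pet API" might yield something along these lines. This assumes a typical petstore-style spec with a `Pet` schema and a `POST /pets` operation, neither of which ships with this server; it is a sketch of possible LLM output, not part of the project:

```typescript
import axios from "axios";

// Hypothetical Pet model derived from a petstore-style "Pet" schema.
export interface Pet {
  id?: number;
  name: string;
  tag?: string;
}

// Hypothetical client for a "create pet" operation (POST /pets).
export async function createPet(baseUrl: string, pet: Pet): Promise<Pet> {
  const response = await axios.post<Pet>(`${baseUrl}/pets`, pet);
  return response.data;
}
```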
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.