Discover Awesome MCP Servers

Extend your agent with 29,072 capabilities via MCP servers.

hwpConverMdMCP

An MCP server that enables LLMs to convert HWP and HWPX documents into Markdown for analysis and processing. It supports document conversion via local file paths or Base64 content across various MCP-compatible clients.

a2db

Most database MCP servers make you run one query at a time, repeat connection details on every call, and return results double-encoded inside JSON strings. a2db fixes all of that:

* Pre-configured connections — define databases in `.mcp.json` with `--register`, and the agent can query immediately
* Batch queries — run multiple named queries in a single tool call

Apple MCP Server

Enables interaction with Apple apps like Messages, Notes, and Contacts through the MCP protocol to send messages, search, and open app content using natural language.

GitLab MCP Server

Connects AI assistants to GitLab, enabling natural language queries for merge requests, code reviews, pipeline tests, job logs, and commit discussions directly from chat.

Greenhouse MCP Server by CData

This project builds a read-only MCP server. For full read, write, update, delete, and action capabilities and a simplified setup, check out our free CData MCP Server for Greenhouse (beta): https://www.cdata.com/download/download.aspx?sku=PGZK-V&type=beta

MCP Creator Growth

An interactive learning assistant that helps developers understand AI-generated code changes through quizzes and blocking learning sessions. It tracks and searches debugging experiences using RAG to ensure users build long-term technical understanding rather than just copy-pasting solutions.

Pangea MCP Server

A Model Context Protocol server that provides Claude with access to Pangea's security services, including AI Guard, Domain Intel, Embargo checks, IP Intelligence, Redaction, Secure Audit Log, URL Intelligence, and Vault services.

Levelang MCP Server

Integrates the levelang.app translation API with AI assistants to provide translations constrained to specific learner proficiency levels. It supports multiple languages and allows users to control translation styles and moods while dynamically discovering available language configurations.

China City Weather Query MCP Service (中国城市天气查询 MCP 服务)

Flexible Key-Value Extracting MCP Server

Extracts structured key-value pairs from arbitrary, noisy, or unstructured text using LLMs and provides output in multiple formats (JSON, YAML, TOML) with type safety.

MCP Documentation Server

A server that provides organized documentation content for various applications using the Model Context Protocol, enabling AI assistants to access quickstart guides and code examples.

mcp-mac

Enables AI agents to interact with macOS applications (Finder, Mail, Contacts, Reminders, Notes, Calendar, TextEdit) using AppleScript. Allows AI assistants to perform tasks like searching contacts, managing files, checking email, and creating reminders through natural language.

Weekly Weather

Weather forecast server which returns 7 days of detailed weather anywhere in the world, using the OpenWeatherMap One Call API 3.0.
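For context, a One Call API 3.0 request takes a latitude/longitude pair plus an API key. A minimal sketch of the kind of request URL such a server presumably constructs (the coordinates and key here are placeholders, not values from this project):

```python
from urllib.parse import urlencode

# Sketch of an OpenWeatherMap One Call API 3.0 request URL.
# 'YOUR_API_KEY' is a placeholder; the coordinates are illustrative.
BASE = "https://api.openweathermap.org/data/3.0/onecall"
params = {
    "lat": 51.5074,          # latitude of the location
    "lon": -0.1278,          # longitude of the location
    "exclude": "minutely",   # skip response sections you don't need
    "units": "metric",
    "appid": "YOUR_API_KEY",
}
url = f"{BASE}?{urlencode(params)}"
print(url)
```

The `daily` section of the One Call response is what a 7-day forecast like this would draw on.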

OpenCode MCP Tool

Enables AI assistants to interact with multiple AI models through the OpenCode CLI with a unified interface. Supports plan mode for structured analysis, flexible model selection, and natural language queries with file references.

steampipe-mcp

npm-mcp

MCP server for npm package management — publish, install, audit, search, security & dependency health

Cisco MCP Pods Server

Enables AI agents to interact with Cisco API Gateway pod endpoints for complete pod management, including CRUD operations, credential updates, and configuration management through natural language. Supports multiple deployment modes, including local Claude Desktop integration and cloud deployment for remote AI agents like Webex Connect.

Slack MCP Server

A Model Context Protocol server that enables LLMs to interact with Slack workspaces through OAuth 2.0 authentication. It provides tools for listing channels and posting messages while supporting secure token persistence and dynamic client registration.

Customs Big Data MCP Server

Provides comprehensive import/export trade data queries including export trends, product category statistics, order geographic distribution, and overseas certification information to help users understand enterprises' international trade situations.

Reddit Buddy MCP

Enables AI assistants to browse Reddit, search posts, analyze user activity, and fetch comments without requiring API keys. Features smart caching, clean data responses, and optional authentication for higher rate limits.

Emcee

I understand you're looking for a way to automatically generate a **Mock Control Plane (MCP) server** from an OpenAPI (formerly Swagger) specification. This is a great approach for testing, development, and prototyping. Here's a breakdown of how to achieve it, along with code examples and explanations. I'll focus on Python and the `connexion` library, a popular and effective choice for this task.

**Core Concepts**

* **OpenAPI Specification:** The contract that defines your API. It describes the endpoints, request/response formats, data types, and more. You'll need a valid OpenAPI YAML or JSON file.
* **Mock Control Plane (MCP) Server:** A lightweight server that simulates the behavior of your real API. It returns pre-defined responses based on the OpenAPI specification, allowing you to test clients and integrations without needing the actual backend.
* **`connexion` Library (Python):** A framework that simplifies building REST APIs from OpenAPI specifications. It handles request routing, validation, and serialization/deserialization based on the OpenAPI definition.

**Steps to Generate an MCP Server**

1. **Install `connexion`:**

   ```bash
   pip install connexion
   ```

2. **Create a Python file (e.g., `mcp_server.py`):**

   ```python
   import connexion
   import functools
   import logging
   import os

   # Configure logging (optional, but recommended)
   logging.basicConfig(level=logging.INFO)

   # Path to your OpenAPI specification file
   OPENAPI_SPEC_PATH = 'openapi.yaml'  # Replace with your file path


   def handle_request(operationId, **kwargs):
       """
       Called when an API endpoint is hit. Returns a mock response
       based on the OpenAPI spec.
       """
       logging.info(f"Handling request for operationId: {operationId}")
       logging.info(f"Request parameters: {kwargs}")

       # Implement your mock logic here. This is the KEY part:
       # inspect 'operationId' and 'kwargs' to decide what to return.

       # Example: return a default response for a specific operationId
       if operationId == 'getUser':
           user_id = kwargs.get('user_id')
           if user_id:
               return {'id': user_id, 'name': 'Mock User',
                       'email': f'user{user_id}@example.com'}
           return {'message': 'User ID required'}, 400  # Bad Request

       # Example: return a different response based on request parameters
       if operationId == 'createUser':
           request_body = kwargs.get('body')  # Access the request body
           if request_body and 'name' in request_body and 'email' in request_body:
               return {'id': 123, 'name': request_body['name'],
                       'email': request_body['email']}, 201  # Created
           return {'message': 'Invalid request body'}, 400

       # Default response if no specific logic matches
       return {'message': 'Mock response for ' + operationId}, 200


   def main():
       # Create a Connexion app
       app = connexion.App(__name__, specification_dir='./')

       # Read the OpenAPI specification and register a default resolver
       # so that *all* operationIds route to handle_request(). The
       # resolver must be passed to add_api; partial() binds the
       # operationId so the handler knows which endpoint was hit.
       app.add_api(
           OPENAPI_SPEC_PATH,
           pythonic_params=True,
           resolver=connexion.Resolver(
               lambda operationId: functools.partial(handle_request, operationId)
           ),
       )

       # Start the server
       port = int(os.environ.get("PORT", 8080))  # Use PORT env var if set
       app.run(port=port)


   if __name__ == '__main__':
       main()
   ```
3. **Create an OpenAPI specification file (e.g., `openapi.yaml`):**

   ```yaml
   openapi: 3.0.0
   info:
     title: Mock API
     version: 1.0.0
   paths:
     /users/{user_id}:
       get:
         summary: Get a user by ID
         operationId: getUser
         parameters:
           - name: user_id
             in: path
             required: true
             schema:
               type: integer
         responses:
           '200':
             description: Successful operation
             content:
               application/json:
                 schema:
                   type: object
                   properties:
                     id:
                       type: integer
                     name:
                       type: string
                     email:
                       type: string
           '400':
             description: Bad Request
             content:
               application/json:
                 schema:
                   type: object
                   properties:
                     message:
                       type: string
     /users:
       post:
         summary: Create a new user
         operationId: createUser
         requestBody:
           required: true
           content:
             application/json:
               schema:
                 type: object
                 properties:
                   name:
                     type: string
                   email:
                     type: string
                 required:
                   - name
                   - email
         responses:
           '201':
             description: User created successfully
             content:
               application/json:
                 schema:
                   type: object
                   properties:
                     id:
                       type: integer
                     name:
                       type: string
                     email:
                       type: string
           '400':
             description: Invalid request
             content:
               application/json:
                 schema:
                   type: object
                   properties:
                     message:
                       type: string
   ```

4. **Run the server:**

   ```bash
   python mcp_server.py
   ```

**Explanation**

* **`connexion.App`:** Creates the WSGI application.
* **`app.add_api(OPENAPI_SPEC_PATH)`:** Parses the OpenAPI specification and sets up the routes. `pythonic_params=True` converts parameter names to Python style (e.g., `user-id` becomes `user_id`).
* **`connexion.Resolver`:** The crucial part. It tells Connexion *how* to handle incoming requests. Instead of mapping each `operationId` to a specific function, we use a *default resolver* that routes *every* request, regardless of `operationId`, to the `handle_request` function.
* **`handle_request(operationId, **kwargs)`:** The heart of your mock server. It receives the `operationId` (the unique identifier for the endpoint in the OpenAPI spec) and any request parameters (`kwargs`). *You* are responsible for implementing the logic that returns appropriate mock responses based on this information.
* **`kwargs`:** This dictionary contains all the request parameters:
  * Path parameters (e.g., `user_id` in `/users/{user_id}`)
  * Query parameters (e.g., `?page=1`)
  * Request body (accessible as `kwargs.get('body')` if the request has one)
  * Headers (accessible via `request.headers` inside `handle_request` if you need them; you'll need `from flask import request`)

**Important Considerations and Improvements**

* **Mock data generation:** For more realistic mock data, consider a library like `Faker` to generate random names, emails, addresses, etc. This makes your mock server more useful for testing.

  ```python
  from faker import Faker

  fake = Faker()

  def handle_request(operationId, **kwargs):
      if operationId == 'getUser':
          user_id = kwargs.get('user_id')
          return {'id': user_id, 'name': fake.name(), 'email': fake.email()}
  ```

* **Response examples from OpenAPI:** The OpenAPI specification lets you define example responses for each endpoint, which can be used to generate mock responses automatically. Libraries like `swagger_ui_bundle` can help with this, but integrating it directly into the `connexion` resolver requires more advanced customization.
* **Configuration:** Use environment variables or a configuration file to manage settings like the OpenAPI specification path, port number, and other options.
* **Error handling:** Implement proper error handling in your `handle_request` function. Return appropriate HTTP status codes (400, 404, 500, etc.) and error messages when necessary.
* **Security:** If your API has security requirements (e.g., authentication), you'll need to implement mock security checks in your `handle_request` function. This might involve checking for specific headers or tokens and returning appropriate responses.
* **Testing:** Write unit tests to verify that your mock server returns the correct responses for different requests.
* **Dynamic responses:** For more complex scenarios, you might need to keep some state in your mock server (e.g., a list of users). This lets you simulate more realistic behavior, such as creating new resources and updating existing ones.
* **Alternative libraries:** While `connexion` is a good choice, other libraries like `Flask` or `FastAPI` can also be used to build mock servers. The key is to parse the OpenAPI specification and use it to generate the routes and response logic.

**Example with Faker and OpenAPI Example Responses (More Advanced)**

This example uses `Faker` for more realistic data and attempts to leverage example responses from the OpenAPI spec (which requires more manual setup).

```python
import connexion
import functools
import logging
import os

import yaml  # For reading the OpenAPI spec
from faker import Faker

logging.basicConfig(level=logging.INFO)
fake = Faker()

OPENAPI_SPEC_PATH = 'openapi.yaml'


def load_openapi_examples(spec_path):
    """Loads example responses from the OpenAPI spec."""
    with open(spec_path, 'r') as f:
        spec = yaml.safe_load(f)

    examples = {}
    for path, path_data in spec.get('paths', {}).items():
        for method, method_data in path_data.items():
            operation_id = method_data.get('operationId')
            if not operation_id:
                continue
            examples[operation_id] = {}
            for status_code, response_data in method_data.get('responses', {}).items():
                content = response_data.get('content')
                if content and 'application/json' in content:
                    example = content['application/json'].get('example')
                    if example:
                        examples[operation_id][status_code] = example
    return examples


openapi_examples = load_openapi_examples(OPENAPI_SPEC_PATH)


def handle_request(operationId, **kwargs):
    logging.info(f"Handling request for operationId: {operationId}")
    logging.info(f"Request parameters: {kwargs}")

    # Try to use an example response from the OpenAPI spec first
    if operationId in openapi_examples and '200' in openapi_examples[operationId]:
        logging.info(f"Using example response from OpenAPI for {operationId}")
        return openapi_examples[operationId]['200'], 200  # Return example and 200 OK

    if operationId == 'getUser':
        user_id = kwargs.get('user_id')
        if user_id:
            return {'id': user_id, 'name': fake.name(), 'email': fake.email()}
        return {'message': 'User ID required'}, 400

    if operationId == 'createUser':
        request_body = kwargs.get('body')
        if request_body and 'name' in request_body and 'email' in request_body:
            return {'id': fake.random_int(), 'name': request_body['name'],
                    'email': request_body['email']}, 201
        return {'message': 'Invalid request body'}, 400

    return {'message': 'Mock response for ' + operationId}, 200


def main():
    app = connexion.App(__name__, specification_dir='./')
    app.add_api(
        OPENAPI_SPEC_PATH,
        pythonic_params=True,
        resolver=connexion.Resolver(
            lambda operationId: functools.partial(handle_request, operationId)
        ),
    )
    port = int(os.environ.get("PORT", 8080))
    app.run(port=port)


if __name__ == '__main__':
    main()
```

**Key Improvements in the Advanced Example:**

* **`load_openapi_examples`:** Parses the OpenAPI specification and extracts the `example` values from the `responses` section, storing them in a dictionary keyed by `operationId` and status code.
* **Example response priority:** `handle_request` now first checks whether an example response is defined in the OpenAPI spec for the given `operationId` and, if so, returns it. This lets you define realistic mock responses directly in your OpenAPI file.
* **Faker integration:** If no example response is found, it falls back to `Faker` to generate random data.
**To use the advanced example, add `example` fields to your OpenAPI specification:**

```yaml
openapi: 3.0.0
info:
  title: Mock API
  version: 1.0.0
paths:
  /users/{user_id}:
    get:
      summary: Get a user by ID
      operationId: getUser
      parameters:
        - name: user_id
          in: path
          required: true
          schema:
            type: integer
      responses:
        '200':
          description: Successful operation
          content:
            application/json:
              schema:
                type: object
                properties:
                  id:
                    type: integer
                  name:
                    type: string
                  email:
                    type: string
              example:  # Add an example response here
                id: 123
                name: John Doe
                email: john.doe@example.com
        '400':
          description: Bad Request
          content:
            application/json:
              schema:
                type: object
                properties:
                  message:
                    type: string
              example:  # Add an example error response here
                message: "Invalid user ID"
  /users:
    post:
      summary: Create a new user
      operationId: createUser
      requestBody:
        required: true
        content:
          application/json:
            schema:
              type: object
              properties:
                name:
                  type: string
                email:
                  type: string
              required:
                - name
                - email
            example:  # Add an example request body here
              name: Jane Smith
              email: jane.smith@example.com
      responses:
        '201':
          description: User created successfully
          content:
            application/json:
              schema:
                type: object
                properties:
                  id:
                    type: integer
                  name:
                    type: string
                  email:
                    type: string
              example:  # Add an example response here
                id: 456
                name: Jane Smith
                email: jane.smith@example.com
        '400':
          description: Invalid request
          content:
            application/json:
              schema:
                type: object
                properties:
                  message:
                    type: string
              example:  # Add an example error response here
                message: "Name and email are required"
```

**In summary,** this approach provides a flexible and powerful way to generate MCP servers from OpenAPI specifications. By combining `connexion`, a default resolver, `Faker`, and example responses from your OpenAPI file, you can create realistic and useful mock APIs for testing and development. Remember to tailor the `handle_request` function to your specific API's needs.
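On the testing point above: the handler logic can be exercised directly, without starting the server. A minimal sketch — the `handle_request` here re-implements a slice of the tutorial's handler for illustration, and the arguments mimic what connexion would pass:

```python
# Minimal sketch: unit-testing the mock handler directly, no server needed.
# This re-implements part of the tutorial's handle_request for illustration.
def handle_request(operationId, **kwargs):
    if operationId == 'getUser':
        user_id = kwargs.get('user_id')
        if user_id:
            return {'id': user_id, 'name': 'Mock User'}, 200
        return {'message': 'User ID required'}, 400
    return {'message': 'Mock response for ' + operationId}, 200


# Call the handler as connexion would: operationId plus spec parameters.
body, status = handle_request('getUser', user_id=7)
assert status == 200 and body['id'] == 7

body, status = handle_request('getUser')  # missing user_id
assert status == 400
```

Tests like these catch mismatches between the handler and the spec before any client ever hits the mock server.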

SWLC MCP Server

A lottery information query service for the Shanghai region that provides winning number lookups and analysis functions for various lottery games including Double Color Ball, 3D Lottery, and Seven Happiness Lottery.

AEM Block Collection MCP Server

Enables access to AEM block metadata by reading structured information from blocks.json files. Provides a simple tool to list available blocks with their descriptions, file paths, and counts.

freqtrade-mcp

A read-only MCP server that provides LLMs with introspection data and documentation for the Freqtrade codebase. It enables AI tools to access class signatures, method details, and configuration schemas to assist in writing more reliable trading strategies.

Tri-Tender Pricing MCP

An MCP server designed to automate tender and RFQ pricing by extracting requirements from documents and building structured pricing models. It enables users to calculate final costs, compare market rates, and generate styled HTML pricing reports for PDF export.

@bitatlas/mcp-server

Zero-Knowledge Cloud Drive for Humans and Agents. Client-side AES-256-GCM encryption with 7 MCP tools for secure file vault management — the server never sees plaintext data.

Ghost MCP Server

A Model Context Protocol server that enables management of Ghost blog content (posts, pages, and tags) through Claude, supporting both SSE and stdio transports.

Basecoat UI MCP

Provides access to 77 pre-built, accessible Basecoat CSS UI components across forms, navigation, feedback, interactive, and layout categories, enabling AI assistants to retrieve HTML components and usage documentation for building user interfaces.

IBHack MCP Server

Enables intelligent discovery and recommendation of Python tools using Google Gemini AI. Automatically scans directories for tool classes and recommends the most relevant tools based on user queries with complete code generation.

Weather MCP Service

A Model Context Protocol (MCP) based service that allows users to query weather forecasts by coordinates and receive weather alerts for U.S. states.