Discover Awesome MCP Servers

Extend your agent with 26,519 capabilities via MCP servers.

Nano Banana MCP Server

Enables AI image generation, editing, and composition using Google's Gemini image models (Nano Banana Pro and Nano Banana). Supports text-to-image generation, multi-image composition, flexible aspect ratios, high-resolution output up to 4K, and real-time information grounding.

MCP Declarative Server

A utility module for creating Model Context Protocol servers declaratively, allowing developers to easily define tools, prompts, and resources with a simplified syntax.

Zaifer-MCP

Enables LLM assistants like Claude to interact with Zaif cryptocurrency exchange through natural language, supporting market information retrieval, chart data, trading, and account management for BTC/JPY, ETH/JPY, and XYM/JPY pairs.

Code Mode Bridge

An MCP server that aggregates multiple upstream MCP tools into a single 'codemode' tool for unified orchestration and execution. It enables agents to run generated code in an isolated sandbox to interact with various connected services through a single interface.

Hubble MCP Server

A Python-based Model Context Protocol server that integrates with Claude Desktop, allowing users to connect to Hubble API services by configuring the server with their Hubble API key.

arXiv MCP Server

Enables interaction with arXiv.org to search scholarly articles, retrieve metadata, download PDFs, and load article content directly into LLM context for analysis.

Jina AI Remote MCP Server

Provides access to Jina AI's suite of tools including web search, URL reading, image search, embeddings, and reranking capabilities. Enables users to extract web content as markdown, search academic papers, capture screenshots, and perform semantic operations through natural language.

hwpConverMdMCP

An MCP server that enables LLMs to convert HWP and HWPX documents into Markdown for analysis and processing. It supports document conversion via local file paths or Base64 content across various MCP-compatible clients.

Artsy Analytics MCP Server

A Model Context Protocol server that provides Artsy partner analytics tools for Claude Desktop, allowing users to query gallery metrics, sales data, audience insights, and content performance through natural language.

Workflow MCP Server

MCP Evolution API Supergateway

An SSE MCP server for the Evolution API (WhatsApp).

MCP Terminal Server

A simple MCP server that allows running terminal commands with output capture, enabling command execution on the host system from MCP-compatible clients like Claude Desktop.
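At its core, a terminal server like this just runs a command and captures its output for the client. A minimal sketch of that mechanism in plain Python, with the MCP tool-registration layer omitted and the helper name (`run_command`) assumed for illustration:

```python
import subprocess


def run_command(command: str, timeout: int = 30) -> dict:
    """Run a shell command and capture stdout, stderr, and the exit code.

    Hypothetical helper mirroring what a terminal MCP tool might return
    to the client; the real server's schema may differ.
    """
    result = subprocess.run(
        command,
        shell=True,           # the tool receives a full command line
        capture_output=True,  # capture both streams for the response
        text=True,
        timeout=timeout,      # guard against hanging commands
    )
    return {
        "stdout": result.stdout,
        "stderr": result.stderr,
        "exit_code": result.returncode,
    }


print(run_command("echo hello")["stdout"].strip())
```

Returning the exit code alongside both streams lets the client distinguish a failed command from one that merely printed nothing.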

MCP Server Builder

Provides searchable access to official MCP protocol specification and FastMCP framework documentation, helping developers build correct MCP servers by querying live documentation with BM25 full-text search.
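BM25 ranks documents by query-term frequency, damped by term rarity and document length. A self-contained sketch of the Okapi BM25 scoring formula (a toy illustration, not this server's actual implementation; tokenization is assumed to be done already):

```python
import math
from collections import Counter


def bm25_scores(query: list[str], docs: list[list[str]],
                k1: float = 1.5, b: float = 0.75) -> list[float]:
    """Score each tokenized document against the query with Okapi BM25."""
    n = len(docs)
    avgdl = sum(len(d) for d in docs) / n
    df = Counter()                      # document frequency per term
    for d in docs:
        df.update(set(d))
    scores = []
    for d in docs:
        tf = Counter(d)                 # term frequency in this document
        dl = len(d)
        s = 0.0
        for term in query:
            if term not in tf:
                continue
            idf = math.log(1 + (n - df[term] + 0.5) / (df[term] + 0.5))
            s += idf * tf[term] * (k1 + 1) / (
                tf[term] + k1 * (1 - b + b * dl / avgdl))
        scores.append(s)
    return scores


docs = [["mcp", "server", "spec"],
        ["fastmcp", "framework", "docs", "docs"],
        ["unrelated", "text"]]
print(bm25_scores(["docs"], docs))
```

Documents that never mention a query term score zero, and repeated mentions gain diminishing credit, which is why BM25 works well for short documentation queries.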

Local DeepWiki MCP Server

Generates DeepWiki-style documentation for private code repositories with RAG-based Q&A capabilities, semantic code search, and multi-language AST parsing. Supports local LLMs (Ollama) or cloud providers for privacy-focused codebase analysis.
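Semantic code search of this kind boils down to comparing embedding vectors. A toy sketch of cosine-similarity ranking; the real server's embeddings come from an LLM provider, and the vectors and chunk names below are made up for illustration:

```python
import math


def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)


# Made-up embeddings for three code chunks
chunks = {
    "parse_config": [0.9, 0.1, 0.0],
    "load_model":   [0.1, 0.9, 0.2],
    "read_yaml":    [0.8, 0.2, 0.1],
}
# Embedding for a query like "where is configuration parsed?"
query = [1.0, 0.1, 0.0]

# Rank chunks by similarity to the query, most similar first
ranked = sorted(chunks, key=lambda name: cosine(query, chunks[name]),
                reverse=True)
print(ranked[0])
```

A production system stores these vectors in an index and retrieves the top-k chunks to feed into the LLM's context for Q&A.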

Aedifion MCP Server

Enables AI assistants to interact with the aedifion cloud platform for building performance optimization and IoT data management. It provides over 95 tools for monitoring timeseries data, managing project components, and executing building analytics or controls.

Create MCP Server

A template for quickly setting up your own MCP server.

Apple Health MCP Server

An MCP server that allows users to query and analyze their Apple Health data using SQL and natural language, utilizing DuckDB for fast and efficient health data analysis.
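The kind of query involved is ordinary aggregate SQL over exported health records. A hypothetical sketch: the table and column names below are invented, and stdlib `sqlite3` stands in for DuckDB here so the example is self-contained (DuckDB accepts very similar SQL):

```python
import sqlite3

# Hypothetical schema; the server's actual tables/columns may differ.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE heart_rate (day TEXT, bpm REAL)")
conn.executemany(
    "INSERT INTO heart_rate VALUES (?, ?)",
    [("2024-01-01", 62.0), ("2024-01-01", 70.0), ("2024-01-02", 58.0)],
)

# A natural-language question like "what was my average heart rate
# per day?" translates to a GROUP BY aggregate:
rows = conn.execute(
    "SELECT day, AVG(bpm) AS avg_bpm FROM heart_rate "
    "GROUP BY day ORDER BY day"
).fetchall()
for day, avg_bpm in rows:
    print(day, avg_bpm)
```

The natural-language layer's job is exactly this translation step; the analytical engine (DuckDB in the server's case) does the heavy lifting.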

YouTube Content Management MCP Server

Enables AI assistants to search YouTube for videos, channels, and playlists while retrieving detailed analytics and metrics through the YouTube Data API v3. Supports advanced filtering options and provides comprehensive statistics for content discovery and analysis.

Yak MCP

Enables coding agents to speak aloud using text-to-speech functionality. Works with agents running inside devcontainers and provides configurable voice settings for creating chatty AI companions.

Slack MCP Server

A Model Context Protocol server that enables LLMs to interact with Slack workspaces through OAuth 2.0 authentication. It provides tools for listing channels and posting messages while supporting secure token persistence and dynamic client registration.

Customs Big Data MCP Server

Provides comprehensive import/export trade data queries including export trends, product category statistics, order geographic distribution, and overseas certification information to help users understand enterprises' international trade situations.

Reddit Buddy MCP

Enables AI assistants to browse Reddit, search posts, analyze user activity, and fetch comments without requiring API keys. Features smart caching, clean data responses, and optional authentication for higher rate limits.

Emcee

A way to automatically generate a **Mock Control Plane (MCP) server** from an OpenAPI (formerly Swagger) specification, useful for testing, development, and prototyping. The walkthrough below uses Python and the `connexion` library, a popular and effective choice for this task.

**Core Concepts**

* **OpenAPI specification:** The contract that defines your API: endpoints, request/response formats, data types, and more. You need a valid OpenAPI YAML or JSON file.
* **Mock Control Plane (MCP) server:** A lightweight server that simulates the behavior of your real API. It returns pre-defined responses based on the OpenAPI specification, allowing you to test clients and integrations without the actual backend.
* **`connexion` library (Python):** A framework that builds REST APIs from OpenAPI specifications, handling request routing, validation, and serialization/deserialization based on the OpenAPI definition.

**Steps to Generate an MCP Server**

1. **Install `connexion`:**

```bash
pip install connexion
```

2. **Create a Python file (e.g., `mcp_server.py`):**

```python
import connexion
import functools
import logging
import os

# Configure logging (optional, but recommended)
logging.basicConfig(level=logging.INFO)

# Path to your OpenAPI specification file
OPENAPI_SPEC_PATH = 'openapi.yaml'  # Replace with your file path


def handle_request(operation_id, **kwargs):
    """
    Called when any API endpoint is hit. Returns a mock response
    based on the operationId and the request parameters.
    """
    logging.info(f"Handling request for operationId: {operation_id}")
    logging.info(f"Request parameters: {kwargs}")

    # Implement your mock logic here. This is the KEY part: inspect
    # 'operation_id' and 'kwargs' to decide which response to return.

    # Example: return a default response for a specific operationId
    if operation_id == 'getUser':
        user_id = kwargs.get('user_id')
        if user_id:
            return {'id': user_id, 'name': 'Mock User',
                    'email': f'user{user_id}@example.com'}
        return {'message': 'User ID required'}, 400  # Bad Request

    # Example: return a different response based on request parameters
    if operation_id == 'createUser':
        request_body = kwargs.get('body')  # Access the request body
        if request_body and 'name' in request_body and 'email' in request_body:
            return {'id': 123, 'name': request_body['name'],
                    'email': request_body['email']}, 201  # Created
        return {'message': 'Invalid request body'}, 400

    # Default response if no specific logic matches
    return {'message': 'Mock response for ' + operation_id}, 200


def main():
    # Create a Connexion app
    app = connexion.App(__name__, specification_dir='./')

    # Register a default resolver so that *every* operationId is routed
    # to handle_request(), with the operationId bound as the first
    # argument. This is where the magic happens. (The resolver must be
    # passed to add_api; the resolved function receives only the request
    # parameters, so functools.partial supplies the operationId.)
    app.add_api(
        OPENAPI_SPEC_PATH,
        pythonic_params=True,
        resolver=connexion.Resolver(
            function_resolver=lambda oid: functools.partial(handle_request, oid)
        ),
    )

    # Start the server (use the PORT environment variable if available)
    port = int(os.environ.get("PORT", 8080))
    app.run(port=port)


if __name__ == '__main__':
    main()
```

3. **Create an OpenAPI specification file (e.g., `openapi.yaml`):**

```yaml
openapi: 3.0.0
info:
  title: Mock API
  version: 1.0.0
paths:
  /users/{user_id}:
    get:
      summary: Get a user by ID
      operationId: getUser
      parameters:
        - name: user_id
          in: path
          required: true
          schema:
            type: integer
      responses:
        '200':
          description: Successful operation
          content:
            application/json:
              schema:
                type: object
                properties:
                  id:
                    type: integer
                  name:
                    type: string
                  email:
                    type: string
        '400':
          description: Bad Request
          content:
            application/json:
              schema:
                type: object
                properties:
                  message:
                    type: string
  /users:
    post:
      summary: Create a new user
      operationId: createUser
      requestBody:
        required: true
        content:
          application/json:
            schema:
              type: object
              properties:
                name:
                  type: string
                email:
                  type: string
              required:
                - name
                - email
      responses:
        '201':
          description: User created successfully
          content:
            application/json:
              schema:
                type: object
                properties:
                  id:
                    type: integer
                  name:
                    type: string
                  email:
                    type: string
        '400':
          description: Invalid request
          content:
            application/json:
              schema:
                type: object
                properties:
                  message:
                    type: string
```

4. **Run the server:**

```bash
python mcp_server.py
```

**Explanation**

* **`connexion.App`:** Creates the WSGI application.
* **`app.add_api(OPENAPI_SPEC_PATH)`:** Parses the OpenAPI specification and sets up the routes. `pythonic_params=True` converts parameter names to Python style (e.g., `user-id` becomes `user_id`).
* **`connexion.Resolver`:** The crucial part. It tells Connexion *how* to map incoming requests to handler functions. Instead of mapping each `operationId` to its own function, the `function_resolver` routes *every* request, regardless of `operationId`, to `handle_request`, with the `operationId` bound via `functools.partial`.
* **`handle_request(operation_id, **kwargs)`:** The heart of the mock server. It receives the `operationId` (the unique identifier for the endpoint in the OpenAPI spec) and the request parameters (`kwargs`). *You* are responsible for implementing the logic that returns appropriate mock responses based on this information.
* **`kwargs`:** Contains all the request parameters:
  * Path parameters (e.g., `user_id` in `/users/{user_id}`)
  * Query parameters (e.g., `?page=1`)
  * Request body (accessible as `kwargs.get('body')` if the request has one)
  * Headers (accessible via `request.headers` inside `handle_request` if you need them; import with `from flask import request`)

**Important Considerations and Improvements**

* **Mock data generation:** For more realistic mock data, consider a library like `Faker` to generate random names, emails, addresses, etc. This makes the mock server more useful for testing:

```python
from faker import Faker

fake = Faker()

def handle_request(operation_id, **kwargs):
    if operation_id == 'getUser':
        user_id = kwargs.get('user_id')
        return {'id': user_id, 'name': fake.name(), 'email': fake.email()}
```

* **Response examples from OpenAPI:** The specification can define example responses per endpoint, which you can surface automatically as mock responses. Integrating this into the `connexion` resolver requires some custom code (see the advanced example below).
* **Configuration:** Use environment variables or a configuration file for settings such as the specification path and port number.
* **Error handling:** Implement proper error handling in `handle_request`; return appropriate HTTP status codes (400, 404, 500, etc.) and error messages when necessary.
* **Security:** If your API has security requirements (e.g., authentication), implement mock security checks in `handle_request`, such as checking for specific headers or tokens and returning appropriate responses.
* **Testing:** Write unit tests to verify that the mock server returns the correct responses for different requests.
* **Dynamic responses:** For more complex scenarios, keep some state in the mock server (e.g., a list of users) to simulate more realistic behavior, such as creating new resources and updating existing ones.
* **Alternative libraries:** While `connexion` is a good choice, `Flask` or `FastAPI` can also serve as the base. The key is parsing the OpenAPI specification and using it to generate the routes and response logic.

**Example with Faker and OpenAPI Example Responses (More Advanced)**

This example uses `Faker` for more realistic data and loads example responses from the OpenAPI spec (which requires some manual setup):

```python
import connexion
import functools
import logging
import os

import yaml  # For reading the OpenAPI spec
from faker import Faker

logging.basicConfig(level=logging.INFO)
fake = Faker()

OPENAPI_SPEC_PATH = 'openapi.yaml'


def load_openapi_examples(spec_path):
    """Loads example responses from the OpenAPI spec."""
    with open(spec_path, 'r') as f:
        spec = yaml.safe_load(f)

    examples = {}
    for path, path_data in spec.get('paths', {}).items():
        for method, method_data in path_data.items():
            operation_id = method_data.get('operationId')
            if not operation_id:
                continue
            examples[operation_id] = {}
            for status_code, response_data in method_data.get('responses', {}).items():
                content = response_data.get('content')
                if content and 'application/json' in content:
                    example = content['application/json'].get('example')
                    if example:
                        examples[operation_id][status_code] = example
    return examples


openapi_examples = load_openapi_examples(OPENAPI_SPEC_PATH)


def handle_request(operation_id, **kwargs):
    logging.info(f"Handling request for operationId: {operation_id}")
    logging.info(f"Request parameters: {kwargs}")

    # Try an example response from the OpenAPI spec first
    if operation_id in openapi_examples and '200' in openapi_examples[operation_id]:
        logging.info(f"Using example response from OpenAPI for {operation_id}")
        return openapi_examples[operation_id]['200'], 200

    if operation_id == 'getUser':
        user_id = kwargs.get('user_id')
        if user_id:
            return {'id': user_id, 'name': fake.name(), 'email': fake.email()}
        return {'message': 'User ID required'}, 400

    if operation_id == 'createUser':
        request_body = kwargs.get('body')
        if request_body and 'name' in request_body and 'email' in request_body:
            return {'id': fake.random_int(), 'name': request_body['name'],
                    'email': request_body['email']}, 201
        return {'message': 'Invalid request body'}, 400

    return {'message': 'Mock response for ' + operation_id}, 200


def main():
    app = connexion.App(__name__, specification_dir='./')
    app.add_api(
        OPENAPI_SPEC_PATH,
        pythonic_params=True,
        resolver=connexion.Resolver(
            function_resolver=lambda oid: functools.partial(handle_request, oid)
        ),
    )
    port = int(os.environ.get("PORT", 8080))
    app.run(port=port)


if __name__ == '__main__':
    main()
```

**Key improvements in the advanced example:**

* **`load_openapi_examples`:** Parses the OpenAPI specification and extracts the `example` values from the `responses` section, stored in a dictionary keyed by `operationId` and status code.
* **Example response priority:** `handle_request` now first checks whether an example response is defined in the OpenAPI spec for the given `operationId` and returns it if present, so realistic mock responses can be defined directly in the OpenAPI file.
* **Faker integration:** When no example response is found, it falls back to `Faker`-generated random data.

**To use the advanced example, add `example` fields to your OpenAPI specification:**

```yaml
openapi: 3.0.0
info:
  title: Mock API
  version: 1.0.0
paths:
  /users/{user_id}:
    get:
      summary: Get a user by ID
      operationId: getUser
      parameters:
        - name: user_id
          in: path
          required: true
          schema:
            type: integer
      responses:
        '200':
          description: Successful operation
          content:
            application/json:
              schema:
                type: object
                properties:
                  id:
                    type: integer
                  name:
                    type: string
                  email:
                    type: string
              example:  # Add an example response here
                id: 123
                name: John Doe
                email: john.doe@example.com
        '400':
          description: Bad Request
          content:
            application/json:
              schema:
                type: object
                properties:
                  message:
                    type: string
              example:  # Add an example error response here
                message: "Invalid user ID"
  /users:
    post:
      summary: Create a new user
      operationId: createUser
      requestBody:
        required: true
        content:
          application/json:
            schema:
              type: object
              properties:
                name:
                  type: string
                email:
                  type: string
              required:
                - name
                - email
            example:  # Add an example request body here
              name: Jane Smith
              email: jane.smith@example.com
      responses:
        '201':
          description: User created successfully
          content:
            application/json:
              schema:
                type: object
                properties:
                  id:
                    type: integer
                  name:
                    type: string
                  email:
                    type: string
              example:  # Add an example response here
                id: 456
                name: Jane Smith
                email: jane.smith@example.com
        '400':
          description: Invalid request
          content:
            application/json:
              schema:
                type: object
                properties:
                  message:
                    type: string
              example:  # Add an example error response here
                message: "Name and email are required"
```

**In summary,** this approach provides a flexible and powerful way to generate MCP servers from OpenAPI specifications. By combining `connexion`, a default resolver, `Faker`, and example responses from your OpenAPI file, you can create realistic and useful mock APIs for testing and development. Remember to tailor `handle_request` to your specific API's needs.

MCP-BAMM

A Model Context Protocol server that enables interaction with Borrow Automated Market Maker (BAMM) contracts on the Fraxtal blockchain, allowing users to manage positions, borrow against LP tokens, and perform other BAMM-related operations.

Skill Management MCP Server

Enables Claude to create, edit, run, and manage reusable skills stored locally, including executing scripts with automatic dependency management and environment variables. Works across all MCP-compatible clients like Cursor, Claude Desktop, and claude.ai.

BlenderMCP

Connects Blender to Claude AI through the Model Context Protocol (MCP), enabling prompt-assisted 3D modeling, scene creation, and manipulation.

ZeroTrusted-ai PII Detection Agent

Mind Map MCP

Automatically generates mind map images from text content using Coze API workflow. Returns accessible image links that can be used in MCP-compatible tools like CodeBuddy, Cursor, and Qoder.

Bargainer MCP Server

A Model Context Protocol server that aggregates and compares deals from multiple sources including Slickdeals, RapidAPI marketplace, and web scraping, enabling users to search, filter, and compare deals through a chat interface.

Gemini Context MCP Server

Mirror of