Discover Awesome MCP Servers

Extend your agent with 14,324 capabilities via MCP servers.

PostEx MCP Server

MCP server that provides tools for interacting with the PostEx Merchant API for order management, tracking, and other logistics operations.

Zoom API MCP Server

A comprehensive Model Context Protocol server that enables interaction with the full suite of Zoom API endpoints, providing structured tools with proper validation and OAuth 2.0 authentication for managing meetings, users, webinars, and other Zoom resources.

protolint-mcp

Mcp Server

mcp-nckuhub-server

Salesforce MCP

Atla MCP Server

An MCP server implementation that provides a standardized interface for LLMs to interact with the Atla API.

Firestore Advanced MCP

A Model Context Protocol server that lets large language models such as Claude perform comprehensive interactions with Firebase Firestore databases, supporting full CRUD operations, complex queries, and advanced features such as transactions and TTL (Time-to-Live) management.

FastMCP SonarQube Metrics

A server that provides tools for retrieving SonarQube project metrics and quality data through a simplified message-based approach, allowing users to programmatically access metrics, historical data, and component-level information from SonarQube.

Edit-MCP

A Model Context Protocol server that integrates with Microsoft's Edit tool, allowing AI systems to perform file operations from simple reads/writes to complex code editing and refactoring.

Calculator MCP Server

Provides basic arithmetic operations and advanced mathematical functions through the Model Context Protocol (MCP), with features like calculation history tracking and expression evaluation.

Japanese Weather MCP Server

A Model Context Protocol (MCP) server that provides access to Japanese weather forecasts using the weather.tsukumijima.net API.

Groq MCP Server

MCP Multi-API Server

A bridge allowing AI/LLMs to seamlessly interact with external APIs for weather, finance, and news services through a standardized MCP-compliant interface.

Sond Core API MCP Server

Enables interaction with the Sond Core API through the Model Context Protocol. Auto-generated from the OpenAPI specification at https://core-api-lb.sond.com/api-json to provide programmatic access to Sond's services.

AFL (Australian Football League) MCP Server

A Model Context Protocol (MCP) server that provides AFL (Australian Football League) data from the Squiggle API.

GitHub Repos Manager MCP Server

Azure MCP Server

Enables natural-language interaction with Azure services through Claude Desktop, supporting resource management, subscription handling, and tenant selection with secure authentication.

Project Explorer MCP Server

Provides tools for analyzing project structures, searching through codebases, managing dependencies, and performing file operations with advanced filtering capabilities.

MCP_claude

A demonstration of how an MCP server can be built for the Claude Desktop MCP client.

Memory Bank MCP Server

Provides a structured documentation system for context preservation in AI assistant environments, helping users create and manage memory banks for their projects.

MCP Server with Gemini AI Integration

An MCP server and client with basic tools.

crawl4ai-mcp

Here's a Python outline for creating an MCP (Model Context Protocol) server that wraps the Crawl4AI library, along with explanations and considerations:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import json
import logging

# Assuming Crawl4AI is installed and importable
try:
    from crawl4ai import Crawl4AI  # Replace with the actual import if different
except ImportError:
    print("Error: Crawl4AI library not found. Please install it.")
    Crawl4AI = None  # Disable functionality if the library is missing

# Configure logging
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')

# --- Configuration ---
HOST_NAME = "localhost"  # Or "0.0.0.0" to listen on all interfaces
PORT_NUMBER = 8080


# --- MCP Request Handler ---
class MCPRequestHandler(BaseHTTPRequestHandler):

    def _set_response(self, status_code=200, content_type="application/json"):
        self.send_response(status_code)
        self.send_header("Content-type", content_type)
        self.end_headers()

    def do_POST(self):
        """Handles POST requests, expecting JSON data."""
        content_length = int(self.headers['Content-Length'])
        post_data = self.rfile.read(content_length)
        try:
            request_data = json.loads(post_data.decode('utf-8'))
            logging.info(f"Received request: {request_data}")
        except json.JSONDecodeError:
            self._set_response(400)
            self.wfile.write(json.dumps({"error": "Invalid JSON"}).encode('utf-8'))
            return

        # Route the request based on the 'action' field
        action = request_data.get('action')
        if action == "crawl":
            self.handle_crawl_request(request_data)
        elif action == "extract_data":
            self.handle_extract_data_request(request_data)  # Example
        else:
            self._set_response(400)
            self.wfile.write(json.dumps({"error": "Invalid action"}).encode('utf-8'))

    def handle_crawl_request(self, request_data):
        """Handles a crawl request using Crawl4AI."""
        if Crawl4AI is None:
            self._set_response(500)
            self.wfile.write(json.dumps({"error": "Crawl4AI library not available"}).encode('utf-8'))
            return

        url = request_data.get('url')
        if not url:
            self._set_response(400)
            self.wfile.write(json.dumps({"error": "Missing 'url' parameter"}).encode('utf-8'))
            return

        try:
            # Initialize Crawl4AI (adjust parameters as needed;
            # you might need API keys or other setup here)
            crawler = Crawl4AI()
            # Perform the crawl (assuming a crawl method exists)
            result = crawler.crawl(url)
            response_data = {"status": "success", "data": result}
            self._set_response(200)
            self.wfile.write(json.dumps(response_data).encode('utf-8'))
        except Exception as e:
            logging.exception("Error during crawl:")
            self._set_response(500)
            self.wfile.write(json.dumps({"error": str(e)}).encode('utf-8'))

    def handle_extract_data_request(self, request_data):
        """Example: handles a data extraction request (if Crawl4AI supports it)."""
        # Placeholder -- adapt to Crawl4AI's actual capabilities.
        self._set_response(501)  # Not Implemented
        self.wfile.write(json.dumps({"error": "Data extraction not implemented"}).encode('utf-8'))


if __name__ == '__main__':
    if Crawl4AI is None:
        print("Crawl4AI library is missing. Server will not start.")
    else:
        webServer = HTTPServer((HOST_NAME, PORT_NUMBER), MCPRequestHandler)
        print(f"Server started http://{HOST_NAME}:{PORT_NUMBER}")
        try:
            webServer.serve_forever()
        except KeyboardInterrupt:
            pass
        webServer.server_close()
        print("Server stopped.")
```

Key points and explanations:

* **Error handling:** `try...except` blocks catch errors during JSON parsing and Crawl4AI execution, exceptions are logged for debugging, and appropriate HTTP status codes are returned (400 for bad requests, 500 for server errors). Crucially, the code checks whether `Crawl4AI` was imported successfully and handles the case where it is missing.
* **JSON handling:** The POST body is explicitly decoded from bytes using UTF-8 and the response is encoded back to bytes, which is crucial for handling a wide range of characters. Error messages in the JSON responses are descriptive, making problems easier to diagnose.
* **MCP structure:** `MCPRequestHandler` parses the JSON payload and routes each request to a handler based on the `action` field. This is a basic structure; you can extend it with more actions and more sophisticated routing.
* **Crawl4AI integration:** `handle_crawl_request` extracts the URL from the request, initializes `Crawl4AI`, calls its `crawl` method (assuming one exists), and returns the result as JSON. **Important:** adapt this to the actual Crawl4AI API, including any authentication or API-key requirements.
* **Configuration:** `HOST_NAME` and `PORT_NUMBER` make the server's address and port easy to change.
* **Logging:** The `logging` module provides informative messages about requests and errors, which is essential for debugging.
* **Example `extract_data` handler:** A placeholder showing how to extend the server to other Crawl4AI functionality; it returns 501 (Not Implemented).
* **Conditional Crawl4AI usage:** If the library is not installed, the server refuses to start instead of crashing at request time.

How to use:

1. **Install Crawl4AI:** `pip install crawl4ai` (or the correct installation command for the library).
2. **Replace placeholders:** Modify `handle_crawl_request` and `handle_extract_data_request` to use the actual methods and parameters of the Crawl4AI library, paying close attention to authentication and API-key requirements.
3. **Run the script:** `python your_script_name.py`
4. **Send POST requests:** Use `curl`, Python's `requests`, or any other HTTP client to send POST requests to `http://localhost:8080`. The body should be a JSON object with an `action` field and any required parameters. Example `curl` request:

```bash
curl -X POST -H "Content-Type: application/json" \
  -d '{"action": "crawl", "url": "https://www.example.com"}' \
  http://localhost:8080
```

Important considerations:

* **Security:** This is a very basic server. For production use, add authentication, authorization, and input validation to prevent malicious attacks; consider a more robust web framework such as Flask or Django.
* **Asynchronous operations:** Crawling can be long-running. Consider asynchronous programming (e.g., `asyncio` or `threading`) to handle multiple requests concurrently without blocking the server.
* **Scalability:** High traffic may require a load balancer, multiple server instances, and a more efficient data storage solution.
* **Crawl4AI API:** The most important step is to thoroughly understand the Crawl4AI library's API; the example makes assumptions about the `crawl` method and its parameters.
* **Rate limiting:** Implement rate limiting to prevent abuse of the Crawl4AI API and avoid being blocked.
* **Data validation:** Validate input data (e.g., URLs) to prevent errors and security vulnerabilities.

This outline provides a foundation for building an MCP server around Crawl4AI; adapt it to the specific requirements of the library and your application.
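As a complement to the `curl` example, the same request can be sent from Python with a small standard-library helper. This is only a sketch: the `mcp_request` name is illustrative, and it assumes the server outlined above is running on the configured host and port.

```python
import json
import urllib.request

def mcp_request(action, host="localhost", port=8080, **params):
    """Send an MCP-style JSON POST to the server and return the decoded reply."""
    body = json.dumps({"action": action, **params}).encode("utf-8")
    req = urllib.request.Request(
        f"http://{host}:{port}",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Example (requires the server above to be running on localhost:8080):
# result = mcp_request("crawl", url="https://www.example.com")
```

Because the `action` routing is just a JSON field, the same helper works for any handler you add (e.g. `mcp_request("extract_data", ...)`).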

Nexonco

An MCP server that enables access to clinical evidence from the CIViC database, allowing users to search across variants, diseases, drugs, and phenotypes to support precision oncology research.

mcp-voice-hooks

Voice Mode for Claude Code

wormhole-metrics-mcp

An MCP server that analyzes cross-chain activity on the Wormhole protocol, providing insights into transaction volumes, top assets, source-destination chain pairs, and key performance indicators (KPIs).

Multi-Tool Control Platform (MCP) Server

A Python framework for developing and managing tool instances through a registry system, where developers can easily create new tools by inheriting from the BaseHandler class and implementing required methods.

Snowflake MCP Server by CData

Marvel MCP Server using Azure Functions

An Azure Functions-based MCP server that enables interaction with Marvel character and comic data through the official Marvel Developers API.

crypto-trending-mcp

An MCP server that tracks the latest trending tokens on CoinGecko.