Discover Awesome MCP Servers

Extend your agent with 16,263 capabilities via MCP servers.

n8n - Secure Workflow Automation for Technical Teams

Fair-code workflow automation platform with native AI capabilities. Combine visual building with custom code, self-host or use the cloud, 400+ integrations.

OptionsFlow

A Model Context Protocol server that enables LLMs to analyze option chains, calculate Greeks, and evaluate basic options strategies using Yahoo Finance data.
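The server's internals aren't shown in this listing; the sketch below only illustrates the kind of data it works with, assuming the `yfinance` package as the Yahoo Finance source and a Black-Scholes delta as the example Greek (the 30-day time to expiry is a placeholder).

```python
import math
import yfinance as yf

def bs_call_delta(spot, strike, t_years, sigma, r=0.05):
    """Black-Scholes delta of a European call, N(d1)."""
    d1 = (math.log(spot / strike) + (r + sigma ** 2 / 2) * t_years) / (sigma * math.sqrt(t_years))
    return 0.5 * (1 + math.erf(d1 / math.sqrt(2)))

ticker = yf.Ticker("SPY")
expiry = ticker.options[0]                 # nearest listed expiration
chain = ticker.option_chain(expiry)        # object with .calls / .puts DataFrames
spot = ticker.history(period="1d")["Close"].iloc[-1]

# Rough illustration: treat every contract as ~30 days out.
for _, row in chain.calls.head(5).iterrows():
    iv = row["impliedVolatility"]
    if iv > 0:
        delta = bs_call_delta(spot, row["strike"], 30 / 365, iv)
        print(f"strike={row['strike']:.1f}  last={row['lastPrice']:.2f}  delta≈{delta:.2f}")
```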

Jira Prompts MCP Server

An MCP server that offers several commands for generating prompts or context from Jira content.

NotePlan MCP Server

A Model Context Protocol server that enables Claude Desktop to interact with NotePlan.co, allowing users to query, search, create, and update notes directly from Claude conversations.

Knowledge Graph Memory Server

An enhanced persistent-memory implementation using a local knowledge graph with a customizable `--memory-path`. It lets Claude remember information about the user across conversations.

MCP Servers

A collection of MCP (Model Context Protocol) servers packaged as dotnet tools.

HashiCorp Vault MCP Server

Enables interaction with HashiCorp Vault for secret management operations including reading, writing, listing, and deleting secrets through the Model Context Protocol.
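The listing doesn't show the server's code; as a rough illustration of the Vault operations it describes, here is a sketch using the `hvac` client against a KV v2 secrets engine (the address, token, and paths are placeholders).

```python
import hvac

# Placeholder address and token for a local dev-mode Vault instance.
client = hvac.Client(url="http://127.0.0.1:8200", token="dev-only-token")

# Write a secret.
client.secrets.kv.v2.create_or_update_secret(
    path="myapp/config",
    secret={"api_key": "example-key", "db_password": "example-pass"},
)

# Read it back (KV v2 nests the payload under data.data).
read = client.secrets.kv.v2.read_secret_version(path="myapp/config")
print(read["data"]["data"]["api_key"])

# List secrets under a prefix.
listing = client.secrets.kv.v2.list_secrets(path="myapp")
print(listing["data"]["keys"])

# Delete the secret and all of its versions.
client.secrets.kv.v2.delete_metadata_and_all_versions(path="myapp/config")
```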

面试鸭 MCP Server

An MCP server for searching 面试鸭 (Interview Duck) questions, built on Spring AI, that quickly lets AI look up real company interview questions and answers.

Zen MCP Server

Orchestrates multiple AI models (Gemini, OpenAI, Claude, local models) within a single conversation context, enabling collaborative workflows like multi-model code reviews, consensus building, and CLI-to-CLI bridging for specialized tasks.

Aws Sample Gen Ai Mcp Server

A sample Python project demonstrating how to call Amazon Bedrock generative AI models (e.g., Anthropic Claude or AI21 Jurassic-2) from a simple socket-based MCP-style server. The server listens for JSON requests containing a `prompt` field, invokes the configured Bedrock model via `boto3`, and returns the generated text (or an error message) as JSON. AWS credentials, the Bedrock region and model ID, and the listening host/port are configured at the top of the script; the example is single-threaded and intended as a starting point rather than production code, leaving error handling, security, rate limiting, and concurrency to the reader.
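A condensed sketch of the core Bedrock invocation from that sample, assuming the legacy Anthropic Claude text-completions request format; the full sample also handles AI21-style request bodies and wraps this call in a TCP socket server loop.

```python
import json
import boto3

BEDROCK_REGION = "us-east-1"          # replace with your Bedrock region
MODEL_ID = "anthropic.claude-v2"      # replace with your model ID

bedrock = boto3.client("bedrock-runtime", region_name=BEDROCK_REGION)

def generate_text(prompt: str) -> str:
    """Invoke a Bedrock text model and return the completion."""
    # Request body for Anthropic's legacy text-completions format;
    # other model families expect different fields.
    body = json.dumps({
        "prompt": f"\n\nHuman: {prompt}\n\nAssistant:",
        "max_tokens_to_sample": 200,
        "temperature": 0.5,
    })
    response = bedrock.invoke_model(
        modelId=MODEL_ID,
        contentType="application/json",
        accept="application/json",
        body=body,
    )
    payload = json.loads(response["body"].read())
    return payload["completion"].strip()

if __name__ == "__main__":
    print(generate_text("Write a haiku about message protocols."))
```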

PDFSizeAnalyzer-MCP

Enables comprehensive PDF analysis and manipulation including page size analysis, chapter extraction, splitting, compression, merging, and conversion to images. Provides both MCP server interface for AI assistants and Streamlit web interface for direct user interaction.

database-updater MCP Server

Mirror of

mcp-workflowy

Berghain Events MCP Server

A server that allows AI agents to query and retrieve information about upcoming events at Berghain nightclub through a DynamoDB-backed FastAPI service.

Shopify MCP Server by CData

MCP Unity Bridge Asset

An asset to be imported into Unity that hosts a WebSocket server for MCP communication with LLMs.

ncbi-mcp

An MCP server for the NIH's National Center for Biotechnology Information (NCBI).

url-download-mcp

A Model Context Protocol (MCP) server that enables AI assistants to download files from URLs to the local filesystem.

macOS Tools MCP Server

Provides read-only access to native macOS system utilities including disk management, battery status, network configuration, and system profiling through terminal commands. Enables users to retrieve system information and diagnostics from macOS machines via standardized MCP tools.
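The specific tool names aren't given in this listing; the sketch below only shows the kind of read-only macOS commands such a server would wrap, run via `subprocess` (the command selection is an illustrative assumption).

```python
import subprocess

# Read-only macOS diagnostics commands of the kind an MCP tool might wrap.
COMMANDS = {
    "disks": ["diskutil", "list"],
    "battery": ["pmset", "-g", "batt"],
    "network_ports": ["networksetup", "-listallhardwareports"],
    "hardware": ["system_profiler", "SPHardwareDataType"],
}

def run(name: str) -> str:
    """Run one of the whitelisted commands and return its output."""
    result = subprocess.run(COMMANDS[name], capture_output=True, text=True, check=True)
    return result.stdout

if __name__ == "__main__":
    for name in COMMANDS:
        print(f"--- {name} ---")
        print(run(name))
```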

Datastream MCP Server

A Model Context Protocol server that enables interaction with the Google Cloud Datastream API for managing data replication between various source and destination systems through natural language commands.

AWS Amplify Gen 2 Documentation MCP Server

An unofficial MCP server that provides tools for accessing and searching AWS Amplify Gen 2 documentation.

Advanced MCP Server

A comprehensive Model Context Protocol server providing capabilities for web scraping, data analysis, system monitoring, file operations, API integrations, and report generation.

MCP Trino Server

A Model Context Protocol server that provides seamless integration with Trino and Iceberg, enabling data exploration, querying, and table maintenance through a standard interface.
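For context, a minimal sketch of the kind of Trino query such a server would run, using the `trino` Python client; the host, catalog, schema, and table names are assumptions for illustration.

```python
import trino

# Connection details are placeholders for a local Trino coordinator.
conn = trino.dbapi.connect(
    host="localhost",
    port=8080,
    user="analyst",
    catalog="iceberg",
    schema="analytics",
)
cur = conn.cursor()

# Explore available tables, then query one of them.
cur.execute("SHOW TABLES")
print(cur.fetchall())

cur.execute("SELECT event_type, count(*) FROM events GROUP BY event_type")
for event_type, n in cur.fetchall():
    print(event_type, n)
```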

BolideAI MCP

A comprehensive Model Context Protocol server that provides AI-powered tools for marketing automation, content generation, research, and project management, integrating with various AI services to streamline workflows for developers and marketers.

Business Central MCP Server

A lightweight MCP server for seamless integration with Microsoft Dynamics 365 Business Central.

Google Search MCP Server

An MCP server implementation that integrates with the Google Custom Search JSON API, providing web search capabilities.
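For context, a request to the Google Custom Search JSON API that such a server wraps looks roughly like this; the API key and search engine ID are placeholders.

```python
import requests

API_KEY = "YOUR_API_KEY"        # placeholder
ENGINE_ID = "YOUR_ENGINE_ID"    # placeholder (the "cx" parameter)

resp = requests.get(
    "https://www.googleapis.com/customsearch/v1",
    params={"key": API_KEY, "cx": ENGINE_ID, "q": "model context protocol"},
    timeout=10,
)
resp.raise_for_status()
for item in resp.json().get("items", []):
    print(item["title"], "-", item["link"])
```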

Pytest MCP Server

Enables AI assistants to run and analyze pytest tests for desktop applications through interactive commands. Supports test execution, filtering, result analysis, and debugging for comprehensive test automation workflows.

flutterclimcp

A sample project idea for the Flutter CLI MCP (Model Context Protocol) server: a "Funny Meme Collection" app that lists memes (image, title, like count) fetched from an MCP server, shows meme details, and lets users like and optionally upload or search memes. The suggested approach is to stand up a simple MCP server (Node.js, Python, or similar) exposing JSON endpoints for listing memes, fetching a meme by ID, and incrementing likes (see the sketch below); scaffold the app with `flutter create meme_collection`; build the list and detail screens with `ListView.builder`; and use the `http` package to call the server. State management (Provider, BLoC, or Riverpod) and UI packages such as `flutter_staggered_grid_view` are suggested as optional enhancements.
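A minimal sketch of the JSON backend described in the first step, using only the Python standard library; the endpoint paths and in-memory data store are illustrative assumptions, not part of flutterclimcp itself.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Illustrative in-memory store; a real server would use a database.
MEMES = {
    1: {"id": 1, "title": "Cat at keyboard", "image_url": "https://example.com/cat.png", "likes": 0},
    2: {"id": 2, "title": "Deploy on Friday", "image_url": "https://example.com/friday.png", "likes": 3},
}

class MemeHandler(BaseHTTPRequestHandler):
    def _send_json(self, status, payload):
        body = json.dumps(payload).encode("utf-8")
        self.send_response(status)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def do_GET(self):
        parts = [p for p in self.path.split("/") if p]
        if parts == ["memes"]:
            # GET /memes -> list all memes
            self._send_json(200, list(MEMES.values()))
        elif len(parts) == 2 and parts[0] == "memes" and parts[1].isdigit():
            # GET /memes/<id> -> meme detail
            meme = MEMES.get(int(parts[1]))
            if meme:
                self._send_json(200, meme)
            else:
                self._send_json(404, {"error": "not found"})
        else:
            self._send_json(404, {"error": "unknown path"})

    def do_POST(self):
        parts = [p for p in self.path.split("/") if p]
        # POST /memes/<id>/like -> increment the like counter
        if len(parts) == 3 and parts[0] == "memes" and parts[1].isdigit() and parts[2] == "like":
            meme = MEMES.get(int(parts[1]))
            if meme:
                meme["likes"] += 1
                self._send_json(200, meme)
                return
        self._send_json(404, {"error": "unknown path"})

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), MemeHandler).serve_forever()
```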

Trello MCP Server

Enables AI assistants to retrieve Trello card information by ID or link, providing access to card details including labels, members, due dates, and attachments through a standardized interface.

mcp-lucene-server
