Discover Awesome MCP Servers
Extend your agent with 26,882 capabilities via MCP servers.
- All (26,882)
- Developer Tools (3,867)
- Search (1,714)
- Research & Data (1,557)
- AI Integration Systems (229)
- Cloud Platforms (219)
- Data & App Analysis (181)
- Database Interaction (177)
- Remote Shell Execution (165)
- Browser Automation (147)
- Databases (145)
- Communication (137)
- AI Content Generation (127)
- OS Automation (120)
- Programming Docs Access (109)
- Content Fetching (108)
- Note Taking (97)
- File Systems (96)
- Version Control (93)
- Finance (91)
- Knowledge & Memory (90)
- Monitoring (79)
- Security (71)
- Image & Video Processing (69)
- Digital Note Management (66)
- AI Memory Systems (62)
- Advanced AI Reasoning (59)
- Git Management Tools (58)
- Cloud Storage (51)
- Entertainment & Media (43)
- Virtualization (42)
- Location Services (35)
- Web Automation & Stealth (32)
- Media Content Processing (32)
- Calendar Management (26)
- Ecommerce & Retail (18)
- Speech Processing (18)
- Customer Data Platforms (16)
- Travel & Transportation (14)
- Education & Learning Tools (13)
- Home Automation & IoT (13)
- Web Search Integration (12)
- Health & Wellness (10)
- Customer Support (10)
- Marketing (9)
- Games & Gamification (8)
- Google Cloud Integrations (7)
- Art & Culture (4)
- Language Translation (3)
- Legal & Compliance (2)
Creating an MCP Server in Go and Serving it with Docker (part 2)
CF-MCP-Server
Database MCP Server
A Model Context Protocol (MCP) server that provides tools for connecting to and working with a variety of database systems (SQLite, PostgreSQL, MySQL/MariaDB, SQL Server) through a unified interface.
MCP (Model Context Protocol) Research
Research and documentation on Model Context Protocol (MCP) servers and their implementations.
MCP Tools
Mirror
MCP Server: VS Code Extensions Installer
An MCP tool for automatically installing VS Code extensions into Cursor.
LibreChat MCP Servers
Instructions for running SuperGateway MCP servers in Docker containers alongside a Docker deployment of LibreChat, covering prerequisites, Dockerfile creation, Docker Compose configuration, and important considerations.

**Important Considerations Before You Begin:**

* **Complexity:** Running SuperGateway MCP servers adds complexity to a LibreChat deployment. Make sure you understand the implications and have a good grasp of Docker and networking concepts.
* **Resource requirements:** Each MCP server consumes CPU, memory, and network resources. Plan your server capacity accordingly.
* **Security:** Expose only the ports you need and implement appropriate authentication and authorization mechanisms.
* **API keys:** You need an API key for each model provider you want to use with SuperGateway.
* **SuperGateway documentation:** This is a general guide; the official SuperGateway documentation is the definitive source for up-to-date information and specific configuration options.

**Step 1: Prerequisites**

* **Docker:** installed and running on your system.
* **Docker Compose:** essential for managing multi-container applications; install it if you haven't already.
* **A working LibreChat Docker deployment:** these instructions assume you are adding SuperGateway to an existing setup.
* **SuperGateway MCP server binaries:** obtain these from the SuperGateway project's releases or build them from source, for your system architecture.
* **API keys** for the models you intend to use with SuperGateway.
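Step 2 below keeps each MCP server's files in its own directory. That layout can be scaffolded up front with a short script; the directory names (`openai_mcp`, `anthropic_mcp`) and the placeholder file names inside the Dockerfile are just the illustrative ones used in this guide, not anything SuperGateway requires:

```shell
#!/bin/sh
# Scaffold one directory per MCP server, each holding its own Dockerfile
# and a placeholder config. Substitute your own server names.
set -eu

for server in openai_mcp anthropic_mcp; do
  mkdir -p "$server"

  # Template Dockerfile matching the one in Step 2; the binary and config
  # names (mcp_server, config.json) are placeholders to replace.
  cat > "$server/Dockerfile" <<'EOF'
FROM alpine:latest
RUN apk update && apk add --no-cache bash curl jq
WORKDIR /app
COPY mcp_server /app/mcp_server
COPY config.json /app/config.json
RUN chmod +x /app/mcp_server
EXPOSE 8000
CMD ["./mcp_server", "--config", "config.json"]
EOF

  # Empty placeholder config; supply real settings later. Keep API keys
  # in the environment or a .env file, not committed in this file.
  [ -f "$server/config.json" ] || echo '{}' > "$server/config.json"
done

echo "Scaffolded: openai_mcp anthropic_mcp"
```

After running it, drop each server's binary into its directory and fill in its config.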
**Step 2: Create Dockerfiles for Each MCP Server**

You need a separate Dockerfile for each MCP server you want to run. Here is a template to adapt for each one:

```dockerfile
# Use a base image that suits your needs. Alpine is lightweight.
FROM alpine:latest

# Update the package index and install necessary dependencies.
RUN apk update && apk add --no-cache bash curl jq

# Create a directory for the MCP server.
RUN mkdir -p /app

# Copy the MCP server binary. Replace 'mcp_server' with the actual filename.
COPY mcp_server /app/mcp_server

# Copy the configuration file. Replace 'config.json' with your actual config file.
COPY config.json /app/config.json

# Make the binary executable.
RUN chmod +x /app/mcp_server

# Expose the port the MCP server will listen on. Adjust as needed.
EXPOSE 8000

# Set the working directory.
WORKDIR /app

# Command to run the MCP server. Adjust the arguments as needed.
CMD ["./mcp_server", "--config", "config.json"]
```

**Explanation:**

* `FROM alpine:latest`: a lightweight Alpine Linux base image; choose another (e.g. `ubuntu:latest`) if you prefer.
* `RUN apk update && apk add --no-cache bash curl jq`: installs `bash`, `curl`, and `jq`, common utilities useful for debugging or interacting with the server; adjust the package list as needed.
* `COPY mcp_server /app/mcp_server`: copies the server binary into the image. **Replace `mcp_server` with the actual filename of your MCP server binary.**
* `COPY config.json /app/config.json`: copies the configuration file into the image. **Replace `config.json` with the actual filename of your configuration file.**
* `EXPOSE 8000`: **change this to the port your MCP server is configured to listen on.**
* `CMD ["./mcp_server", "--config", "config.json"]`: the command run when the container starts. **Adjust the arguments to match your MCP server's requirements**; `--config` is a common way to specify the configuration file.

**Example: Dockerfile for an OpenAI MCP Server**

For an MCP server dedicated to OpenAI models, the Dockerfile might look like this:

```dockerfile
FROM alpine:latest
RUN apk update && apk add --no-cache bash curl jq
RUN mkdir -p /app
COPY openai_mcp_server /app/openai_mcp_server
COPY openai_config.json /app/openai_config.json
RUN chmod +x /app/openai_mcp_server
# Assuming the OpenAI MCP server listens on port 8001.
EXPOSE 8001
WORKDIR /app
CMD ["./openai_mcp_server", "--config", "openai_config.json"]
```

**Important:**

* **Keep each MCP server's files in its own directory** to stay organized, for example:
  * `openai_mcp/Dockerfile`, `openai_mcp/openai_mcp_server`, `openai_mcp/openai_config.json`
  * `anthropic_mcp/Dockerfile`, `anthropic_mcp/anthropic_mcp_server`, `anthropic_mcp/anthropic_config.json`
  * And so on.
* **Customize each server's `config.json`** with its API keys, model configuration, and other settings specific to that server. Refer to the SuperGateway documentation for the correct format.

**Step 3: Create a Docker Compose File**

Create a `docker-compose.yml` file that orchestrates the LibreChat container and the MCP server containers:

```yaml
version: "3.9"

services:
  librechat:
    # Your existing LibreChat service definition. This is just an example.
    image: ghcr.io/danny-dann/librechat:latest
    ports:
      - "3080:3080"
    environment:
      - VITE_APP_API_URL=http://localhost:3080
      - OPENAI_API_KEY=${OPENAI_API_KEY}  # If you're still using OpenAI directly
      # ... other LibreChat environment variables ...
    volumes:
      - librechat_data:/data
    depends_on:
      - mongodb

  mongodb:
    # Your existing MongoDB service definition. This is just an example.
    image: mongo:latest
    ports:
      - "27017:27017"
    volumes:
      - mongodb_data:/data/db

  openai_mcp:
    build: ./openai_mcp
    ports:
      - "8001:8001"
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}  # Pass the API key to the container
    depends_on:
      - librechat  # Ensure LibreChat starts before the MCP server

  anthropic_mcp:
    build: ./anthropic_mcp
    ports:
      - "8002:8002"
    environment:
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
    depends_on:
      - librechat

  # Add more MCP server definitions here as needed.

volumes:
  librechat_data:
  mongodb_data:
```

**Explanation:**

* `librechat:` / `mongodb:`: your existing service definitions — **keep them as they are**; the blocks above are only basic examples.
* `openai_mcp:`: defines the OpenAI MCP server service.
  * `build: ./openai_mcp`: builds the image from the `Dockerfile` in the `./openai_mcp` directory.
  * `ports: - "8001:8001"`: maps host port 8001 to container port 8001. **Adjust to the port your MCP server listens on.**
  * `environment: - OPENAI_API_KEY=${OPENAI_API_KEY}`: passes the API key into the container. **This is crucial.** Set the variable on your host or in a `.env` file.
  * `depends_on: - librechat`: starts LibreChat before the MCP server, which matters if the server needs to connect to LibreChat at startup.
* `anthropic_mcp:`: the Anthropic MCP server service, analogous to `openai_mcp` with its own settings.
* `volumes:`: **keep your existing volumes for LibreChat and MongoDB.**

**Important:**

* **Replace the example LibreChat and MongoDB service definitions with your actual configurations.**
* **Add a service definition per MCP server**, each with its own `build`, `ports`, `environment`, and `depends_on` settings.
* **Set the API-key environment variables**, for example in a `.env` file next to `docker-compose.yml`:

  ```
  OPENAI_API_KEY=sk-your-openai-api-key
  ANTHROPIC_API_KEY=your-anthropic-api-key
  ```

  Docker Compose loads these automatically when you run `docker-compose up`. **Do not commit your `.env` file to version control!**
* **Adjust `depends_on` as needed** if one MCP server depends on another.

**Step 4: Configure LibreChat to Use SuperGateway**

Configure LibreChat to send requests to the SuperGateway MCP servers, typically by setting the `API_URL` (or similar) option in LibreChat to the SuperGateway endpoint. **This is the most crucial step and depends heavily on how SuperGateway is integrated with LibreChat.** Consult the SuperGateway and LibreChat documentation for the specific configuration options.
**Example (conceptual):** if SuperGateway exposes an endpoint like `http://localhost:8000/chat`, you might set the `VITE_APP_API_URL` environment variable in your LibreChat service definition:

```yaml
librechat:
  # ...
  environment:
    - VITE_APP_API_URL=http://localhost:8000/chat  # Example! Adjust as needed.
  # ...
```

**Important:**

* **The `API_URL` (or equivalent) setting in LibreChat must point at the correct SuperGateway endpoint.** This is how LibreChat knows to send requests to SuperGateway instead of directly to the model providers.
* **You may need to configure SuperGateway to route requests to the appropriate MCP server based on the model in use.** This is typically done in the SuperGateway configuration file.
* **Test your configuration thoroughly.** Confirm that LibreChat sends requests to SuperGateway and that SuperGateway routes them to the right MCP servers.

**Step 5: Build and Run the Docker Compose Project**

Finally, build and run the project:

```bash
docker-compose up --build
```

This command will:

1. Build the Docker images for each service (including the MCP servers).
2. Create and start the containers.

**Step 6: Verify and Test**

* **Check the logs:** use `docker-compose logs -f` to view each service's logs; look for errors or warnings.
* **Test LibreChat:** open LibreChat in your browser and send requests to different models. Verify that requests are routed through SuperGateway and that the responses are correct.
* **Monitor resource usage:** use `docker stats` to confirm the MCP servers are not consuming excessive resources.

**Troubleshooting**

* **A container fails to start:** check its logs. Common causes are incorrect configuration, missing dependencies, or port conflicts.
* **LibreChat cannot connect to SuperGateway:** verify the `API_URL` (or equivalent) setting in LibreChat, and check network connectivity between the LibreChat and SuperGateway containers.
* **SuperGateway cannot connect to an MCP server:** verify the server's address and port in the SuperGateway configuration file, and check network connectivity between the containers.
* **API key errors:** confirm that the correct API keys were provided to the MCP servers and that they are valid with sufficient permissions.

**Summary of Key Points**

* Running SuperGateway MCP servers adds complexity; make sure you understand Docker and networking concepts.
* Each MCP server consumes resources; plan server capacity accordingly.
* Expose only the necessary ports and implement proper authentication and authorization.
* Each model used through SuperGateway needs its own API key.
* Refer to the official SuperGateway documentation for up-to-date configuration options.
* Create a separate directory for each MCP server.
* Do not commit your `.env` file to version control!
* Make sure LibreChat's `API_URL` setting points at the SuperGateway endpoint.
* Test your configuration thoroughly.

This is a complex setup, so be prepared to troubleshoot and to consult the documentation for both LibreChat and SuperGateway.
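One refinement to the Compose file from Step 3: a `healthcheck` lets `docker compose ps` flag an MCP container that starts but stops answering. A sketch, assuming the server answers plain HTTP on its exposed port; the `/health` path is hypothetical, so substitute whatever endpoint your MCP server actually serves:

```yaml
  openai_mcp:
    build: ./openai_mcp
    ports:
      - "8001:8001"
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}
    healthcheck:
      # curl is available because the Alpine image from Step 2 installs it.
      test: ["CMD", "curl", "-fsS", "http://localhost:8001/health"]
      interval: 30s
      timeout: 5s
      retries: 3
    depends_on:
      - librechat
```

With this in place, a crashed or hung MCP server shows up as `unhealthy` instead of silently failing requests.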
Gemini Flash MCP - Image Generation for Roo Code
An MCP server for image generation with Google Gemini 2.0 Flash.
Venice AI Image Generator MCP Server
A test of MCP server features (Venice and Gemini image generation).
Limitless MCP Integration
A Model Context Protocol server, client, and interactive mode for the Limitless API.
Template Redmine Plugin
Goose FM
An MVP (minimum viable product) of an MCP server that lets an AI assistant tune in to FM radio stations.
Fillout.io MCP Server
Enables form management, response handling, and analytics via the Fillout.io API, enhancing form interactions and insights.
Pandora's Shell
An MCP server that lets AI assistants execute terminal commands through a secure shell interface.
Deno 2 Playwright Model Context Protocol Server Example
Mirror
Memory MCP Server
Mirror
MCP GitHub
A repository created with the GitHub MCP server.
Google Analytics MCP Server
Mirror
MCP Server
A Model Context Protocol (MCP) server designed for local execution over STDIO.
Data BI MCP Server
An MCP (Model Context Protocol) server for data transformation and BI charting that lets AI assistants connect to your data sources, transform data, and generate high-quality visualizations from natural-language requests.
Bishop MCP (Master Control Program)
An advanced MCP server script I developed and would like to share.
Rootly MCP Server
Mirror
MCP Spotify Server
WIP: MCP Server Superset
A Model Context Protocol server that lets large language models interact with Apache Superset databases through its REST API, supporting database queries, table lookups, field information retrieval, and SQL execution.
Microsoft SQL Server MCP Server
Mirror
postgres-mcp MCP server
Postgres Pro is an open-source Model Context Protocol (MCP) server built to support you and your AI agents throughout the development process, from initial coding through testing and deployment to production tuning and maintenance.
What is Model Context Protocol (MCP)?
A lightweight Model Context Protocol (MCP) server that lets LLMs validate email addresses. It uses the AbstractAPI Email Validation API to check email format, domain validity, and deliverability. Ideal for integrating email validation into AI applications such as Claude Desktop.
My Slack MCP Server Extension
An extension for the Slack MCP server.
Waldzell MCP Servers
Waldzell AI's monorepo of MCP servers. Use them with Claude Desktop, Cline, Roo Code, and more!
Cortellis MCP Server
An MCP server that lets AI assistants search and analyze drug data through Cortellis, with comprehensive drug search and ontology exploration capabilities.