Discover Awesome MCP Servers
Extend your agent with 25,254 capabilities via MCP servers.
- All (25,254)
- Developer Tools (3,867)
- Search (1,714)
- Research & Data (1,557)
- AI Integration Systems (229)
- Cloud Platforms (219)
- Data & App Analysis (181)
- Database Interaction (177)
- Remote Shell Execution (165)
- Browser Automation (147)
- Databases (145)
- Communication (137)
- AI Content Generation (127)
- OS Automation (120)
- Programming Docs Access (109)
- Content Fetching (108)
- Note Taking (97)
- File Systems (96)
- Version Control (93)
- Finance (91)
- Knowledge & Memory (90)
- Monitoring (79)
- Security (71)
- Image & Video Processing (69)
- Digital Note Management (66)
- AI Memory Systems (62)
- Advanced AI Reasoning (59)
- Git Management Tools (58)
- Cloud Storage (51)
- Entertainment & Media (43)
- Virtualization (42)
- Location Services (35)
- Web Automation & Stealth (32)
- Media Content Processing (32)
- Calendar Management (26)
- Ecommerce & Retail (18)
- Speech Processing (18)
- Customer Data Platforms (16)
- Travel & Transportation (14)
- Education & Learning Tools (13)
- Home Automation & IoT (13)
- Web Search Integration (12)
- Health & Wellness (10)
- Customer Support (10)
- Marketing (9)
- Games & Gamification (8)
- Google Cloud Integrations (7)
- Art & Culture (4)
- Language Translation (3)
- Legal & Compliance (2)
Perplexity MCP Server
Mirror of
MCP Games Server
GitHub MCP Server
mariadb-mcp-server
An MCP server that provides read-only access to MariaDB.
TimezoneToolkit MCP Server
An advanced MCP server providing comprehensive time and timezone tools.
AWS Model Context Protocol (MCP) Server
A lightweight service that lets AI assistants execute AWS CLI commands through the Model Context Protocol (MCP), enabling AI tools to retrieve AWS documentation and interact with AWS services.
MCP Command History
A powerful tool for browsing, searching, and managing your shell command history through an MCP (Model Control Protocol) interface. This project makes it easy to access and search previously executed shell commands.
Gitee MCP Server
Gitee API integration: manage repositories, issues, pull requests, and more.
Memory Bank MCP Server 2.2.1
A server that manages project documentation and context across Claude AI sessions through global and branch-specific memory banks, enabling consistent knowledge management via structured JSON document storage.
MCP Server
MCP server implementation for handling run_python requests
MCP Server - Oracle DB Context
An MCP server for working with large Oracle databases.
MCP Chunk Editor
An MCP server providing an efficient, safe text editor for LLMs.
kagi-server MCP Server
Mirror of
vigilant-adventure
Hey, I wanted to play with some mods, but when I try to open the game it says: "The game crashed whilst initializing game. Error: java.lang.NoClassDefFoundError: cpw/mods/fml/common/IPlayerTracker. Exit Code: -1". What can I do to make it work? The log is: ---- Minecraft Crash Report ---- WARNING: coremods are present: MekanismCoremod (Mekanism-1.12.2-9…
Better Qdrant MCP Server
A Model Context Protocol server that enables semantic search by providing tools to manage Qdrant vector database collections, process and embed documents using various embedding services, and perform semantic search across vector embeddings.
.NET MCP Servers
Collection of my MCP (Model Context Protocol) servers written in .NET
Figma to Vue MCP Server
MCP server that generates Vue components from Figma designs following Hostinger's design system
MCP Image Generation Server
Mirror of
WCGW
Send code snippet and paths to Claude. Designed to work with wcgw mcp server.
artifacts-mcp
MCP Server for Artifacts MMO
Knowledge Graph Memory Server
Mirror of
mcp_server
mem0 MCP Server
A TypeScript implementation of a Model Context Protocol server that enables creating, managing, and semantically searching memory streams through Mem0 integration.
mcp-server-bluesky
Mirror of
Model Context Protocol (MCP) Server 🚀
mcp-server-web3
A web3 function plugin server based on Anthropic's MCP.
Prompt Decorators
A standardized framework for enhancing how LLMs process and respond to prompts through composable decorators, featuring an official open standard specification and Python reference implementation with MCP server integration.
McpDocs
Provide documentation about your Elixir project's functions and functions of dependencies to an LLM through an SSE MCP server.
mpc-csharp-semantickernel
## Microsoft Semantic Kernel with OpenAI and MCP Server: Example Usage

This example demonstrates how to use Microsoft Semantic Kernel (SK) with OpenAI for language model interaction and integrate it with an MCP (Management and Configuration Platform) server for configuration management. This is a conceptual example, as the specific implementation of an MCP server will vary.

**Conceptual Overview:**

1. **Configuration:** The MCP server holds configuration data such as OpenAI API keys, model names, and other settings.
2. **Initialization:** The Semantic Kernel is initialized, retrieving configuration from the MCP server.
3. **Skill Definition:** Skills are defined using prompts and, potentially, native code.
4. **Execution:** The Kernel executes skills, leveraging OpenAI for language model interaction.
5. **Dynamic Updates:** The MCP server can update configurations, and the Kernel can be reconfigured dynamically (implementation dependent).

**Code Example (Conceptual - Python):**

```python
import asyncio
import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion, OpenAITextCompletion
import requests  # For interacting with the MCP server

# --- 1. Configuration Retrieval from MCP Server ---
def get_config_from_mcp(config_url):
    """Retrieves configuration from the MCP server."""
    try:
        response = requests.get(config_url)
        response.raise_for_status()  # Raise HTTPError for bad responses (4xx or 5xx)
        return response.json()
    except requests.exceptions.RequestException as e:
        print(f"Error fetching configuration from MCP: {e}")
        return None

# MCP Server URL (replace with your actual MCP server endpoint)
MCP_CONFIG_URL = "http://your-mcp-server/config/sk"

config = get_config_from_mcp(MCP_CONFIG_URL)
if config is None:
    print("Failed to retrieve configuration. Exiting.")
    exit()

# Extract configuration values
OPENAI_API_KEY = config.get("openai_api_key")
OPENAI_ORG_ID = config.get("openai_org_id", None)  # Optional
OPENAI_CHAT_MODEL = config.get("openai_chat_model", "gpt-3.5-turbo")
OPENAI_TEXT_MODEL = config.get("openai_text_model", "text-davinci-003")  # Optional

# --- 2. Kernel Initialization ---
kernel = sk.Kernel()

# Add OpenAI connector
try:
    kernel.add_chat_service(
        "openai-chat",
        OpenAIChatCompletion(OPENAI_CHAT_MODEL, OPENAI_API_KEY, OPENAI_ORG_ID),
    )
    # Optionally add a text completion service if needed
    # kernel.add_text_completion_service(
    #     "openai-text", OpenAITextCompletion(OPENAI_TEXT_MODEL, OPENAI_API_KEY, OPENAI_ORG_ID)
    # )
except Exception as e:
    print(f"Error initializing OpenAI connector: {e}")
    exit()

# --- 3. Skill Definition ---
# Define a skill using a prompt template
prompt_template = """
You are a helpful assistant. Answer the user's question concisely.

User: {{ $input }}
Assistant: """

summarize_skill = kernel.create_semantic_function(
    prompt_template=prompt_template,
    description="Answers user questions concisely.",
    max_tokens=200,
    temperature=0.7,
    top_p=1,
)

# --- 4. Execution ---
async def main():
    user_question = "What is the capital of France?"
    result = await summarize_skill.invoke_async(input=user_question)
    print(f"User Question: {user_question}")
    print(f"Answer: {result}")

asyncio.run(main())

# --- 5. Dynamic Updates (Conceptual) ---
# This is a simplified example. A real implementation would likely involve:
# - A mechanism for the MCP server to notify the application of configuration changes.
# - A function to reload the configuration and re-initialize the Kernel.
def reload_config():
    """Reloads the configuration from the MCP server and re-initializes the Kernel."""
    global kernel, OPENAI_API_KEY, OPENAI_CHAT_MODEL
    new_config = get_config_from_mcp(MCP_CONFIG_URL)
    if new_config:
        new_openai_api_key = new_config.get("openai_api_key")
        new_openai_chat_model = new_config.get("openai_chat_model", "gpt-3.5-turbo")
        if (new_openai_api_key != OPENAI_API_KEY
                or new_openai_chat_model != OPENAI_CHAT_MODEL):
            print("Configuration changed. Re-initializing Kernel.")
            OPENAI_API_KEY = new_openai_api_key
            OPENAI_CHAT_MODEL = new_openai_chat_model
            # Re-initialize the Kernel with the new configuration
            kernel = sk.Kernel()
            try:
                kernel.add_chat_service(
                    "openai-chat",
                    OpenAIChatCompletion(OPENAI_CHAT_MODEL, OPENAI_API_KEY, OPENAI_ORG_ID),
                )
                print("Kernel re-initialized successfully.")
            except Exception as e:
                print(f"Error re-initializing OpenAI connector: {e}")
    else:
        print("Failed to retrieve new configuration.")

# Example of calling the reload function (in a real application,
# this would be triggered by an event):
# reload_config()
```

**Explanation:**

* **MCP Integration:** The code uses the `requests` library to fetch configuration data from a hypothetical MCP server. `MCP_CONFIG_URL` must be replaced with the actual endpoint of your MCP server, which is expected to return a JSON object containing the necessary configuration parameters.
* **Configuration Loading:** The `get_config_from_mcp` function retrieves the configuration and handles potential errors.
* **Kernel Initialization:** The `OPENAI_API_KEY` and `OPENAI_CHAT_MODEL` values are read from the configuration and used to initialize the Semantic Kernel with the OpenAI connector.
* **Skill Definition:** A simple skill is defined using a prompt template. It takes user input and uses the OpenAI chat model to provide a concise answer.
* **Execution:** The skill is executed with a sample user question.
* **Dynamic Updates (Conceptual):** The `reload_config` function demonstrates how the Kernel could be reconfigured dynamically when the configuration changes on the MCP server. This is a simplified example; a real implementation would need a more robust mechanism for detecting and handling configuration changes, such as polling the MCP server periodically or using a push-based mechanism (e.g., webhooks).

**Key Considerations for MCP Integration:**

* **Security:** Store and transmit the OpenAI API key securely; consider encryption or other safeguards. The MCP server itself should be secured to prevent unauthorized access to the configuration data.
* **Error Handling:** Implement robust error handling to gracefully handle situations where the MCP server is unavailable or returns invalid configuration data.
* **Configuration Format:** Define a clear and consistent format for the configuration data stored on the MCP server.
* **Dynamic Updates:** Choose an appropriate mechanism for detecting and handling configuration changes, weighing the performance implications of polling the MCP server frequently.
* **Version Control:** Version the configuration data on the MCP server to allow easy rollback to previous configurations.
* **Authentication/Authorization:** Implement authentication and authorization mechanisms to control access to the MCP server and the configuration data.

**Korean Translation of Key Concepts:**

* **Microsoft Semantic Kernel:** 마이크로소프트 시맨틱 커널
* **OpenAI:** OpenAI (오픈에이아이)
* **MCP Server (Management and Configuration Platform):** MCP 서버 (관리 및 구성 플랫폼)
* **Configuration:** 구성 (구성 설정)
* **API Key:** API 키
* **Model Name:** 모델 이름
* **Skill:** 스킬 (기능)
* **Prompt Template:** 프롬프트 템플릿
* **Dynamic Updates:** 동적 업데이트
* **Kernel Initialization:** 커널 초기화
* **Execution:** 실행
* **Error Handling:** 오류 처리
* **Security:** 보안
* **Authentication:** 인증
* **Authorization:** 권한 부여

This example provides a starting point for integrating Microsoft Semantic Kernel with OpenAI and an MCP server. You will need to adapt the code to your specific requirements and environment. Remember to replace the placeholder values with your actual configuration values and MCP server endpoint.
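The "Configuration Format" consideration above can be made concrete with a small sketch. The JSON shape and field names below (`openai_api_key`, `openai_org_id`, `openai_chat_model`) are assumptions for illustration, matching the conceptual example rather than any real MCP server's contract; the validator applies the same defaults the example uses, so malformed payloads fail early instead of surfacing later as connector errors.

```python
import json

# Hypothetical config payload an MCP endpoint might return (assumed shape).
RAW_CONFIG = json.loads("""
{
  "openai_api_key": "sk-placeholder",
  "openai_chat_model": "gpt-4"
}
""")

def extract_sk_config(config):
    """Validate an MCP config payload and apply the example's defaults."""
    if not config.get("openai_api_key"):
        raise ValueError("MCP config is missing required field 'openai_api_key'")
    return {
        "openai_api_key": config["openai_api_key"],
        "openai_org_id": config.get("openai_org_id"),  # optional, defaults to None
        "openai_chat_model": config.get("openai_chat_model", "gpt-3.5-turbo"),
    }

sk_config = extract_sk_config(RAW_CONFIG)
print(sk_config["openai_chat_model"])  # falls back to "gpt-3.5-turbo" when absent
```

Centralizing defaults and required-field checks in one function like this keeps the kernel-initialization code free of `None` handling and gives a single place to evolve the configuration contract.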
Linear MCP Server
Mirror of