Discover Awesome MCP Servers

Extend your agent with 16,638 capabilities via MCP servers.

SkySQL MCP Integration

mcp-cbs-cijfers-open-data

An MCP server for working with CBS Cijfers Open Data.

Prompt Decorators

A standardized framework for enhancing how LLMs process and respond to prompts through composable decorators, featuring an official open standard specification and Python reference implementation with MCP server integration.
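The core idea of composable decorators can be illustrated with a minimal sketch. The function names and the `+++Decorator` prefix syntax below are illustrative assumptions, not the project's actual API:

```python
# Minimal sketch of composable prompt decorators.
# Hypothetical names; not the Prompt Decorators project's actual API.
from typing import Callable

PromptDecorator = Callable[[str], str]

def reasoning(prompt: str) -> str:
    """Prepend a directive asking the model to show its reasoning."""
    return f"+++Reasoning\n{prompt}"

def output_format(fmt: str) -> PromptDecorator:
    """Return a decorator that requests a specific output format."""
    def decorate(prompt: str) -> str:
        return f"+++OutputFormat(format={fmt})\n{prompt}"
    return decorate

def compose(*decorators: PromptDecorator) -> PromptDecorator:
    """Apply decorators so the first argument ends up outermost."""
    def decorate(prompt: str) -> str:
        for d in reversed(decorators):
            prompt = d(prompt)
        return prompt
    return decorate

decorated = compose(reasoning, output_format("json"))("List three MCP servers.")
print(decorated)
```

Because each decorator is just a string-to-string function, composition is plain function composition, which is what makes the decorators freely stackable.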

Model Context Protocol (MCP) Server 🚀

mcp-server-bluesky

Mirror of

artifacts-mcp

MCP Server for Artifacts MMO

Knowledge Graph Memory Server

Mirror

MCP Server Docker

An MCP server for Docker.

Weather MCP Server

Flights Mcp Server

An MCP server for Google Flights!

gatherings MCP Server

A Model Context Protocol server for tracking social event expenses and calculating reimbursements, making it easy to settle costs between friends.

Linear MCP Server

Mirror of

testmcpgithubdemo1

Created from an MCP server demo.

Simple Memory Extension MCP Server

An MCP server that extends an AI agent's context window by providing tools to store, retrieve, and search memories, allowing the agent to maintain history and context across long interactions.

Telegram MCP Server

MCP server to send notifications to Telegram
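Under the hood, sending a Telegram notification typically reduces to one call to the Bot API's `sendMessage` method. A minimal sketch of building that request (the token and chat id are placeholders, and this is not necessarily how this particular server is implemented):

```python
# Sketch: constructing a Telegram Bot API sendMessage request URL.
# "<token>" and the chat id are placeholders; an MCP server would read
# them from its configuration.
import urllib.parse

def build_send_message_url(bot_token: str, chat_id: str, text: str) -> str:
    """Build the URL for the Bot API's sendMessage method."""
    query = urllib.parse.urlencode({"chat_id": chat_id, "text": text})
    return f"https://api.telegram.org/bot{bot_token}/sendMessage?{query}"

url = build_send_message_url("<token>", "12345", "Job finished")
print(url)
```

An actual notification is then a single HTTP GET or POST to that URL.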

ChatGPT MCP Server

Mirror of

Apache Doris MCP Server

An MCP server for Apache Doris & VeloDB

Choose MCP Server Setup

Mirror

mock-assistant-mcp-server

A mock data assistant for MCP servers.

McpDocs

Provide documentation about your Elixir project's functions and functions of dependencies to an LLM through an SSE MCP server.

MCP System Monitor

A system monitoring tool that exposes system metrics via the Model Context Protocol (MCP). This tool allows LLMs to retrieve real-time system information through an MCP-compatible interface.
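The kind of real-time metrics such a tool might expose can be sketched with the standard library alone. This is an illustration of the idea, not this project's actual tool schema:

```python
# Sketch: a few system metrics an MCP monitoring tool might expose.
# Illustrative only; the real MCP System Monitor defines its own tools.
import os
import platform
import shutil

def collect_system_metrics() -> dict:
    """Gather basic real-time system information using only the stdlib."""
    disk = shutil.disk_usage("/")
    return {
        "platform": platform.system(),          # e.g. "Linux", "Darwin"
        "cpu_count": os.cpu_count(),            # logical CPU count
        "disk_total_gb": round(disk.total / 1e9, 1),
        "disk_free_gb": round(disk.free / 1e9, 1),
    }

metrics = collect_system_metrics()
print(metrics)
```

An MCP server would wrap a function like this as a tool so an LLM can request the metrics through the protocol.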

mpc-csharp-semantickernel

## Microsoft Semantic Kernel with OpenAI and MCP Server: Example Usage

This example demonstrates how to use Microsoft Semantic Kernel (SK) with OpenAI for language model interaction and integrate it with an MCP (Management and Configuration Platform) server for configuration management. This is a conceptual example, as the specific implementation of an MCP server will vary.

**Conceptual Overview:**

1. **Configuration:** The MCP server holds configuration data like OpenAI API keys, model names, and other settings.
2. **Initialization:** The Semantic Kernel is initialized, retrieving configuration from the MCP server.
3. **Skill Definition:** Skills are defined using prompts and potentially native code.
4. **Execution:** The Kernel executes skills, leveraging OpenAI for language model interaction.
5. **Dynamic Updates:** The MCP server can update configurations, and the Kernel can be reconfigured dynamically (implementation dependent).

**Code Example (Conceptual - Python):**

```python
import asyncio

import requests  # For interacting with the MCP server
import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import (
    OpenAIChatCompletion,
    OpenAITextCompletion,
)

# --- 1. Configuration Retrieval from MCP Server ---
def get_config_from_mcp(config_url):
    """Retrieves configuration from the MCP server."""
    try:
        response = requests.get(config_url)
        response.raise_for_status()  # Raise HTTPError for bad responses (4xx or 5xx)
        return response.json()
    except requests.exceptions.RequestException as e:
        print(f"Error fetching configuration from MCP: {e}")
        return None

# MCP server URL (replace with your actual MCP server endpoint)
MCP_CONFIG_URL = "http://your-mcp-server/config/sk"

config = get_config_from_mcp(MCP_CONFIG_URL)
if config is None:
    print("Failed to retrieve configuration. Exiting.")
    exit()

# Extract configuration values
OPENAI_API_KEY = config.get("openai_api_key")
OPENAI_ORG_ID = config.get("openai_org_id", None)  # Optional
OPENAI_CHAT_MODEL = config.get("openai_chat_model", "gpt-3.5-turbo")
OPENAI_TEXT_MODEL = config.get("openai_text_model", "text-davinci-003")  # Optional

# --- 2. Kernel Initialization ---
kernel = sk.Kernel()

# Add OpenAI connector
try:
    kernel.add_chat_service(
        "openai-chat",
        OpenAIChatCompletion(OPENAI_CHAT_MODEL, OPENAI_API_KEY, OPENAI_ORG_ID),
    )
    # Optionally add a text completion service if needed
    # kernel.add_text_completion_service(
    #     "openai-text",
    #     OpenAITextCompletion(OPENAI_TEXT_MODEL, OPENAI_API_KEY, OPENAI_ORG_ID),
    # )
except Exception as e:
    print(f"Error initializing OpenAI connector: {e}")
    exit()

# --- 3. Skill Definition ---
# Define a skill using a prompt template
prompt_template = """
You are a helpful assistant. Answer the user's question concisely.

User: {{ $input }}
Assistant:
"""

summarize_skill = kernel.create_semantic_function(
    prompt_template=prompt_template,
    description="Answers user questions concisely.",
    max_tokens=200,
    temperature=0.7,
    top_p=1,
)

# --- 4. Execution ---
async def main():
    user_question = "What is the capital of France?"
    result = await summarize_skill.invoke_async(input=user_question)
    print(f"User Question: {user_question}")
    print(f"Answer: {result}")

asyncio.run(main())

# --- 5. Dynamic Updates (Conceptual) ---
# This is a simplified example. A real implementation would likely involve:
# - A mechanism for the MCP server to notify the application of configuration changes.
# - A function to reload the configuration and re-initialize the Kernel.
def reload_config():
    """Reloads the configuration from the MCP server and re-initializes the Kernel."""
    global kernel, OPENAI_API_KEY, OPENAI_CHAT_MODEL
    new_config = get_config_from_mcp(MCP_CONFIG_URL)
    if new_config:
        new_openai_api_key = new_config.get("openai_api_key")
        new_openai_chat_model = new_config.get("openai_chat_model", "gpt-3.5-turbo")
        if (new_openai_api_key != OPENAI_API_KEY
                or new_openai_chat_model != OPENAI_CHAT_MODEL):
            print("Configuration changed. Re-initializing Kernel.")
            OPENAI_API_KEY = new_openai_api_key
            OPENAI_CHAT_MODEL = new_openai_chat_model
            # Re-initialize the Kernel with the new configuration
            kernel = sk.Kernel()
            try:
                kernel.add_chat_service(
                    "openai-chat",
                    OpenAIChatCompletion(OPENAI_CHAT_MODEL, OPENAI_API_KEY, OPENAI_ORG_ID),
                )
                print("Kernel re-initialized successfully.")
            except Exception as e:
                print(f"Error re-initializing OpenAI connector: {e}")
    else:
        print("Failed to retrieve new configuration.")

# Example of calling the reload function (in a real application,
# this would be triggered by an event):
# reload_config()
```

**Explanation:**

* **MCP Integration:** The code uses the `requests` library to fetch configuration data from a hypothetical MCP server. The `MCP_CONFIG_URL` needs to be replaced with the actual endpoint of your MCP server. The MCP server is expected to return a JSON object containing the necessary configuration parameters.
* **Configuration Loading:** The `get_config_from_mcp` function retrieves the configuration and handles potential errors.
* **Kernel Initialization:** The `OPENAI_API_KEY` and `OPENAI_CHAT_MODEL` are retrieved from the configuration and used to initialize the Semantic Kernel with the OpenAI connector.
* **Skill Definition:** A simple skill is defined using a prompt template. This skill takes user input and uses the OpenAI chat model to provide a concise answer.
* **Execution:** The skill is executed with a sample user question inside an async entry point, since `invoke_async` must be awaited.
* **Dynamic Updates (Conceptual):** The `reload_config` function demonstrates how the Kernel could be reconfigured dynamically if the configuration changes on the MCP server. This is a simplified example; a real implementation would likely involve a more robust mechanism for detecting and handling configuration changes, such as polling the MCP server periodically or a push-based mechanism (e.g., webhooks).

**Key Considerations for MCP Integration:**

* **Security:** Securely store and transmit the OpenAI API key, using encryption or other security measures. The MCP server should also be secured to prevent unauthorized access to the configuration data.
* **Error Handling:** Implement robust error handling to gracefully handle situations where the MCP server is unavailable or returns invalid configuration data.
* **Configuration Format:** Define a clear and consistent format for the configuration data stored on the MCP server.
* **Dynamic Updates:** Choose an appropriate mechanism for detecting and handling configuration changes, and consider the performance implications of polling the MCP server frequently.
* **Version Control:** Version the configuration data on the MCP server to allow easy rollback to previous configurations.
* **Authentication/Authorization:** Implement authentication and authorization mechanisms to control access to the MCP server and the configuration data.

**Korean Translation of Key Concepts:**

* **Microsoft Semantic Kernel:** 마이크로소프트 시맨틱 커널
* **OpenAI:** OpenAI (오픈에이아이)
* **MCP Server (Management and Configuration Platform):** MCP 서버 (관리 및 구성 플랫폼)
* **Configuration:** 구성 (구성 설정)
* **API Key:** API 키
* **Model Name:** 모델 이름
* **Skill:** 스킬 (기능)
* **Prompt Template:** 프롬프트 템플릿
* **Dynamic Updates:** 동적 업데이트
* **Kernel Initialization:** 커널 초기화
* **Execution:** 실행
* **Error Handling:** 오류 처리
* **Security:** 보안
* **Authentication:** 인증
* **Authorization:** 권한 부여

This example provides a starting point for integrating Microsoft Semantic Kernel with OpenAI and an MCP server. You will need to adapt the code to your specific requirements and environment, replacing the placeholder values with your actual configuration values and MCP server endpoint.

🐋 Docker MCP server

Mirror of

Mcp Servers Wiki Website

Binance Market Data MCP Server

create-mcp-server

A comprehensive architecture for building robust Model Context Protocol (MCP) servers with integrated web capabilities

MCP Server Pool

A collection of MCP servers.

google-workspace-mcp

Linear

mcp-server-fetch-typescript MCP Server

Mirror of