Discover Awesome MCP Servers

Extend your agent with 16,638 capabilities via MCP servers.

SkySQL MCP Integration

mcp-cbs-cijfers-open-data

MCP server for working with CBS Cijfers Open Data

Prompt Decorators

A standardized framework for enhancing how LLMs process and respond to prompts through composable decorators, featuring an official open standard specification and Python reference implementation with MCP server integration.
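The composable-decorator idea can be sketched as plain function composition over prompt strings. This is a minimal illustration only; the decorator names and `compose` helper below are assumptions, not the project's actual API.

```python
# Hypothetical sketch of composable prompt decorators.
# The decorator names are illustrative, not taken from the project.
from typing import Callable

PromptDecorator = Callable[[str], str]

def reasoning(prompt: str) -> str:
    """Ask the model to show its reasoning step by step."""
    return f"{prompt}\n\nThink through this step by step before answering."

def concise(prompt: str) -> str:
    """Ask the model to keep the answer short."""
    return f"{prompt}\n\nAnswer in no more than two sentences."

def compose(*decorators: PromptDecorator) -> PromptDecorator:
    """Apply decorators left to right to build one prompt transformation."""
    def apply(prompt: str) -> str:
        for decorate in decorators:
            prompt = decorate(prompt)
        return prompt
    return apply

decorated = compose(reasoning, concise)
print(decorated("Why is the sky blue?"))
```

Because each decorator is just a string-to-string function, any subset can be stacked without the decorators knowing about each other.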

Model Context Protocol (MCP) Server 🚀

mcp-server-bluesky

Mirror of

artifacts-mcp

MCP Server for Artifacts MMO

Knowledge Graph Memory Server

Mirror of

MCP Server Docker

MCP server for Docker

Weather MCP Server

Flights Mcp Server

MCP Server for Google Flights

gatherings MCP Server

A Model Context Protocol server that tracks expenses and calculates reimbursements for social events, making it easy to settle balances between friends.
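The core of such a tracker is the settlement computation: given what each person paid, work out the minimal transfers that even everyone out. A small sketch of that idea, with an assumed data model (the real server's tool names and storage are not shown here):

```python
# Hypothetical balance-settling sketch; the data model is assumed,
# not taken from the gatherings project itself.
def settle(paid: dict[str, float]) -> list[tuple[str, str, float]]:
    """Compute who pays whom so everyone ends up covering an equal share."""
    share = sum(paid.values()) / len(paid)
    balances = {p: round(amt - share, 2) for p, amt in paid.items()}
    # People below the average owe money; people above are owed money.
    debtors = sorted((p, -b) for p, b in balances.items() if b < 0)
    creditors = sorted((p, b) for p, b in balances.items() if b > 0)
    transfers = []
    i = j = 0
    while i < len(debtors) and j < len(creditors):
        debtor, owed = debtors[i]
        creditor, due = creditors[j]
        amount = round(min(owed, due), 2)
        transfers.append((debtor, creditor, amount))
        owed, due = round(owed - amount, 2), round(due - amount, 2)
        if owed == 0:
            i += 1
        else:
            debtors[i] = (debtor, owed)
        if due == 0:
            j += 1
        else:
            creditors[j] = (creditor, due)
    return transfers

# Alice fronted most of the cost; the others pay her back.
print(settle({"alice": 90.0, "bob": 30.0, "carol": 0.0}))
```

The greedy pairing of largest debtor against largest creditor keeps the transfer list short, which is the usual goal when splitting a bill among friends.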

Linear MCP Server

Mirror of

testmcpgithubdemo1

Created from an MCP server demo

Simple Memory Extension MCP Server

An MCP (Memory, Context, and Persistence) server that extends AI agents' context windows by providing tools to store, retrieve, and search memories, allowing agents to maintain history and context across long interactions.
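The store/retrieve/search trio described above can be sketched as a toy in-memory store. Everything here is an assumption for illustration; the real server's tool names, storage backend, and search strategy are not shown in this listing.

```python
# Toy sketch of an agent memory store; names and storage are assumptions.
import time

class MemoryStore:
    def __init__(self) -> None:
        self._memories: list[dict] = []

    def store(self, text: str) -> int:
        """Save a memory and return its id."""
        mem_id = len(self._memories)
        self._memories.append({"id": mem_id, "text": text, "ts": time.time()})
        return mem_id

    def retrieve(self, mem_id: int) -> str:
        """Fetch a memory by id."""
        return self._memories[mem_id]["text"]

    def search(self, query: str) -> list[str]:
        """Naive keyword search; a real server might use embeddings instead."""
        q = query.lower()
        return [m["text"] for m in self._memories if q in m["text"].lower()]

memory = MemoryStore()
memory.store("User prefers dark mode")
memory.store("Project deadline is Friday")
print(memory.search("deadline"))
```

An MCP server would expose `store`, `retrieve`, and `search` as tools, letting the agent pull old context back into its window on demand rather than carrying the whole history in every prompt.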

Telegram MCP Server

MCP server to send notifications to Telegram

ChatGPT MCP Server

Mirror of

Apache Doris MCP Server

An MCP server for Apache Doris & VeloDB

Choose MCP Server Setup

Mirror of

mock-assistant-mcp-server

An MCP assistant server for mock data

McpDocs

Provides documentation about your Elixir project's functions, and those of its dependencies, to an LLM through an SSE MCP server.

MCP System Monitor

A system monitoring tool that exposes system metrics via the Model Context Protocol (MCP). This tool allows LLMs to retrieve real-time system information through an MCP-compatible interface.
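The kind of metrics snapshot such a server might hand back to an LLM can be sketched with the standard library alone. This is a hedged illustration; the project's actual metric names, schema, and collection method (it may well use a richer library) are assumptions here.

```python
# Illustrative system-metrics snapshot using only the standard library.
# The metric names and JSON shape are assumptions, not the project's schema.
import json
import os
import platform
import shutil

def get_system_metrics() -> dict:
    """Collect a small snapshot of host metrics."""
    disk = shutil.disk_usage("/")
    metrics = {
        "platform": platform.system(),
        "cpu_count": os.cpu_count(),
        "disk_total_gb": round(disk.total / 1e9, 2),
        "disk_free_gb": round(disk.free / 1e9, 2),
    }
    # Load averages are unavailable on Windows, so probe before using them.
    if hasattr(os, "getloadavg"):
        metrics["load_avg_1m"] = os.getloadavg()[0]
    return metrics

# An MCP server would register a function like this as a tool and return
# the JSON payload to the model on request.
print(json.dumps(get_system_metrics(), indent=2))
```

Returning a flat JSON object keeps the tool output easy for a model to quote and reason about.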

mpc-csharp-semantickernel

Okay, here's an example demonstrating how to use Microsoft Semantic Kernel with OpenAI and a hypothetical "MCP Server" (assuming MCP stands for something like "Model Control Plane" or "Model Configuration Provider"). I'll outline the code structure, explain the concepts, and provide a basic implementation. Keep in mind that the "MCP Server" part is conceptual, as there isn't a standard, universally defined MCP Server. You'll need to adapt the MCP Server interaction to your specific implementation.

**Conceptual Overview**

1. **Semantic Kernel:** The core framework for orchestrating AI tasks. It allows you to define skills (functions) that can be chained together to achieve complex goals.
2. **OpenAI:** Provides the large language models (LLMs) that power the AI capabilities. We'll use the OpenAI connector in Semantic Kernel to interact with OpenAI's APIs.
3. **MCP Server (Hypothetical):** This server is responsible for managing model configurations, potentially including:
   * API keys for OpenAI (or other LLM providers).
   * Model names (e.g., "gpt-3.5-turbo", "gpt-4").
   * Temperature, top_p, and other model parameters.
   * Rate limiting policies.
   * Potentially, even A/B testing configurations for different models.

**Code Structure (Conceptual)**

```python
# Assuming you have installed:
# pip install semantic-kernel python-dotenv openai

import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion
import os
from dotenv import load_dotenv
import requests  # For interacting with the MCP Server

# Load environment variables (e.g., from a .env file)
load_dotenv()

# --- 1. MCP Server Interaction (Hypothetical) ---

MCP_SERVER_URL = os.getenv("MCP_SERVER_URL", "http://localhost:8000")  # Default URL

def get_model_config(model_name: str):
    """
    Fetches model configuration from the MCP Server.

    Args:
        model_name: The name of the model to retrieve configuration for.

    Returns:
        A dictionary containing the model configuration, or None if not found.
    """
    try:
        response = requests.get(f"{MCP_SERVER_URL}/models/{model_name}")
        response.raise_for_status()  # Raise HTTPError for bad responses (4xx or 5xx)
        return response.json()
    except requests.exceptions.RequestException as e:
        print(f"Error fetching model config from MCP Server: {e}")
        return None

# --- 2. Semantic Kernel Setup ---

async def main():
    kernel = sk.Kernel()

    # Get OpenAI configuration from MCP Server
    model_name = "gpt-3.5-turbo"  # Or "gpt-4", etc.
    model_config = get_model_config(model_name)

    if model_config is None:
        print("Failed to retrieve model configuration. Using environment variables.")
        # Fallback to environment variables if MCP Server is unavailable
        api_key = os.getenv("OPENAI_API_KEY")
        org_id = os.getenv("OPENAI_ORG_ID")
        if not api_key:
            raise Exception("OpenAI API key not found in environment variables or MCP Server.")
        kernel.add_chat_service("openai", OpenAIChatCompletion(model_name, api_key, org_id))
    else:
        # Use configuration from MCP Server
        api_key = model_config.get("api_key")
        org_id = model_config.get("org_id")
        temperature = model_config.get("temperature", 0.7)  # Default temperature
        top_p = model_config.get("top_p", 1.0)  # Default top_p
        if not api_key:
            raise Exception("OpenAI API key not found in MCP Server configuration.")
        kernel.add_chat_service(
            "openai",
            OpenAIChatCompletion(
                model_name, api_key, org_id, temperature=temperature, top_p=top_p
            ),
        )

    # --- 3. Define a Skill ---

    # Simple example: A skill to summarize text
    summarize_prompt = """
    Summarize the following text:

    {{$text}}
    """

    summarize_function = kernel.create_semantic_function(
        prompt_template=summarize_prompt,
        description="Summarizes text",
        max_tokens=200,  # Limit the output length
        temperature=0.7,
        top_p=1.0,
    )

    # --- 4. Use the Skill ---

    text_to_summarize = """
    Microsoft Semantic Kernel is a powerful framework for building AI-powered
    applications. It allows developers to easily integrate large language models
    (LLMs) like OpenAI's GPT-3 and GPT-4 into their applications. Semantic Kernel
    provides a set of tools and abstractions for orchestrating AI tasks, managing
    prompts, and chaining skills together. This makes it easier to create complex
    AI workflows.
    """

    context_variables = sk.ContextVariables(text_to_summarize)
    summary = await summarize_function.invoke(context_variables)

    print("Original Text:")
    print(text_to_summarize)
    print("\nSummary:")
    print(summary)

if __name__ == "__main__":
    import asyncio
    asyncio.run(main())
```

**Explanation:**

1. **MCP Server Interaction:**
   * `get_model_config(model_name)`: This function is the key to integrating with your MCP Server. It sends an HTTP GET request to the server to retrieve the configuration for a specific model. The URL (`MCP_SERVER_URL/models/{model_name}`) is just an example; adjust it to match your server's API.
   * Error Handling: The `try...except` block handles potential network errors or HTTP errors (e.g., 404 Not Found if the model isn't configured).
   * Configuration Retrieval: The function expects the MCP Server to return a JSON response containing the model's configuration (API key, model name, temperature, etc.).
2. **Semantic Kernel Setup:**
   * `kernel = sk.Kernel()`: Creates a Semantic Kernel instance.
   * `get_model_config()` is called to fetch the OpenAI configuration.
   * Conditional Logic: If the MCP Server is unavailable or doesn't have the configuration, the code falls back to using environment variables (a common practice). This provides a backup mechanism.
   * `kernel.add_chat_service()`: This is where you register the OpenAI connector with the Semantic Kernel. It's crucial to pass the API key and other parameters (temperature, top_p) obtained from either the MCP Server or the environment variables.
3. **Skill Definition:**
   * `summarize_prompt`: A simple prompt that instructs the LLM to summarize the input text. The `{{$text}}` is a placeholder that will be replaced with the actual text to be summarized.
   * `kernel.create_semantic_function()`: Creates a semantic function (a skill) from the prompt. You can also specify other parameters like `max_tokens` (to limit the output length), `temperature`, and `top_p`.
4. **Skill Usage:**
   * `text_to_summarize`: The text you want to summarize.
   * `context_variables = sk.ContextVariables(text_to_summarize)`: Creates a context object to pass the input text to the skill.
   * `summary = await summarize_function.invoke(context_variables)`: Invokes the skill and gets the result.
   * Prints the original text and the summary.

**MCP Server Implementation (Example - Flask)**

Here's a very basic example of how you might implement the MCP Server using Flask (a Python web framework):

```python
from flask import Flask, jsonify
import os
from dotenv import load_dotenv

load_dotenv()

app = Flask(__name__)

# In a real application, this would likely be stored in a database
model_configurations = {
    "gpt-3.5-turbo": {
        "api_key": os.getenv("OPENAI_API_KEY"),
        "org_id": os.getenv("OPENAI_ORG_ID"),
        "temperature": 0.7,
        "top_p": 1.0,
    },
    "gpt-4": {
        "api_key": os.getenv("OPENAI_API_KEY"),
        "org_id": os.getenv("OPENAI_ORG_ID"),
        "temperature": 0.5,
        "top_p": 0.9,
    },
}

@app.route("/models/<model_name>")
def get_model(model_name):
    if model_name in model_configurations:
        return jsonify(model_configurations[model_name])
    else:
        return jsonify({"error": "Model not found"}), 404

if __name__ == "__main__":
    app.run(debug=True, port=8000)
```

**To Run the Example:**

1. **Install Dependencies:**
   ```bash
   pip install semantic-kernel python-dotenv openai flask requests
   ```
2. **Set Environment Variables:**
   * Create a `.env` file in the same directory as your Python scripts.
   * Add your OpenAI API key and organization ID to the `.env` file:
     ```
     OPENAI_API_KEY=YOUR_OPENAI_API_KEY
     OPENAI_ORG_ID=YOUR_OPENAI_ORG_ID
     MCP_SERVER_URL=http://localhost:8000  # Optional, if you want to override the default
     ```
   * Replace `YOUR_OPENAI_API_KEY` and `YOUR_OPENAI_ORG_ID` with your actual credentials.
3. **Run the MCP Server:**
   ```bash
   python your_mcp_server_script.py  # Replace with the actual name of your MCP server script
   ```
4. **Run the Semantic Kernel Script:**
   ```bash
   python your_semantic_kernel_script.py  # Replace with the actual name of your Semantic Kernel script
   ```

**Important Considerations:**

* **Security:** Never hardcode API keys directly into your code. Use environment variables or a secure configuration management system (like the MCP Server) to store sensitive information.
* **Error Handling:** Implement robust error handling to gracefully handle network errors, API errors, and other potential issues.
* **Scalability:** For production environments, consider using a more scalable MCP Server implementation (e.g., using a database to store model configurations and a more robust web framework).
* **MCP Server Design:** The design of your MCP Server will depend on your specific requirements. You might want to add features like:
  * Authentication and authorization.
  * Rate limiting.
  * Model versioning.
  * A/B testing.
  * Monitoring and logging.
* **Asynchronous Operations:** Use `async` and `await` for I/O-bound operations (like network requests) to improve performance.
* **Prompt Engineering:** Experiment with different prompts to optimize the performance of your skills.
* **Semantic Kernel Documentation:** Refer to the official Microsoft Semantic Kernel documentation for the most up-to-date information and examples: [https://learn.microsoft.com/en-us/semantic-kernel/](https://learn.microsoft.com/en-us/semantic-kernel/)

This comprehensive example should give you a solid foundation for using Microsoft Semantic Kernel with OpenAI and a custom MCP Server. Remember to adapt the MCP Server implementation to your specific needs and security requirements.

🐋 Docker MCP server

Mirror of

Mcp Servers Wiki Website

Binance Market Data MCP Server

create-mcp-server

A comprehensive architecture for building robust Model Context Protocol (MCP) servers with integrated web capabilities

MCP Server Pool

A collection of MCP services

google-workspace-mcp

Linear

mcp-server-fetch-typescript MCP Server

Mirror of