Discover Awesome MCP Servers
Extend your agent with 17,262 capabilities via MCP servers.
- All (17,262)
- Developer Tools (3,867)
- Search (1,714)
- Research & Data (1,557)
- AI Integration Systems (229)
- Cloud Platforms (219)
- Data & App Analysis (181)
- Database Interaction (177)
- Remote Shell Execution (165)
- Browser Automation (147)
- Databases (145)
- Communication (137)
- AI Content Generation (127)
- OS Automation (120)
- Programming Docs Access (109)
- Content Fetching (108)
- Note Taking (97)
- File Systems (96)
- Version Control (93)
- Finance (91)
- Knowledge & Memory (90)
- Monitoring (79)
- Security (71)
- Image & Video Processing (69)
- Digital Note Management (66)
- AI Memory Systems (62)
- Advanced AI Reasoning (59)
- Git Management Tools (58)
- Cloud Storage (51)
- Entertainment & Media (43)
- Virtualization (42)
- Location Services (35)
- Web Automation & Stealth (32)
- Media Content Processing (32)
- Calendar Management (26)
- Ecommerce & Retail (18)
- Speech Processing (18)
- Customer Data Platforms (16)
- Travel & Transportation (14)
- Education & Learning Tools (13)
- Home Automation & IoT (13)
- Web Search Integration (12)
- Health & Wellness (10)
- Customer Support (10)
- Marketing (9)
- Games & Gamification (8)
- Google Cloud Integrations (7)
- Art & Culture (4)
- Language Translation (3)
- Legal & Compliance (2)
Linear Remote MCP server
Remote Model Context Protocol (MCP) server for Linear.
Nestjs Mcp
An MCP server integrated into your NestJS application.
Dune Analytics MCP Server
A Model Context Protocol server that connects AI agents to Dune Analytics data, providing access to DEX metrics, EigenLayer statistics, and Solana token balances through structured tools.
MCP Files
Enables agents to quickly find and edit code in a codebase with surgical precision. Find symbols and edit them everywhere, with tools for reading code blocks, searching and replacing text, and making precise line-based modifications.
Flightradar24 MCP Server
Provides access to real-time and historical flight data from Flightradar24 API, enabling users to track live aircraft positions, query flight histories, and retrieve comprehensive aviation information including aircraft, airline, and airport details.
MCP Agentic RAG
Enables querying a vector database of machine learning FAQs and performing web searches using Bright Data proxies through Claude Desktop with structured agent prompts.
RAG Database MCP Server
Enables AI assistants to search and query PDF documents through a local RAG system with vector embeddings. Provides semantic document search capabilities while keeping all data stored locally without external dependencies.
ethereum-validator-queue-mcp
An MCP server that tracks Ethereum’s validator activation and exit queues in real time, enabling AI agents to monitor staking dynamics and network participation trends.
insights-mcp-server
Proof of concept for a Red Hat Insights MCP server.
Google Calendar Meeting Setup
Enables creating Google Calendar meeting invites with automated authentication via OAuth. Supports scheduling meetings with customizable title, duration, notes, and attendees through a simple command-line or MCP tool interface.
test
test
IBKR MCP Server
Provides AI models with secure access to Interactive Brokers trading data and functionality, enabling account management, market data retrieval, and trading operations through natural language interactions.
Blender MCP Server
Connects AI assistants to Blender for controlling 3D modeling operations through natural language, enabling creation of objects, applying textures and materials, scene management, and transformations with optimized token efficiency.
LeetCode MCP (Model Context Protocol)
MCP server for generating LeetCode notes.
CHM to Markdown Converter
Converts CHM files to Markdown.
DeBank MCP Server
Enables natural-language queries of DeFi data through the DeBank API, including wallet balances, token prices, NFT collections, protocol positions, transaction history, gas prices, and security analysis across 93+ blockchains.
MCP Server
A Multi-Agent Conversation Protocol Server that interfaces with the Exa Search API, allowing agents to perform semantic search operations through a standardized protocol.
SAP Ariba Source MCP Server by CData
This project builds a read-only MCP server. For full read, write, update, delete, and action capabilities and a simplified setup, check out our free CData MCP Server for SAP Ariba Source (beta): https://www.cdata.com/download/download.aspx?sku=PBZK-V&type=beta
MemoraMCP
An MCP-powered storage system for AI agents that provides IPFS-secured, verifiable, and sovereign data storage capabilities.
Remote MCP Server on Cloudflare
🤖 MCP Server: Agent X, powered by Gemini Flash and the Twitter API. An AI agent built with an MCP server.
MCP TMAP Server
A server that connects to the SK TMAP API, providing access to public transit routing and geocoding functionality through a standardized interface.
Velo Payments API MCP Server
An MCP server that enables interaction with Velo Payments APIs for global payment operations, automatically generated using AG2's MCP builder from the Velo Payments OpenAPI specification.
Alibaba Cloud Observability MCP Server
Provides tools for accessing Alibaba Cloud observability products, including SLS (Log Service) and ARMS (Application Real-time Monitoring Service), allowing any MCP-compatible AI assistant to quickly interact with these services.
Cab Service MCP Server
Enables cab booking and management through natural conversation, allowing users to book rides, cancel bookings, and view driver details. Works with Google Maps MCP to provide comprehensive travel planning with automatic route optimization and cab arrangements between destinations.
Toy MCP Server
A simple reference implementation demonstrating MCP server basics with two toy tools: generating random animals and simulating 20-sided die rolls.
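As a rough illustration of the kind of server this entry describes, here is a minimal sketch using the official MCP Python SDK's FastMCP helper; the SDK choice, tool names, and signatures are assumptions for illustration and are not taken from the project itself.

```python
import random

from mcp.server.fastmcp import FastMCP

# Hypothetical reimplementation of a toy MCP server exposing two tools.
mcp = FastMCP("toy-server")


@mcp.tool()
def random_animal() -> str:
    """Return a randomly chosen animal name."""
    return random.choice(["cat", "dog", "owl", "otter", "axolotl"])


@mcp.tool()
def roll_d20() -> int:
    """Simulate rolling a 20-sided die."""
    return random.randint(1, 20)


if __name__ == "__main__":
    mcp.run()  # serves over stdio by default
```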
PentestThinkingMCP
An AI-powered penetration testing reasoning engine that provides automated attack path planning, step-by-step guidance for CTFs/HTB challenges, and tool recommendations using Beam Search and MCTS algorithms.
Bitwig MCP Server
MCP server for Bitwig Studio.
Sample Model Context Protocol Demos
A collection of examples and concepts for using the Model Context Protocol (MCP) with AWS. It covers patterns for supplying contextual data (user, session, device, environment, and external data) to models using SageMaker (custom inference containers, inference pipelines, Feature Store), Lambda for context enrichment, API Gateway request transformation, DynamoDB for context storage, and ElastiCache for caching, along with considerations around data consistency, latency, security, scalability, cost, and monitoring. The conceptual Python/Boto3 pattern it describes, fetching user context from DynamoDB and passing it to a SageMaker endpoint, is sketched below.
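A minimal sketch of that context-fetch pattern, assuming hypothetical resource names (`user_context_table`, `your-sagemaker-endpoint`) and suitable IAM permissions; adapt the data shapes to your model's expected input format.

```python
import json

import boto3

# Hypothetical resource names -- replace with your own table and endpoint.
TABLE_NAME = "user_context_table"
ENDPOINT_NAME = "your-sagemaker-endpoint"

dynamodb = boto3.client("dynamodb")
sagemaker_runtime = boto3.client("sagemaker-runtime")


def get_user_context(user_id: str) -> dict | None:
    """Fetch a user's context record from DynamoDB, or None if absent."""
    response = dynamodb.get_item(
        TableName=TABLE_NAME,
        Key={"user_id": {"S": user_id}},
    )
    item = response.get("Item")
    if item is None:
        return None
    # Flatten the DynamoDB attribute-value format ({"S": "x"} -> "x").
    return {key: list(value.values())[0] for key, value in item.items()}


def invoke_with_context(user_id: str, input_data: dict) -> dict:
    """Augment the payload with user context and call the SageMaker endpoint."""
    context = get_user_context(user_id)
    if context:
        input_data["context"] = context
    response = sagemaker_runtime.invoke_endpoint(
        EndpointName=ENDPOINT_NAME,
        ContentType="application/json",
        Body=json.dumps(input_data),
    )
    return json.loads(response["Body"].read().decode())


if __name__ == "__main__":
    print(invoke_with_context("user123", {"feature1": 0.5, "feature2": 0.8}))
```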
Android MCP
A lightweight bridge enabling AI agents to perform real-world tasks on Android devices such as app navigation, UI interaction, and automated QA testing without requiring computer-vision pipelines or preprogrammed scripts.