Discover Awesome MCP Servers

Extend your agent with 26,604 capabilities via MCP servers.
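Most servers listed below are added the same way: by registering them in an MCP client's configuration file. As a rough sketch (the server name and package are placeholders, not a real published package, and the config file location varies by client), a Claude Desktop registration might look like:

```json
{
  "mcpServers": {
    "example-server": {
      "command": "npx",
      "args": ["-y", "example-mcp-server"]
    }
  }
}
```

Check each server's own documentation for its actual command, arguments, and any required environment variables (API keys, tokens).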

Spotify MCP Server

Enables Claude to interact with Spotify by searching songs, creating playlists, getting recommendations, and managing your music through your Spotify account.

Moonshot MCP Server Gateway

A lightweight gateway server that provides a unified connection entry point for accessing multiple MCP servers, supporting both network and local transports.

MCP_server_test

Oracle Context MCP Server

Provides AI agents with secure, natural-language access to Oracle Cloud Infrastructure through 69 tools covering services like compute, storage, networking, and databases. It features deep IAM integration and supports deployment across platforms like Claude Desktop, Cursor, and OCI Generative AI.

brain-mcp

An MCP server for managing Obsidian-style note vaults, providing tools for full-text search, note creation, and backlink tracking. It enables users to navigate, structure, and update their personal knowledge base through natural language.

Solr MCP

A Python server that lets AI assistants run hybrid search queries against Apache Solr indexes through the Model Context Protocol, combining keyword precision with vector-based semantic understanding.

CodeFlow MCP Server

Enables AI assistants to analyze codebases through semantic search, call graph generation, and function metadata extraction. Provides real-time code analysis with persistent vector storage for understanding complex code structures and relationships.

Shopify MCP Server

Enables interaction with Shopify stores through the GraphQL Admin API. Supports product management, customer data, order queries, blog/article management, and store-wide search capabilities through natural language.

MCP Feedback Collector

A modern Model Context Protocol (MCP) server that enables AI assistants to collect interactive user feedback, supporting text and image-based responses.

Planning System MCP Server

Enables AI agents to create, manage, and search hierarchical plans with phases, tasks, and milestones through a comprehensive planning API. Supports CRUD operations, batch updates, rich context retrieval, and artifact management for structured project planning.

1C Code Search MCP

Enables semantic search through 1C codebase exports using local CPU-based RAG with sentence transformers and FAISS indexing. Supports fast XML file indexing and retrieval of 1C code with metadata parsing.

Veo 3.1 MCP Server

Enables high-quality AI video generation using Google's Veo 3.1 model for text-to-video, style-guided, and frame-interpolation tasks. It features token-efficient reference image handling, batch processing, and video extension capabilities with built-in cost estimation.

MemoDB MCP Server

Manages AI conversation context and personal knowledge bases through the Model Context Protocol (MCP), providing tools for user data, conversation content, and knowledge management.

SmartThings MCP Server

Enables control and monitoring of SmartThings smart home devices through natural language, supporting switches, sensors, refrigerators, and other IoT devices with real-time status queries and command execution.

Tiling Trees MCP Server

Enables exploration and organization of research ideas using a hierarchical tile-based method with support for creating interconnected nodes (questions, hypotheses, methods, results), analyzing research gaps, and exporting to multiple visualization formats.

BeeMCP - A Bee MCP Server

BeeMCP: an unofficial server that connects your Bee lifelogging wearable to AI through the Model Context Protocol (MCP).

text-count-mcp-server

A Model Context Protocol server for text counting.

Remote MCP with Azure Functions

A template for building and deploying custom remote MCP servers on Azure Functions with Python, supporting snippet saving and retrieval with built-in security via keys and HTTPS.

Google Cloud Logging MCP Server

Enables AI assistants to query, search, and analyze logs across Google Cloud Platform projects. It supports advanced filtering by severity or resource type and provides detailed log entry retrieval and project listing capabilities.

Azure Analysis Services MCP Server by CData

Trade Agent

MCP Docker Demo

A demonstration MCP server implementation with Docker support that provides a simple hello-world tool and includes a web-based inspector for interactive testing and exploration of MCP tools.

MCP File System

A server implementing the Model Context Protocol that provides file system operations (read/write, directory management, file moves) through a standardized interface with security controls for allowed directories.

EdgeOne Pages MCP: Geo Location Service

Enables AI models to retrieve user geolocation information through EdgeOne Pages Functions. Provides access to location data via the Model Context Protocol for integration with large language models.

Inbound Email MCP Server

Enables AI assistants to interact with the Inbound Email API to manage domains, endpoints, and email communications. Users can send or schedule emails, manage webhooks, and retrieve email threads through natural language commands.

MCP fal.ai Image Server

Enables AI-powered image generation from text prompts using fal.ai models directly within IDEs. Supports multiple models, customizable parameters, and saves generated images locally with accessible file paths.

kaggle-mcp

Provides tools for searching and downloading Kaggle datasets through the Kaggle API, plus a prompt for generating exploratory data analysis (EDA) notebooks.

DBeaver MCP Server

A Model Context Protocol server that enables AI assistants to access and query 200+ database types through existing DBeaver connections without additional configuration.

Bunq MCP

A Model Context Protocol server for Bunq that enables interaction with the Bunq banking API through OAuth integration.

Prospectio MCP API

A FastAPI-based application that implements the Model Context Protocol for lead prospecting, allowing users to retrieve business leads from different data sources like Mantiks through a clean architecture.