Discover Awesome MCP Servers

Extend your agent with 26,560 capabilities via MCP servers.

Security Scanner MCP


Automatically detects security vulnerabilities in AI-generated code, scanning for hardcoded secrets, injection flaws, XSS, weak cryptography, authentication issues, path traversal, and vulnerable dependencies across JavaScript, Python, Java, and Go.
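A minimal sketch of the kind of heuristic check such a scanner might run — the two regex patterns below are illustrative assumptions covering hardcoded secrets and string-concatenated SQL (a common injection smell), not the server's actual rule set:

```python
import re

# Illustrative patterns for two of the issue classes above:
# hardcoded secrets and string-concatenated SQL queries.
PATTERNS = {
    "hardcoded secret": re.compile(
        r"""(api[_-]?key|secret|password)\s*=\s*["'][^"']+["']""", re.IGNORECASE
    ),
    "possible SQL injection": re.compile(
        r"""(SELECT|INSERT|UPDATE|DELETE)\b[^"']*["']\s*\+""", re.IGNORECASE
    ),
}

def scan(source: str) -> list[tuple[int, str]]:
    """Return (line_number, issue) pairs for lines matching a pattern."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for issue, pattern in PATTERNS.items():
            if pattern.search(line):
                findings.append((lineno, issue))
    return findings

sample = 'api_key = "sk-12345"\nquery = "SELECT * FROM users WHERE id=" + user_id\n'
print(scan(sample))  # → [(1, 'hardcoded secret'), (2, 'possible SQL injection')]
```

A production scanner would use AST-level analysis per language rather than line regexes, but the report shape — location plus issue class — is the same.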

Linear MCP Server


Facilitates project management with the Linear API through the Model Context Protocol, allowing users to manage initiatives, projects, issues, and their relationships via features such as creation, viewing, updating, and prioritization.

SpinupWP MCP Server


Enables AI assistants to manage SpinupWP infrastructure, including servers, WordPress sites, and SSH keys through the SpinupWP v1 JSON API. Users can perform actions like provisioning sites, purging caches, and restarting services via natural language commands.

GoLogin MCP


Manage your GoLogin browser profiles and automation directly through AI conversations. This MCP server connects to the GoLogin API, letting you create, configure, and control browser profiles using natural language.

Terros MCP Server


Enables interaction with the Terros platform's User API to manage and retrieve user profiles. Users can fetch authenticated account details, list users with filters, and look up specific user information by ID.

EKMS MCP Server


Enables Claude to search, query, and interact with an Enterprise Knowledge Management System (EKMS). Supports semantic search, knowledge recommendations, relationship graphs, and feedback recording for enterprise knowledge bases.

Grammarly MCP Server


Automates Grammarly's web interface to check AI detection and plagiarism scores, then uses Claude to iteratively rewrite text until it meets target thresholds for humanized content.

Natural Voice MCP


Provides tools and resources to detect AI-generated writing patterns and refine text for more authentic, human-like communication. It enables users to analyze phrasing via a scoring system and apply conversational guides tailored for platforms like Twitter and LinkedIn.

PrestaShop MCP Server


Enables complete management of PrestaShop e-commerce stores through natural language, including products, categories, customers, orders, modules, cache, themes, and navigation menus.

TimescaleDB MCP Server


Enables AI assistants to interact with TimescaleDB time-series databases through async operations, providing tools for querying, schema introspection, hypertable analysis, and time-bucketed data aggregation.
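The time-bucketed aggregation mentioned above rests on TimescaleDB's `time_bucket()` function. A sketch of the SQL such a tool might issue — the table and column names here are hypothetical:

```python
def bucketed_avg_query(table: str, value_col: str, interval: str = "5 minutes") -> str:
    """Build a time-bucketed average query using TimescaleDB's time_bucket()
    aggregation (table and column names are illustrative placeholders)."""
    return (
        f"SELECT time_bucket('{interval}', time) AS bucket, "
        f"avg({value_col}) AS avg_{value_col} "
        f"FROM {table} GROUP BY bucket ORDER BY bucket"
    )

print(bucketed_avg_query("conditions", "temperature"))
```

The server would execute a query like this asynchronously against the database and return the rows to the assistant.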

server


A lightweight, dependency-free workflow automation platform. Supports iPaaS, stream computing, and AI capabilities.

MCP Jobs Recommender Agent


An AI-powered job recommender built on the Model Context Protocol that provides personalized career suggestions through resume and skills matching. It utilizes LLMs and external tool integrations to help users find relevant job openings based on their professional profile.

MCP Splunk


A security-focused MCP server that enables automated log retrieval and threat analysis using LangGraph orchestration and RAG. It allows users to detect suspicious activity and generate structured security insights by integrating LLM reasoning with log data and runbook documentation.

AIM-Guard-MCP


A Model Context Protocol (MCP) server that provides AI-powered security analysis and safety instruction tools. This server helps protect AI agents by providing security guidelines, content analysis, and cautionary instructions when interacting with various MCPs and external services.

DateTime MCP Server


Provides the current date and time in the user's timezone through a simple tool with no parameters required.
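A parameter-free tool like this reduces to a single timezone-aware call. A minimal sketch using Python's stdlib `zoneinfo` — the real server's implementation and output format may differ:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def current_datetime(tz_name: str = "UTC") -> str:
    """Return the current date and time in the given IANA timezone as an
    ISO 8601 string (tz_name would come from the user's configuration)."""
    return datetime.now(ZoneInfo(tz_name)).isoformat()

print(current_datetime("America/New_York"))
```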

devto-mvp-server


A complete MCP (Model Context Protocol) server for dev.to articles with robust validation using TypeScript and Zod. It integrates directly with Cursor, allowing you to search articles on dev.to.

MCP Nexus - Cloudflare Workers


A unified MCP server for Tavily and Brave Search APIs that features automatic API key rotation and zero-cost hosting on Cloudflare Workers. It provides tools for web searching, data extraction, and research with secure, encrypted key management.
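The automatic key rotation described above can be sketched as a simple round-robin over a key pool, so no single key exhausts its rate limit — the keys below are placeholders, and a real implementation would also handle per-key failure and encrypted storage:

```python
from itertools import cycle

class KeyRotator:
    """Round-robin over a pool of API keys (placeholder values) so
    requests are spread evenly across keys."""

    def __init__(self, keys: list[str]):
        self._pool = cycle(keys)

    def next_key(self) -> str:
        return next(self._pool)

rotator = KeyRotator(["key-a", "key-b", "key-c"])
print([rotator.next_key() for _ in range(4)])  # → ['key-a', 'key-b', 'key-c', 'key-a']
```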

Remote MCP Server on Cloudflare


A Model Context Protocol server implementation designed to run on Cloudflare Workers with integrated OAuth authentication. It enables hosting and securely accessing MCP tools remotely via SSE transport from clients like Claude Desktop.

Darwin Standards MCP Server


Provides standards documentation and validation tools for the Darwin platform, including naming conventions and agent card validation. It enables AI agents to search platform standards and access resource templates for design and implementation.

cognee-mcp


Memory manager for AI applications and agents, using various graph and vector stores and supporting ingestion from more than 30 data sources.

Taiwan FDA Drug Search MCP Server


Enables searching Taiwan FDA drug database by various criteria (English/Chinese names, license numbers, ingredients), retrieving drug information, and downloading PDF package inserts with CAPTCHA verification support.

CDK API MCP Server


Provides offline access to AWS CDK API references and integration test code samples for both stable (aws-cdk-lib) and alpha (@aws-cdk) modules, enabling developers to browse CDK documentation through MCP resources.

MCP-Driven Data Management System


Enables natural language interaction with heterogeneous databases (MySQL and PostgreSQL) for CRUD operations across customer, product, and sales data with intelligent query routing and visualization capabilities.
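One way to picture the intelligent query routing: map each data domain to the backend that owns it. The routing table below is hypothetical (the source does not say which database holds which domain), and a real router might use an LLM classifier instead of keyword matching:

```python
# Hypothetical routing table: which backend owns which data domain.
ROUTES = {
    "customer": "postgresql",
    "product": "mysql",
    "sales": "postgresql",
}

def route_query(question: str) -> str:
    """Pick a database backend based on which domain the question mentions
    (a keyword heuristic standing in for a smarter classifier)."""
    q = question.lower()
    for keyword, backend in ROUTES.items():
        if keyword in q:
            return backend
    return "postgresql"  # assumed default backend

print(route_query("List all products under $20"))  # → mysql
```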

astllm-mcp


An MCP server for efficient code indexing and symbol retrieval using tree-sitter AST parsing to fetch specific functions or classes without loading entire files. It significantly reduces AI token costs by providing O(1) byte-offset access to code components across multiple programming languages.

Google MCP Router


Enables scheduling meetings and sending email confirmations through Google Calendar and Gmail APIs. Provides secure OAuth authentication, policy enforcement for working hours, and prevents scheduling conflicts.
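The working-hours policy enforcement amounts to a guard that runs before any calendar write. A sketch under assumed bounds — the 09:00–17:00 Monday-to-Friday window below is illustrative, not the server's actual default:

```python
from datetime import datetime, timedelta

WORK_START, WORK_END = 9, 17  # assumed policy: 09:00-17:00, Mon-Fri

def within_working_hours(start: datetime, duration_minutes: int = 30) -> bool:
    """Check that a proposed meeting falls entirely inside the policy window."""
    end = start + timedelta(minutes=duration_minutes)
    return (
        start.weekday() < 5  # Monday=0 ... Friday=4
        and start.hour >= WORK_START
        and (end.hour < WORK_END or (end.hour == WORK_END and end.minute == 0))
    )

print(within_working_hours(datetime(2024, 6, 3, 10, 0)))  # Monday 10:00 → True
print(within_working_hours(datetime(2024, 6, 8, 10, 0)))  # Saturday → False
```

Conflict prevention would add a second guard that checks the proposed slot against existing Calendar events before booking.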

MCP Project


Sandboxed MCP servers and clients.

Guardian News MCP Server


Enables users to search for the latest news articles from The Guardian using keywords and check service status. Provides access to Guardian's news content through their API with configurable result limits.
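Such a keyword search maps onto the Guardian Content API's `/search` endpoint. A sketch that only builds the request URL — assuming `page-size` as the result-limit parameter and `"test"` as the Guardian's public demo key:

```python
from urllib.parse import urlencode

BASE = "https://content.guardianapis.com/search"  # Guardian Content API

def build_search_url(keywords: str, limit: int = 5, api_key: str = "test") -> str:
    """Build a Guardian article-search URL; 'page-size' caps the number
    of results returned per request."""
    params = {"q": keywords, "page-size": limit, "api-key": api_key}
    return f"{BASE}?{urlencode(params)}"

print(build_search_url("climate change"))
```

The server would issue this request and relay the matching article titles and URLs back to the user.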

Track MCP


A professional tool for automatically generating enterprise-standard tracking code for H5, mini-programs, and native applications. It streamlines event tracking for API calls, user clicks, and UI interactions while supporting JIRA integration for task-specific tracking.

Inbound Email MCP Server


Enables AI assistants to interact with the Inbound Email API to manage domains, endpoints, and email communications. Users can send or schedule emails, manage webhooks, and retrieve email threads through natural language commands.

kaggle-mcp


Provides tools for searching and downloading Kaggle datasets through the Kaggle API (wrapping calls such as dataset_list and dataset_download_files), plus a prompt template for generating exploratory data analysis (EDA) notebooks with Pandas, Matplotlib, and Seaborn.