
Cloudinary
The Cloudinary MCP servers enable AI agents to upload, manage, analyze, transform, optimize, and deliver media using Cloudinary’s API.
Cloudinary MCP Servers
Model Context Protocol (MCP) is a new, standardized protocol for managing context between large language models (LLMs) and external systems. This repository provides comprehensive MCP servers for Cloudinary's media management platform, enabling you to use natural language to upload, transform, analyze, and organize your media assets directly from AI applications like Cursor and Claude.
With these MCP servers, you can seamlessly manage your entire media workflow through conversational AI - from uploading and transforming images and videos, to configuring automated processing pipelines, analyzing content with AI-powered tools, and organizing assets with structured metadata. Whether you're building media-rich applications, managing large asset libraries, or automating content workflows, these servers provide direct access to Cloudinary's full suite of media optimization and management capabilities.
The following MCP servers are available for Cloudinary:
| Server Name | Description | GitHub Repository |
|---|---|---|
| Asset Management | Upload, manage, and transform your media assets with advanced search and organization capabilities | @cloudinary/asset-management |
| Environment Config | Configure and manage your Cloudinary environment settings, upload presets, and transformations | @cloudinary/environment-config |
| Structured Metadata | Create, manage, and query structured metadata fields for enhanced asset organization and searchability | @cloudinary/structured-metadata |
| Analysis | Leverage AI-powered content analysis, moderation, and auto-tagging capabilities for your media assets | @cloudinary/analysis |
| MediaFlows | Build and manage low-code workflow automations for images and videos with AI-powered assistance | MediaFlows MCP |
Table of Contents
- Documentation
- Configuration
- Using Cloudinary's MCP servers with OpenAI Responses API
- Authentication
- Features by Server
- Need access to more Cloudinary tools?
- Troubleshooting
- Paid Features
- License
Documentation
For detailed guides, tutorials, and comprehensive documentation on using Cloudinary's MCP servers:
- Cloudinary MCP and LLM Tool Documentation - Complete guide to integrating Cloudinary with AI/LLM applications
- MediaFlows MCP Documentation - Setup instructions and guidelines for using the MediaFlows MCP server
Configuration
Quick Install with Cursor Deeplinks
For Cursor users, you can install the MCP servers with one click using Cursor deeplinks.
Note: You'll need to update the environment variables (CLOUDINARY_CLOUD_NAME, CLOUDINARY_API_KEY, CLOUDINARY_API_SECRET) with your actual credentials after installation.
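As a rough illustration of what such a one-click link looks like, Cursor's documented deeplink format is sketched below; the server name and the base64-encoded config value are placeholders, not the actual links for the Cloudinary servers:

```
cursor://anysphere.cursor-deeplink/mcp/install?name=cloudinary-asset-mgmt&config=BASE64_ENCODED_SERVER_CONFIG
```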
Manual Configuration
You can also run the MCP servers using the individual npm packages. There are several ways to configure authentication:
Option 1: Using individual environment variables
```json
{
  "mcpServers": {
    "cloudinary-asset-mgmt": {
      "command": "npx",
      "args": ["-y", "--package", "@cloudinary/asset-management", "--", "mcp", "start"],
      "env": {
        "CLOUDINARY_CLOUD_NAME": "cloud_name",
        "CLOUDINARY_API_KEY": "api_key",
        "CLOUDINARY_API_SECRET": "api_secret"
      }
    }
  }
}
```
Option 2: Using command line arguments
```json
{
  "mcpServers": {
    "cloudinary-asset-mgmt": {
      "command": "npx",
      "args": [
        "-y", "--package", "@cloudinary/asset-management",
        "--",
        "mcp", "start",
        "--cloud-name", "cloud_name",
        "--api-key", "api_key",
        "--api-secret", "api_secret"
      ]
    }
  }
}
```
Option 3: Using CLOUDINARY_URL environment variable
```json
{
  "mcpServers": {
    "cloudinary-asset-mgmt": {
      "command": "npx",
      "args": ["-y", "--package", "@cloudinary/asset-management", "--", "mcp", "start"],
      "env": {
        "CLOUDINARY_URL": "cloudinary://api_key:api_secret@cloud_name"
      }
    }
  }
}
```
Apply the same configuration pattern to all other servers by replacing @cloudinary/asset-management with the respective package name:
- @cloudinary/environment-config
- @cloudinary/structured-metadata
- @cloudinary/analysis
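For example, a single configuration that registers all four npm-based servers side by side might look like the sketch below (server keys such as cloudinary-env-config are illustrative names, and the credentials are placeholders):

```json
{
  "mcpServers": {
    "cloudinary-asset-mgmt": {
      "command": "npx",
      "args": ["-y", "--package", "@cloudinary/asset-management", "--", "mcp", "start"],
      "env": { "CLOUDINARY_URL": "cloudinary://api_key:api_secret@cloud_name" }
    },
    "cloudinary-env-config": {
      "command": "npx",
      "args": ["-y", "--package", "@cloudinary/environment-config", "--", "mcp", "start"],
      "env": { "CLOUDINARY_URL": "cloudinary://api_key:api_secret@cloud_name" }
    },
    "cloudinary-smd": {
      "command": "npx",
      "args": ["-y", "--package", "@cloudinary/structured-metadata", "--", "mcp", "start"],
      "env": { "CLOUDINARY_URL": "cloudinary://api_key:api_secret@cloud_name" }
    },
    "cloudinary-analysis": {
      "command": "npx",
      "args": ["-y", "--package", "@cloudinary/analysis", "--", "mcp", "start"],
      "env": { "CLOUDINARY_URL": "cloudinary://api_key:api_secret@cloud_name" }
    }
  }
}
```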
MediaFlows MCP Server Configuration
For MediaFlows, use the following configuration:
```json
{
  "mcpServers": {
    "mediaflows": {
      "url": "https://mediaflows.mcp.cloudinary.com/v2/mcp",
      "headers": {
        "cld-cloud-name": "cloud_name",
        "cld-api-key": "api_key",
        "cld-secret": "api_secret"
      }
    }
  }
}
```
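MediaFlows is a hosted server reached over HTTP rather than a local npx process, so the url/headers form above requires a client with native remote MCP support. For clients that can only launch local commands, one common workaround is to bridge through a proxy such as the community mcp-remote package; the sketch below assumes that package and its --header flag, and is not an officially documented Cloudinary configuration:

```json
{
  "mcpServers": {
    "mediaflows": {
      "command": "npx",
      "args": [
        "-y", "mcp-remote",
        "https://mediaflows.mcp.cloudinary.com/v2/mcp",
        "--header", "cld-cloud-name:cloud_name",
        "--header", "cld-api-key:api_key",
        "--header", "cld-secret:api_secret"
      ]
    }
  }
}
```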
Using Cloudinary's MCP servers with OpenAI Responses API
OpenAI's Responses API allows you to integrate MCP servers directly into your OpenAI API calls, enabling AI models to access Cloudinary's media management capabilities in real-time.
Setup Overview
- Install the MCP server: Use Cursor deeplinks or manual configuration
- Configure authentication: Provide your Cloudinary credentials
- Add server to your OpenAI API request: Include MCP server configuration in your API call
- Use Cloudinary tools: The AI model can now call Cloudinary functions during the conversation
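The examples below assume an OpenAI client has already been created; a minimal setup might look like this (the openai npm package and the OPENAI_API_KEY variable are assumptions about your environment, separate from the Cloudinary credentials):

```javascript
// Minimal OpenAI client setup assumed by the following examples (sketch).
import OpenAI from "openai";

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY // your OpenAI key, not a Cloudinary credential
});
```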
Single Server Configuration
```javascript
const response = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [
    {
      role: "user",
      content: "Analyze the content of my uploaded images"
    }
  ],
  tools: [
    {
      type: "mcp_server",
      mcp_server: {
        name: "cloudinary-analysis",
        command: "npx",
        args: ["-y", "--package", "@cloudinary/analysis", "--", "mcp", "start"],
        env: {
          "CLOUDINARY_URL": "cloudinary://api_key:api_secret@cloud_name"
        }
      }
    }
  ]
});
```
Multiple Server Configuration
You can use multiple Cloudinary MCP servers in a single API call:
```javascript
const response = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [
    {
      role: "user",
      content: "Upload these images, analyze their content, and create structured metadata"
    }
  ],
  tools: [
    {
      type: "mcp_server",
      mcp_server: {
        name: "cloudinary-asset-mgmt",
        command: "npx",
        args: ["-y", "--package", "@cloudinary/asset-management", "--", "mcp", "start"],
        env: {
          "CLOUDINARY_URL": "cloudinary://api_key:api_secret@cloud_name"
        }
      }
    },
    {
      type: "mcp_server",
      mcp_server: {
        name: "cloudinary-analysis",
        command: "npx",
        args: ["-y", "--package", "@cloudinary/analysis", "--", "mcp", "start"],
        env: {
          "CLOUDINARY_URL": "cloudinary://api_key:api_secret@cloud_name"
        }
      }
    },
    {
      type: "mcp_server",
      mcp_server: {
        name: "cloudinary-smd",
        command: "npx",
        args: ["-y", "--package", "@cloudinary/structured-metadata", "--", "mcp", "start"],
        env: {
          "CLOUDINARY_URL": "cloudinary://api_key:api_secret@cloud_name"
        }
      }
    }
  ]
});
```
Authentication
When running MCP servers locally, authentication can be configured in several ways:
Option 1: Individual environment variables (Recommended)
```bash
export CLOUDINARY_CLOUD_NAME="cloud_name"
export CLOUDINARY_API_KEY="api_key"
export CLOUDINARY_API_SECRET="api_secret"
```
Option 2: CLOUDINARY_URL environment variable
```bash
export CLOUDINARY_URL="cloudinary://api_key:api_secret@cloud_name"
```
Option 3: Command line arguments
Pass credentials directly as arguments (see configuration examples above)
You can find your Cloudinary credentials in your Cloudinary Console Dashboard under Settings > Security.
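To confirm that a set of credentials works before wiring it into an MCP client, one quick sanity check is to call Cloudinary's Admin API ping endpoint directly, for example with curl (replace the placeholders with your own values):

```bash
# Quick credential check against the Cloudinary Admin API ping endpoint.
curl -u "api_key:api_secret" "https://api.cloudinary.com/v1_1/cloud_name/ping"
# A successful response looks like: {"status":"ok"}
```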
Features by Server
Asset Management Server
- Upload and manage media assets (images, videos, raw files)
- Search and organize assets with advanced filtering capabilities (see the example expression after this list)
- Handle asset operations and transformations
- Manage folders, tags, and asset relationships
- Generate archives and download links
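For instance, asset searches use Cloudinary's search expression syntax; a typical filter might look like the sketch below (field names follow the Cloudinary Search API, values are placeholders):

```
resource_type:image AND tags=product AND uploaded_at>1w
```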
Environment Config Server
- Configure upload presets and transformation settings
- Manage streaming profiles and webhook notifications
- Set up upload mappings
Structured Metadata Server
- Create and manage structured metadata fields (a sample field definition follows this list)
- Configure conditional metadata rules and validation
- Organize and search metadata configurations
- Handle metadata field relationships and ordering
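As a sketch of what such a field looks like, a Cloudinary structured metadata field is defined roughly as follows (the identifiers and values are placeholders; the shape follows Cloudinary's metadata field format):

```json
{
  "external_id": "campaign",
  "label": "Campaign",
  "type": "enum",
  "mandatory": false,
  "datasource": {
    "values": [
      { "external_id": "spring_sale", "value": "Spring Sale" },
      { "external_id": "holiday", "value": "Holiday" }
    ]
  }
}
```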
Analysis Server
- AI-powered content analysis including tagging, moderation, and captioning
- Object detection and recognition with multiple AI models
- Image quality analysis and watermark detection
- Content moderation and safety analysis
- Fashion, text, and anatomy detection capabilities
MediaFlows Server
- Build and manage workflow automations using natural language
- Query existing PowerFlow automations in your environment
- Create conditional logic based on metadata, tags, and asset properties
- Automate asset moderation, approval, and notification workflows
- Debug and understand existing automation configurations
Need access to more Cloudinary tools?
We're continuing to add more functionality to these MCP servers. If you'd like to leave feedback, report a bug, or request a feature, please open an issue on this repository.
Troubleshooting
"Claude's response was interrupted..."
If you see this message, Claude likely hit its context-length limit and stopped mid-reply. This happens most often with servers that trigger many chained tool calls, such as the Asset Management server when listing large numbers of assets.
To reduce the chance of running into this issue:
- Be specific and keep your queries concise.
- If a single request would trigger multiple tool calls, break it into several smaller requests to keep each response short.
- Use filtering parameters to limit the scope of asset searches and listings.
Authentication Issues
Ensure your Cloudinary credentials are correctly configured and have the necessary permissions for the operations you're trying to perform.
Paid Features
Some features may require a paid Cloudinary plan. Ensure your Cloudinary account has the necessary subscription level for the features you intend to use, such as:
- Advanced AI analysis features
- High-volume API usage
- Custom metadata fields
- Advanced transformation capabilities
License
Licensed under the MIT License. See LICENSE file for details.
Recommended Servers
playwright-mcp
A Model Context Protocol server that enables LLMs to interact with web pages through structured accessibility snapshots without requiring vision models or screenshots.
Magic Component Platform (MCP)
An AI-powered tool that generates modern UI components from natural language descriptions, integrating with popular IDEs to streamline UI development workflow.
Audiense Insights MCP Server
Enables interaction with Audiense Insights accounts via the Model Context Protocol, facilitating the extraction and analysis of marketing insights and audience data including demographics, behavior, and influencer engagement.

VeyraX MCP
Single MCP tool to connect all your favorite tools: Gmail, Calendar and 40 more.
graphlit-mcp-server
The Model Context Protocol (MCP) Server enables integration between MCP clients and the Graphlit service. Ingest anything from Slack to Gmail to podcast feeds, in addition to web crawling, into a Graphlit project - and then retrieve relevant contents from the MCP client.
Kagi MCP Server
An MCP server that integrates Kagi search capabilities with Claude AI, enabling Claude to perform real-time web searches when answering questions that require up-to-date information.

E2B
Using MCP to run code via e2b.
Neon Database
MCP server for interacting with Neon Management API and databases
Exa Search
A Model Context Protocol (MCP) server that lets AI assistants like Claude use the Exa AI Search API for web searches. This setup allows AI models to get real-time web information in a safe and controlled way.
Qdrant Server
This repository is an example of how to create an MCP server for Qdrant, a vector search engine.