Dify AI Server

Enables LLMs to interact with Dify AI's chat completion API, including conversation context support and a restaurant recommendation tool.

By yuru-sha

mcp-server-dify

Model Context Protocol Server for Dify AI. This server enables LLMs to interact with Dify AI's chat completion capabilities through a standardized protocol.

Features

  • Integration with Dify AI chat completion API
  • Restaurant recommendation tool (meshi-doko)
  • Support for conversation context
  • Streaming response support (see the sketch after this list)
  • TypeScript implementation
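
The streaming support above maps onto Dify's server-sent-events mode. As a rough illustration, here is a minimal TypeScript sketch of a streaming call to Dify's chat-messages endpoint; it assumes the documented Dify REST API, uses placeholder endpoint and key values, and skips the buffering a production SSE parser would need.

// Sketch only: stream a chat completion from Dify and collect the answer chunks.
async function streamChat(endpoint: string, apiKey: string, query: string): Promise<string> {
  const res = await fetch(`${endpoint}/chat-messages`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      inputs: {},
      query,
      response_mode: "streaming", // ask Dify for server-sent events
      user: "mcp-client",
    }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let answer = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // Each SSE line looks like `data: {...}`; a robust parser would buffer
    // partial lines across chunks instead of splitting naively.
    for (const line of decoder.decode(value, { stream: true }).split("\n")) {
      if (!line.startsWith("data: ")) continue;
      const event = JSON.parse(line.slice(6));
      if (typeof event.answer === "string") answer += event.answer;
    }
  }
  return answer;
}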

Installation

Using NPM

npm install @modelcontextprotocol/server-dify

Usage

With Claude Desktop

Add the following configuration to your claude_desktop_config.json:

{
  "mcpServers": {
    "dify": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-dify",
        "https://your-dify-api-endpoint",
        "your-dify-api-key"
      ]
    }
  }
}

Replace your-dify-api-endpoint and your-dify-api-key with your actual Dify API credentials.
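
Claude Desktop is only one possible client. As an illustration, the sketch below spawns the same server over stdio from the official TypeScript SDK (@modelcontextprotocol/sdk) and calls its tool; the endpoint, key, client name, and tool arguments are placeholders, and the exact SDK surface may vary between versions.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server the same way the Claude Desktop config above does.
const transport = new StdioClientTransport({
  command: "npx",
  args: [
    "-y",
    "@modelcontextprotocol/server-dify",
    "https://your-dify-api-endpoint",
    "your-dify-api-key",
  ],
});

const client = new Client({ name: "example-client", version: "0.1.0" }, { capabilities: {} });
await client.connect(transport);

// Discover the tools the server exposes, then call the restaurant tool.
console.log(await client.listTools());
const result = await client.callTool({
  name: "meshi-doko",
  arguments: {
    LOCATION: "Shibuya",
    BUDGET: "3000 JPY",
    query: "Recommend a casual dinner spot",
  },
});
console.log(result.content);

The same command/args pair works for any MCP-compatible client; only the configuration syntax differs.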

Tools

meshi-doko

Restaurant recommendation tool that interfaces with Dify AI (a forwarding sketch follows the parameter list):

Parameters:

  • LOCATION (string): Location of the restaurant
  • BUDGET (string): Budget constraints
  • query (string): Query to send to Dify AI
  • conversation_id (string, optional): For maintaining chat context
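
For illustration, one plausible way these parameters could be forwarded to Dify's chat-messages endpoint is sketched below (blocking mode for brevity). The handler name, the use of LOCATION and BUDGET as Dify app input variables, and the user label are assumptions, not necessarily the server's actual implementation.

// Hypothetical handler: forward meshi-doko arguments to Dify.
interface MeshiDokoArgs {
  LOCATION: string;
  BUDGET: string;
  query: string;
  conversation_id?: string;
}

async function handleMeshiDoko(endpoint: string, apiKey: string, args: MeshiDokoArgs) {
  const res = await fetch(`${endpoint}/chat-messages`, {
    method: "POST",
    headers: { Authorization: `Bearer ${apiKey}`, "Content-Type": "application/json" },
    body: JSON.stringify({
      // Assumed Dify app input variables; the target Dify app must define them.
      inputs: { LOCATION: args.LOCATION, BUDGET: args.BUDGET },
      query: args.query,
      response_mode: "blocking",
      conversation_id: args.conversation_id ?? "",
      user: "mcp-server-dify",
    }),
  });
  const data = await res.json();
  // data.answer holds the completion; returning data.conversation_id lets the
  // caller pass it back on the next request to keep the chat context.
  return { answer: data.answer as string, conversationId: data.conversation_id as string };
}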

Development

# Initial setup
make setup

# Build the project
make build

# Format code
make format

# Run linter
make lint

License

This project is released under the MIT License.

Security

This server interacts with Dify AI using the API key you provide. Be sure to:

  • Keep your API credentials secure
  • Use HTTPS for the API endpoint
  • Never commit API keys to version control

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.
