Graphiti MCP

Provides persistent memory and context continuity for AI agents using Zep's Graphiti and Neo4j graph database. Enables storing, retrieving, and linking memories to build a knowledge graph accessible across Cursor and Claude.

Graphiti MCP Demo

This project implements an MCP server and AI agent integration that uses Zep's Graphiti for persistent memory and context continuity across Cursor and Claude. AI agents hosted on either client can connect to the MCP server for dynamic tool discovery, select the most appropriate tool for a given query, and formulate responses informed by past interactions, while Graphiti keeps context consistent across both platforms.

We use:

  • Graphiti by Zep AI as a memory layer for an AI agent
  • Cursor and Claude (as MCP Hosts)

Setup

Follow these steps to set up the project before running the MCP server.

Prerequisites

  • Python 3.10 or higher
  • uv package manager (recommended) or pip
  • Neo4j database (use free Neo4j Aura cloud instance or any Neo4j instance)
  • OpenRouter API key (recommended) or OpenAI API key

Install Dependencies

Using uv (recommended):

uv sync

Or using pip:

pip install -e .

Or install dependencies directly:

pip install mcp neo4j openai python-dotenv

Configuration

Before running the MCP server, you need to configure the environment variables.

  1. Copy the example environment file:

    cp .env.example .env
    
  2. Edit the .env file with your actual credentials:

    • Replace NEO4J_URI with your Neo4j connection string
    • Replace NEO4J_PASSWORD with your Neo4j password
    • Replace <your_openrouter_api_key> with your OpenRouter API key (or use OPENAI_API_KEY for OpenAI)
    • Adjust MODEL_NAME if needed

See .env.example for the complete configuration template.

Important:

  • Neo4j Setup: Get a free Neo4j Aura instance at https://neo4j.com/cloud/aura/ (recommended) or use any Neo4j instance
  • For Neo4j Aura, use the URI format neo4j+s://xxxxx.databases.neo4j.io (note the +s for secure connection)
  • Replace <your_openrouter_api_key> with your actual OpenRouter API key (get one at https://openrouter.ai)
  • Or use OPENAI_API_KEY if you prefer to use OpenAI directly
  • Model names for OpenRouter should be in format provider/model-name (e.g., openai/gpt-4o-mini, anthropic/claude-3-haiku)
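As a sketch, a filled-in .env might look like the following. All values are placeholders, and the exact variable names come from .env.example; in particular, NEO4J_USERNAME and OPENROUTER_API_KEY here are assumptions:

```ini
# Neo4j connection (Aura example; use bolt://localhost:7687 for a local instance)
NEO4J_URI=neo4j+s://xxxxx.databases.neo4j.io
NEO4J_USERNAME=neo4j
NEO4J_PASSWORD=<your_neo4j_password>

# LLM / embeddings provider (OpenRouter recommended; or set OPENAI_API_KEY instead)
OPENROUTER_API_KEY=<your_openrouter_api_key>
MODEL_NAME=openai/gpt-4o-mini
```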

Use MCP Server

Run MCP Server

Simple Method (Recommended):

Use the provided script (Windows PowerShell):

.\run-server.ps1

Or run directly:

For Cursor (SSE transport):

# Using uv
uv run graphiti_mcp_server.py --transport sse --port 8000

# Or using python directly
python graphiti_mcp_server.py --transport sse --port 8000

For Claude (stdio transport):

uv run graphiti_mcp_server.py --transport stdio

The server will connect to your Neo4j instance (configured in .env) and start listening for connections.

Note: Make sure your .env file has the correct Neo4j connection details before starting the server.

Available Tools

The MCP server provides the following tools:

  1. store_memory: Store a memory or context in the graph database for future retrieval
  2. retrieve_memories: Retrieve relevant memories from the graph database based on a query
  3. create_relationship: Create a relationship between two memories or entities in the graph
  4. get_context: Get contextual information for a given query by retrieving and synthesizing relevant memories
  5. search_graph: Search the graph database using a Cypher query
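Under the hood, an MCP client invokes these tools with a JSON-RPC `tools/call` request. A minimal sketch of the payload for store_memory follows; the argument names `content`, `tags`, and `metadata` are assumptions based on the tool descriptions above, and the authoritative schema is whatever the server reports via `tools/list`:

```python
import json

# Hypothetical arguments for store_memory; the actual argument schema is
# defined by the server's tool listing (tools/list).
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "store_memory",
        "arguments": {
            "content": "User prefers dark mode in the editor.",
            "tags": ["preferences", "ui"],
            "metadata": {"source": "cursor"},
        },
    },
}

# MCP clients send this over the active transport (SSE or stdio).
print(json.dumps(request, indent=2))
```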

Web UI Demo

A web interface is available for interacting with the Graphiti MCP Server:

Start the Web UI:

.\run-web-ui.ps1

Or directly:

python web_ui_server.py

Then open your browser to: http://localhost:8081 (or the port specified)

The web UI provides:

  • Store Memory: Add new memories with tags and metadata
  • Retrieve Memories: Search for relevant memories using semantic search
  • Get Context: Get synthesized context from multiple memories
  • Create Relationships: Link memories together in the knowledge graph
  • Search Graph: Execute custom Cypher queries
  • Browse All: View all stored memories

Need example values? See WEB_UI_EXAMPLES.md for ready-to-use examples for each form!

Example Usage

For comprehensive examples and use cases, see:

  • EXAMPLE_USAGE.md: Detailed examples showing how to use each tool with real-world scenarios
  • example_usage.py: Python script demonstrating programmatic usage of the MCP server tools

To run the example script:

python example_usage.py

Integrate MCP Clients

Cursor Configuration

Create or modify the mcp.json file in your Cursor configuration directory with the following content:

{
  "mcpServers": {
    "Graphiti": {
      "url": "http://localhost:8000/sse"
    }
  }
}

Note: The exact location of the mcp.json file depends on your Cursor installation. Typically, it's in:

  • Windows: %APPDATA%\Cursor\User\globalStorage\mcp.json
  • macOS: ~/Library/Application Support/Cursor/User/globalStorage/mcp.json
  • Linux: ~/.config/Cursor/User/globalStorage/mcp.json

Claude Desktop Configuration

Create or modify the claude_desktop_config.json file (typically located at ~/Library/Application Support/Claude/claude_desktop_config.json on macOS, or similar paths on other platforms) with the following content:

{
  "mcpServers": {
    "graphiti": {
      "transport": "stdio",
      "command": "uv",
      "args": [
        "run",
        "--isolated",
        "--directory",
        "/path/to/graphiti_mcp",
        "--project",
        ".",
        "graphiti_mcp_server.py",
        "--transport",
        "stdio"
      ]
    }
  }
}

Important: Update the --directory path to match your actual project directory path.

Alternatively, you can omit the --project flag and let uv resolve the project from the --directory path:

{
  "mcpServers": {
    "graphiti": {
      "transport": "stdio",
      "command": "uv",
      "args": [
        "run",
        "--isolated",
        "--directory",
        "/path/to/graphiti_mcp",
        "graphiti_mcp_server.py",
        "--transport",
        "stdio"
      ]
    }
  }
}

Architecture

The Graphiti MCP server uses:

  • Neo4j: Graph database for storing memories and relationships
  • OpenRouter/OpenAI: For generating embeddings and synthesizing context (OpenRouter recommended for access to multiple models)
  • MCP Protocol: For communication with AI agent hosts (Cursor, Claude)

Memories are stored as nodes in Neo4j with:

  • Content (text)
  • Embeddings (vector representations)
  • Metadata (optional key-value pairs)
  • Tags (for categorization)
  • Timestamps

Relationships between memories can be created to build a knowledge graph.
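Conceptually, a memory node and the similarity ranking used for retrieval can be sketched as follows. The field names and the `Memory` shape are illustrative, not the server's actual schema (which lives in graphiti_mcp_server.py):

```python
import math
from dataclasses import dataclass, field

@dataclass
class Memory:
    # Illustrative shape of a memory node: content, embedding,
    # metadata, tags, and a timestamp, as described above.
    content: str
    embedding: list[float]
    metadata: dict = field(default_factory=dict)
    tags: list[str] = field(default_factory=list)
    timestamp: float = 0.0

def cosine_similarity(a: list[float], b: list[float]) -> float:
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def rank_memories(query_emb: list[float], memories: list[Memory]) -> list[Memory]:
    # Return memories sorted by similarity to the query embedding, best first.
    return sorted(
        memories,
        key=lambda m: cosine_similarity(query_emb, m.embedding),
        reverse=True,
    )
```

This is the kind of "simplified similarity search" the Performance section refers to; a production deployment would push this ranking into Neo4j itself.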

Troubleshooting

Connection Issues

  • Neo4j Connection Error: Ensure Neo4j is running and accessible at the configured URI
  • OpenRouter/OpenAI API Error: Verify your API key is correct and has sufficient credits
    • For OpenRouter: Check your API key at https://openrouter.ai/keys
    • For OpenAI: Check your API key at https://platform.openai.com/api-keys
  • MCP Server Not Starting: Check that all dependencies are installed correctly

Neo4j Connection Issues

  • Neo4j Connection Error:

    • Ensure your Neo4j instance is running and accessible
    • For Neo4j Aura: Check that your IP is whitelisted in Aura settings
    • Verify the URI format: Use neo4j+s:// for Aura, bolt:// for local instances
    • Test connection manually using Neo4j Browser or cypher-shell
  • Connection Timeout:

    • Check your firewall settings
    • Verify the Neo4j URI, username, and password in .env
    • For Aura, ensure your IP address is whitelisted in the Aura dashboard
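Before debugging credentials, it can help to rule out plain network issues by checking TCP reachability of the Neo4j host. A stdlib-only sketch (Bolt's default port 7687 is assumed when the URI omits one; no authentication is attempted):

```python
import socket
from urllib.parse import urlparse

def neo4j_host_port(uri: str) -> tuple[str, int]:
    # Extract host and port from a Neo4j URI; Bolt's default port is 7687.
    parsed = urlparse(uri)
    return parsed.hostname, parsed.port or 7687

def is_reachable(uri: str, timeout: float = 5.0) -> bool:
    # True if a TCP connection to the Neo4j host succeeds.
    # A False result points at DNS, firewall, or IP-whitelist problems
    # rather than bad credentials.
    host, port = neo4j_host_port(uri)
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```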

Performance

  • For better vector search performance, consider setting up Neo4j's vector index
  • The current implementation uses a simplified similarity search; for production, use Neo4j's Graph Data Science library
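If your Neo4j version supports native vector indexes (5.13+), one way to create one is sketched below. The label `Memory`, the property `embedding`, and the dimension 1536 are assumptions that must match how the server actually stores embeddings:

```cypher
CREATE VECTOR INDEX memory_embedding IF NOT EXISTS
FOR (m:Memory) ON (m.embedding)
OPTIONS {indexConfig: {
  `vector.dimensions`: 1536,
  `vector.similarity_function`: 'cosine'
}}
```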

Development

Running Tests

pytest

Code Formatting

black graphiti_mcp_server.py
ruff check graphiti_mcp_server.py

License

This project is open source and available under the MIT License.

Contribution

Contributions are welcome! Feel free to fork this repository and submit pull requests with your improvements.
