MemGPT MCP Server

A TypeScript-based MCP server that implements a memory system for LLMs. It provides tools for chatting with different LLM providers while maintaining conversation history.

Features

Tools

  • chat - Send a message to the current LLM provider (a usage sketch follows this list)

    • Takes a message parameter
    • Supports multiple providers (OpenAI, Anthropic, OpenRouter, Ollama)
  • get_memory - Retrieve conversation history

    • Optional limit parameter to specify the number of memories to retrieve
    • Pass limit: null for unlimited memory retrieval
    • Returns memories in chronological order with timestamps
  • clear_memory - Clear conversation history

    • Removes all stored memories
  • use_provider - Switch between different LLM providers

    • Supports OpenAI, Anthropic, OpenRouter, and Ollama
    • Persists provider selection
  • use_model - Switch to a different model for the current provider

    • Supports provider-specific models:
      • Anthropic Claude Models:
        • Claude 3 Series:
          • claude-3-haiku: Fastest response times, ideal for tasks like customer support and content moderation
          • claude-3-sonnet: Balanced performance for general-purpose use
          • claude-3-opus: Advanced model for complex reasoning and high-performance tasks
        • Claude 3.5 Series:
          • claude-3.5-haiku: Enhanced speed and cost-effectiveness
          • claude-3.5-sonnet: Superior performance with computer interaction capabilities
      • OpenAI: 'gpt-4o', 'gpt-4o-mini', 'gpt-4-turbo'
      • OpenRouter: Any model in 'provider/model' format (e.g., 'openai/gpt-4', 'anthropic/claude-2')
      • Ollama: Any locally available model (e.g., 'llama2', 'codellama')
    • Persists model selection
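
For orientation, here is a minimal client-side sketch of driving these tools over stdio. It is only a sketch under assumptions: the @modelcontextprotocol/sdk imports, the node launch command, and the argument names for use_provider and use_model (provider, model) are guesses, while the message and limit parameters are documented above; adjust everything to the server's actual tool schema.

// Sketch only: SDK imports and the use_provider/use_model argument names are assumptions.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the built server as a child process and talk to it over stdio.
  const transport = new StdioClientTransport({
    command: "node",
    args: ["/path/to/memgpt-server/build/index.js"],
    // Forward the shell environment (including the provider API keys) to the server.
    env: { ...process.env } as Record<string, string>,
  });

  const client = new Client({ name: "memgpt-example", version: "0.1.0" }, { capabilities: {} });
  await client.connect(transport);

  // Switch provider and model, then chat; the server records the exchange as memories.
  await client.callTool({ name: "use_provider", arguments: { provider: "anthropic" } });
  await client.callTool({ name: "use_model", arguments: { model: "claude-3.5-sonnet" } });
  const reply = await client.callTool({ name: "chat", arguments: { message: "Hello!" } });
  console.log(reply);

  // Retrieve the ten most recent memories (the documented default limit).
  const history = await client.callTool({ name: "get_memory", arguments: { limit: 10 } });
  console.log(history);

  await client.close();
}

main().catch(console.error);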

Development

Install dependencies:

npm install

Build the server:

npm run build

For development with auto-rebuild:

npm run watch

Installation

To use with Claude Desktop, add the server config:

On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json

{
  "mcpServers": {
    "letta-memgpt": {
      "command": "/path/to/memgpt-server/build/index.js",
      "env": {
        "OPENAI_API_KEY": "your-openai-key",
        "ANTHROPIC_API_KEY": "your-anthropic-key",
        "OPENROUTER_API_KEY": "your-openrouter-key"
      }
    }
  }
}

Environment Variables

  • OPENAI_API_KEY - Your OpenAI API key
  • ANTHROPIC_API_KEY - Your Anthropic API key
  • OPENROUTER_API_KEY - Your OpenRouter API key
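
Purely as an illustration of how these variables might be consumed (hypothetical code, not from this repository), a startup check could warn about missing keys; Ollama runs locally and needs no key, so a missing key only disables the matching hosted provider.

// Hypothetical startup check for the provider keys listed above.
const providerKeys = {
  openai: "OPENAI_API_KEY",
  anthropic: "ANTHROPIC_API_KEY",
  openrouter: "OPENROUTER_API_KEY",
} as const;

for (const [provider, key] of Object.entries(providerKeys)) {
  if (!process.env[key]) {
    // Only the corresponding hosted provider becomes unavailable; Ollama is unaffected.
    console.warn(`${key} is not set; the ${provider} provider will be unavailable.`);
  }
}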

Debugging

Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector:

npm run inspector

The Inspector will provide a URL to access debugging tools in your browser.

Recent Updates

Claude 3 and 3.5 Series Support (March 2024)

  • Added support for the latest Claude models:
    • Claude 3 Series (Haiku, Sonnet, Opus)
    • Claude 3.5 Series (Haiku, Sonnet)

Unlimited Memory Retrieval

  • Added support for retrieving unlimited conversation history
  • Use { "limit": null } with the get_memory tool to retrieve all stored memories
  • Use { "limit": n } to retrieve the n most recent memories
  • Default limit is 10 if not specified (see the sketch below)
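
A minimal sketch of what this limit behaviour implies, assuming memories are stored oldest-first (hypothetical types and helper function, not the project's actual implementation):

// Hypothetical shape of a stored memory entry.
interface Memory {
  timestamp: string;
  role: "user" | "assistant";
  content: string;
}

// null means "no cap"; a positive number means "the n most recent"; the default is 10.
function selectMemories(all: Memory[], limit: number | null = 10): Memory[] {
  if (limit === null) return all; // unlimited retrieval: the full chronological history
  return all.slice(-limit);       // for a positive limit: the n most recent, still chronological
}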
