# IMS MCP Server
MCP server that exposes the Integrated Memory System (IMS) as tools via the Model Context Protocol Python SDK.
It wraps the existing IMS HTTP backend (session-memory, memory-core, context-rag) and makes those capabilities available to MCP-aware clients (e.g. mcphub, Warp, VS Code, LibreChat).
## Prerequisites
- Python 3.10+
- An IMS backend running somewhere reachable (FastAPI/Uvicorn service), e.g.
  `http://localhost:8000` or `http://ims.delongpa.com`
- The `integrated-memory-system` repo checked out on disk in this layout (relative to this project):
```
<some-parent-dir>/
  skills/
    integrated-memory-system/   # IMS FastAPI project (provides IMSClient)
  ims-mcp/                      # this repo
```
`server.py` imports `IMSClient` from `skills/integrated-memory-system/app/ims_client.py`
using a relative path; if your layout is different, adjust `server.py` accordingly.
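For orientation, this kind of path hookup can be done with a small `sys.path` tweak. The snippet below is only a sketch of one way to do it under the layout above; the exact mechanism in `server.py` may differ.

```python
import sys
from pathlib import Path

# Sketch only: resolve the sibling IMS checkout relative to ims-mcp/server.py,
# assuming the directory layout shown above. Adjust IMS_REPO_DIR if your
# layout differs (as noted above, server.py would need the same change).
IMS_REPO_DIR = Path(__file__).resolve().parent.parent / "skills" / "integrated-memory-system"
sys.path.insert(0, str(IMS_REPO_DIR))

from app.ims_client import IMSClient  # provided by the integrated-memory-system repo
```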
## Installation (venv + pip)
From the `ims-mcp` directory:

```bash
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```

This installs the official MCP Python SDK (`mcp[cli]`).
## Configuration
The MCP server talks to IMS via environment variables. These can be provided in three ways (in order of increasing precedence):
- A local `.env` file in the project root (or a path specified by `IMS_ENV_FILE`)
- The process environment (e.g. exported variables in your shell)
- Environment variables set by the MCP host (e.g. the mcphub `env` block)
Supported variables:
- `IMS_BASE_URL` (required): Base URL of the IMS HTTP service, e.g. `http://localhost:8000` or `https://ims.delongpa.com`.
- `IMS_HTTP_TIMEOUT` (optional, default `5.0` seconds)
- `IMS_CLIENT_NAME` (optional, default `"ims-mcp"`)
- `IMS_ENV_FILE` (optional, default `.env`): If set, points to a `.env`-style file to load before reading the other variables.
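To make the precedence concrete, here is a minimal, illustrative loader. It is a sketch of the behaviour described above, not necessarily the code in `server.py`.

```python
import os
from pathlib import Path


def load_ims_config() -> dict:
    """Illustrative only. Precedence (lowest to highest): the .env file,
    then the process environment, which includes anything the MCP host
    injects via its env block."""
    file_vars: dict[str, str] = {}
    env_file = Path(os.environ.get("IMS_ENV_FILE", ".env"))
    if env_file.is_file():
        for raw in env_file.read_text().splitlines():
            line = raw.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                file_vars[key.strip()] = value.strip()

    def get(name: str, default: str | None = None) -> str | None:
        # Process environment wins over the .env file, which wins over defaults.
        return os.environ.get(name, file_vars.get(name, default))

    base_url = get("IMS_BASE_URL")
    if not base_url:
        raise RuntimeError("IMS_BASE_URL is required")
    return {
        "base_url": base_url,
        "timeout": float(get("IMS_HTTP_TIMEOUT", "5.0")),
        "client_name": get("IMS_CLIENT_NAME", "ims-mcp"),
    }
```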
### Using a .env file (local development)
Create a file named `.env` next to `server.py`:

```
IMS_BASE_URL=http://localhost:8000
IMS_HTTP_TIMEOUT=5.0
IMS_CLIENT_NAME=ims-mcp
```

You can override the file name/path with `IMS_ENV_FILE` if needed.
### Setting variables directly
Example using exported variables:
```bash
export IMS_BASE_URL="http://ims.delongpa.com"
export IMS_HTTP_TIMEOUT="5.0"
export IMS_CLIENT_NAME="ims-mcp"
```
## Running the MCP server locally
With the venv activated and `IMS_BASE_URL` set:

```bash
source .venv/bin/activate
export IMS_BASE_URL="http://localhost:8000"   # or your IMS URL
python server.py
```
The server runs over stdio, which is what MCP clients expect when they spawn it as a subprocess.
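If you installed the `mcp[cli]` extra from `requirements.txt`, you should also be able to exercise the server interactively with the SDK's inspector command (`mcp dev server.py`) before wiring it into a client.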
## mcphub configuration example
To use this server from mcphub on a host where you cloned this repo to
`/opt/mcps/ims-mcp` and created the venv as above, add an entry like:

```json
"IMS-MCP": {
  "type": "stdio",
  "command": "/opt/mcps/ims-mcp/.venv/bin/python",
  "args": [
    "/opt/mcps/ims-mcp/server.py"
  ],
  "env": {
    "IMS_BASE_URL": "http://ims.delongpa.com"
  }
}
```
Adjust paths and `IMS_BASE_URL` to match your environment.
## Exposed tools
The MCP server exposes the following tools (namespaces follow the IMS service names):
- `ims.context-rag.context_search`: Wrapper over `POST /context/search`.
- `ims.memory-core.store_memory`: Wrapper over `POST /memories/store`.
- `ims.memory-core.find_memories`: Wrapper over `POST /memories/search`.
- `ims.session-memory.auto_session`: Wrapper over `POST /sessions/auto`.
- `ims.session-memory.continue_session`: Wrapper over `POST /sessions/continue`.
- `ims.session-memory.wrap_session`: Wrapper over `POST /sessions/wrap`.
- `ims.session-memory.list_open_sessions`: Wrapper over `POST /sessions/list_open`.
- `ims.session-memory.resume_session`: Wrapper over `POST /sessions/resume`.
For detailed behavior of these endpoints, see `spec/API_ENDPOINTS.md` in the
`integrated-memory-system` repo, and see `AGENTS.md` in this repo for the IMS
agent protocol.
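For orientation, a single wrapper might look roughly like the sketch below, assuming the SDK's `FastMCP` helper is used. This is not the actual `server.py`: the real server goes through `IMSClient` rather than calling the HTTP endpoint directly, and the request body shape (`{"query": ...}`) is an assumption; see `spec/API_ENDPOINTS.md` for the real schema.

```python
import os

import httpx
from mcp.server.fastmcp import FastMCP

# Sketch only: the real server.py wires its tools through IMSClient and
# covers all eight endpoints; this shows just the context_search shape.
mcp = FastMCP(os.environ.get("IMS_CLIENT_NAME", "ims-mcp"))
BASE_URL = os.environ["IMS_BASE_URL"]
TIMEOUT = float(os.environ.get("IMS_HTTP_TIMEOUT", "5.0"))


@mcp.tool(name="ims.context-rag.context_search")
def context_search(query: str) -> dict:
    """Wrapper over POST /context/search on the IMS backend."""
    # Assumed body shape; consult spec/API_ENDPOINTS.md for the actual schema.
    resp = httpx.post(f"{BASE_URL}/context/search", json={"query": query}, timeout=TIMEOUT)
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    mcp.run()  # stdio transport, as described above
```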