mcp-server-opensearch: An OpenSearch MCP Server
The Model Context Protocol (MCP) is an open protocol that enables seamless integration between LLM applications and external data sources and tools. Whether you’re building an AI-powered IDE, enhancing a chat interface, or creating custom AI workflows, MCP provides a standardized way to connect LLMs with the context they need.
This repository is an example of how to create an MCP server for OpenSearch, a distributed search and analytics engine.
Under Construction
Current Blocker - the async OpenSearch client isn't installing:
pip install opensearch-py[async]
zsh: no matches found: opensearch-py[async]
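The error above is most likely zsh's glob expansion rather than a packaging problem: zsh treats the square brackets in opensearch-py[async] as a pattern instead of a literal extras specifier. Quoting the argument usually lets the install proceed:
pip install "opensearch-py[async]"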
Overview
A basic Model Context Protocol server for storing and retrieving memories in the OpenSearch engine. It acts as a semantic memory layer on top of the OpenSearch database.
Components
Tools
search-openSearch
- Store a memory in the OpenSearch database
- Input:
  - query (json): a prepared JSON query message (see the example below)
- Returns: Confirmation message
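For illustration only, the prepared query could be a standard OpenSearch query DSL body like the sketch below; the exact shape the server expects depends on its implementation, and the field name "text" is just a placeholder:
{
  "query": {
    "match": {
      "text": "notes about the quarterly report"
    }
  }
}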
Installation
Installing via Smithery
To install mcp-server-opensearch for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install @ibrooksSDX/mcp-server-opensearch --client claude
Using uv (recommended)
When using uv, no specific installation is needed to run mcp-server-opensearch directly:
uv run mcp-server-opensearch \
--opensearch-url "http://localhost:9200" \
--index-name "my_index"
or
uv run fastmcp run demo.py:main
Testing - Local OpenSearch Client
uv run python src/mcp-server-opensearch/test_opensearch.py
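The repository ships its own test script; purely as a sketch of what a local connectivity check involves (assuming an unsecured local cluster on port 9200, not the actual contents of test_opensearch.py), an opensearch-py check can be as small as:
from opensearchpy import OpenSearch

# Assumption: local, unauthenticated OpenSearch node without TLS
client = OpenSearch(hosts=[{"host": "localhost", "port": 9200}], use_ssl=False)

# info() returns cluster metadata if the connection works
print(client.info())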
Testing - MCP Server Connection to OpenSearch Client
cd src/mcp-server-opensearch
uv run fastmcp dev demo.py
Usage with Claude Desktop
To use this server with the Claude Desktop app, add the following configuration to the "mcpServers" section of your claude_desktop_config.json:
{
  "opensearch": {
    "command": "uvx",
    "args": [
      "mcp-server-opensearch",
      "--opensearch-url",
      "http://localhost:9200",
      "--opensearch-api-key",
      "your_api_key",
      "--index-name",
      "your_index_name"
    ]
  },
  "Demo": {
    "command": "uv",
    "args": [
      "run",
      "--with",
      "fastmcp",
      "--with",
      "opensearch-py",
      "fastmcp",
      "run",
      "/Users/ibrooks/Documents/GitHub/mcp-server-opensearch/src/mcp-server-opensearch/demo.py"
    ]
  }
}
Or use FastMCP to install the server into Claude Desktop:
uv run fastmcp install demo.py
Environment Variables
The server can also be configured using environment variables:
- OPENSEARCH_HOST: URL of the OpenSearch server, e.g. http://localhost
- OPENSEARCH_HOSTPORT: port of the OpenSearch server, e.g. 9200
- INDEX_NAME: name of the index to use
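For example, pointing the server at the same local instance and index used above:
export OPENSEARCH_HOST="http://localhost"
export OPENSEARCH_HOSTPORT=9200
export INDEX_NAME="my_index"
uv run mcp-server-opensearch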