RefMD MCP Server
Enables chatbots to browse, search, create, and edit Markdown documents stored in RefMD through a hosted SSE endpoint with OAuth 2.1 authentication.
A Model Context Protocol server that exposes RefMD documents over a hosted SSE endpoint, so chatbots can browse and edit Markdown through RefMD's API.
Features
- Streamable HTTP/SSE transport compatible with hosted MCP clients (e.g. Claude, Cursor)
- Resource template `refmd://document/{id}` to read document Markdown and metadata
- Tools for listing, searching, creating, and updating documents via the RefMD API
Prerequisites
- Node.js 18+
- Access to an existing RefMD instance
- Either a personal access token (JWT) or credentials that can log in via `/api/auth/login`
Configuration
The server now authenticates users via OAuth 2.1 with PKCE. Configure it with the following variables:
| Variable | Description |
|---|---|
| `REFMD_API_BASE` | Required. Base URL of your RefMD API (e.g. `https://refmd.example.com`). |
| `OAUTH_CLIENT_IDS` | Comma-separated list of allowed `client_id` values. Leave empty to allow any. |
| `OAUTH_ALLOWED_REDIRECTS` | Comma-separated list of allowed redirect URIs. Defaults to allowing HTTPS URLs and `http://localhost`. Include ChatGPT’s callback URL (`https://chat.openai.com/aip/mcp/oauth/callback`). |
| `OAUTH_ISSUER` | Optional public issuer URL (defaults to the current request origin). |
| `ACCESS_TOKEN_TTL_SECONDS` | Optional access-token lifetime (default `3600`). |
| `REFRESH_TOKEN_TTL_SECONDS` | Optional refresh-token lifetime (default `2592000`, i.e. 30 days). |
| `MCP_DB_DRIVER` | Optional. Set to `sqlite`, `postgres`, or `mysql` to persist OAuth tokens. Defaults to in-memory storage. |
| `MCP_DB_URL` | Optional connection string for the configured driver (e.g. `postgres://user:pass@host/db`). For SQLite you may omit this and use `MCP_DB_SQLITE_PATH`. |
| `MCP_DB_SQLITE_PATH` | Optional filesystem path for SQLite storage (defaults to `./data/refmd-mcp.sqlite`). Accepts plain paths or `file:///` URLs; ensure the path resolves to persistent storage when using SQLite. |
| `PORT` / `HOST` | Optional listen port / host (defaults: `3334` / `0.0.0.0`). |
ℹ️ Install the appropriate database driver when enabling persistence: `npm install better-sqlite3` for SQLite, `npm install pg` for PostgreSQL, or `npm install mysql2` for MySQL/MariaDB.
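As a concrete starting point, a deployment that persists OAuth tokens in PostgreSQL might export something along these lines (every value below is a placeholder, not a default):

```bash
# Example environment for a PostgreSQL-backed deployment; all values are placeholders.
export REFMD_API_BASE="https://refmd.example.com"
export OAUTH_CLIENT_IDS="chatgpt-connector"
export OAUTH_ALLOWED_REDIRECTS="https://chat.openai.com/aip/mcp/oauth/callback"
export MCP_DB_DRIVER="postgres"
export MCP_DB_URL="postgres://refmd:change-me@db.internal:5432/refmd_mcp"
export PORT=3334
export HOST=0.0.0.0
```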
Install & Build
```bash
cd mcp-server
npm install
npm run build
```
Run
```bash
npm start
```

Or supply the configuration inline:

```bash
REFMD_API_BASE="https://refmd.example.com" \
OAUTH_CLIENT_IDS="chatgpt-connector" \
OAUTH_ALLOWED_REDIRECTS="https://chat.openai.com/aip/mcp/oauth/callback" \
npm start
```
The server exposes two transports:
- `http://<host>:<port>/sse` — SSE transport (compatible with Claude SSE etc.)
- `http://<host>:<port>/mcp` — Streamable HTTP transport (one-shot POST per exchange)
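As a quick smoke test, you can speak JSON-RPC to the Streamable HTTP endpoint directly with curl. The sketch below sends an `initialize` request; the headers and `protocolVersion` value follow the MCP Streamable HTTP spec and may differ slightly depending on the SDK version bundled with this server.

```bash
# Rough smoke test against the Streamable HTTP endpoint (headers per the MCP
# Streamable HTTP spec; adjust if the bundled SDK expects something different).
curl -s https://your-domain.example.com/mcp \
  -H "Authorization: Bearer <token>" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{
        "jsonrpc": "2.0",
        "id": 1,
        "method": "initialize",
        "params": {
          "protocolVersion": "2025-03-26",
          "capabilities": {},
          "clientInfo": { "name": "curl-smoke-test", "version": "0.0.1" }
        }
      }'
```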
OAuth flow
- Configure your client (e.g. ChatGPT custom connector) with:
  - Authorization URL: `https://your-domain.example.com/oauth/authorize`
  - Token URL: `https://your-domain.example.com/oauth/token`
  - Revocation URL: `https://your-domain.example.com/oauth/revoke`
  - Scopes: (leave blank)
  - PKCE: enabled (ChatGPT uses S256 automatically)
- When prompted, the browser shows the RefMD MCP consent page. Paste a RefMD API token generated from Profile → API tokens and approve.
- The connector receives an access token and can call `/sse` or `/mcp` with `Authorization: Bearer <token>`.
Tokens can be revoked from RefMD (profile page) or via `POST /oauth/revoke`.
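For scripted cleanup, revocation can also be done with curl. The parameter names below follow the standard OAuth 2.0 revocation request (RFC 7009); treat them as an assumption and adjust if this server expects a different request shape.

```bash
# Revoke an access or refresh token (RFC 7009-style request; the parameter
# names are an assumption about this server's /oauth/revoke implementation).
curl -X POST https://your-domain.example.com/oauth/revoke \
  -H "Content-Type: application/x-www-form-urlencoded" \
  --data-urlencode "token=<access-or-refresh-token>" \
  --data-urlencode "client_id=chatgpt-connector"
```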
Run with Docker
```bash
# Build image
docker build -t refmd-mcp .

docker run -p 3334:3334 \
  -e REFMD_API_BASE="https://refmd.example.com" \
  -e OAUTH_CLIENT_IDS="chatgpt-connector" \
  -e OAUTH_ALLOWED_REDIRECTS="https://chat.openai.com/aip/mcp/oauth/callback" \
  -e MCP_DB_DRIVER="sqlite" \
  -e MCP_DB_SQLITE_PATH="/data/refmd-mcp.sqlite" \
  -v refmd-mcp-data:/data \
  refmd-mcp
```
Mount a persistent volume (as shown above) so the SQLite database file survives container restarts.
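To confirm persistence is working, you can list the volume's contents from a throwaway container; the volume name and path below match the run command above, and the SQLite file should show up there once the server has created it.

```bash
# Peek inside the named volume used above; refmd-mcp.sqlite should appear
# once the server has created its database file.
docker run --rm -v refmd-mcp-data:/data alpine ls -l /data
```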
Connecting a Chat Client
- Claude (CLI): `claude mcp add --transport sse refmd https://your-domain.example.com/sse`
- Cursor / VS Code / MCP Inspector: choose an SSE transport and supply the same URL.

Once connected, resources appear under `refmd://document/{id}`. Available tools include `refmd-list-documents`, `refmd-search-documents`, `refmd-create-document`, `refmd-read-document`, `refmd-update-document-content`, and more.
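If you want to exercise a tool without a chat client, the sketch below calls `refmd-list-documents` over the Streamable HTTP endpoint. It assumes an already-initialized session (the `Mcp-Session-Id` value comes from the server's response to the `initialize` request) and that the tool accepts an empty argument object; check its schema via `tools/list` if it does not.

```bash
# Sketch: invoke refmd-list-documents over Streamable HTTP. Assumes a session
# was already initialized and that the tool accepts empty arguments.
curl -s https://your-domain.example.com/mcp \
  -H "Authorization: Bearer <token>" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -H "Mcp-Session-Id: <session-id>" \
  -d '{
        "jsonrpc": "2.0",
        "id": 2,
        "method": "tools/call",
        "params": { "name": "refmd-list-documents", "arguments": {} }
      }'
```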
Release workflow
The GitHub Actions workflow CI MCP Server ships the container image. It runs automatically on pushes/PRs touching `mcp-server` and publishes to GHCR when:

- the push is a tag matching `mcp-server-v*` (versioned release), or
- the workflow is manually triggered with `publish=true`.

Tags published to `ghcr.io/<owner>/refmd-mcp` include semantic versions (`1.2.0`, `1.2`, `1`), the raw git tag, and `latest`. Use the `extra-tag` input for additional labels when invoking the workflow manually.
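In practice a release is cut by pushing a matching tag, and consumers pull the resulting image from GHCR; for example (replace `<owner>` and the version with your own):

```bash
# Cut a versioned release by pushing a tag the workflow recognizes
git tag mcp-server-v1.2.0
git push origin mcp-server-v1.2.0

# Pull the published image (replace <owner> with the repository owner)
docker pull ghcr.io/<owner>/refmd-mcp:1.2.0
```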
Development
Run in watch mode with TSX:
```bash
npm run dev
```

Any code changes require a rebuild (`npm run build`) before deploying or running with `npm start`.