Outline Wiki MCP Server

A Model Context Protocol (MCP) server that enables LLMs to interact with Outline wiki through structured API calls. This server provides document management, search, collections, comments, and AI-powered smart features including RAG-based Q&A.

Why This Server?

Most Outline MCP servers provide basic API wrappers. This one adds optional Smart Features:

| Feature | What it does |
| --- | --- |
| ask_wiki | Ask questions in natural language, get answers based on your wiki content (RAG) |
| find_related | Find semantically similar documents, not just keyword matches |
| summarize_document | Generate summaries of long documents |
| suggest_tags | Get tag suggestions based on content analysis |

When you might need this:

  • Your team's wiki has grown large and search isn't enough
  • You want to query your documentation conversationally
  • You need semantic search across your knowledge base

When basic MCP is sufficient:

  • You only need CRUD operations on documents
  • You don't want to set up OpenAI API
  • Your wiki is small and well-organized

Smart features require ENABLE_SMART_FEATURES=true and an OpenAI API key. Without these, the server works as a standard Outline MCP.

Example Usage

User: "What's our policy on remote work?"
→ ask_wiki searches your wiki and returns an answer with source links

User: "Find documents related to the onboarding guide"
→ find_related returns semantically similar docs (not just keyword matches)

User: "Summarize the Q4 planning document"
→ summarize_document generates a concise summary in your preferred language

Supported Clients

| Client | Tools | Resources | Prompts |
| --- | --- | --- | --- |
| Claude Desktop | ✅ | ✅ | ✅ |
| Claude Code | ✅ | ✅ | ✅ |
| VS Code GitHub Copilot | ✅ | ✅ | ✅ |
| Cursor | ✅ | ✅ | - |
| Windsurf | ✅ | - | - |
| ChatGPT Desktop | ✅ | - | - |

Getting Started

Requirements

  • Node.js 18.0.0 or higher
  • Outline instance with API access
  • (Optional) OpenAI API key for smart features

Getting Your Outline API Token

  1. Log in to your Outline instance
  2. Go to Settings → API
  3. Click Create API Key
  4. Copy the generated token (starts with ol_api_)
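
To confirm the token works before wiring it into an MCP client, you can call the Outline API's auth.info endpoint directly (part of Outline's standard API); substitute your own instance URL and token:

curl -s -X POST "https://your-outline-instance.com/api/auth.info" \
  -H "Authorization: Bearer ol_api_xxxxxxxxxxxxx" \
  -H "Content-Type: application/json"

A successful response returns your user and team details; an authentication error means the token or URL is wrong.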

Installation

<details> <summary>Claude Desktop</summary>

Add to your Claude Desktop configuration:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  • Linux: ~/.config/Claude/claude_desktop_config.json
{
  "mcpServers": {
    "outline": {
      "command": "npx",
      "args": ["-y", "outline-smart-mcp"],
      "env": {
        "OUTLINE_URL": "https://your-outline-instance.com",
        "OUTLINE_API_TOKEN": "ol_api_xxxxxxxxxxxxx"
      }
    }
  }
}

</details>

<details> <summary>Claude Code</summary>

Run the following command:

claude mcp add outline -e OUTLINE_URL=https://your-outline-instance.com -e OUTLINE_API_TOKEN=ol_api_xxxxxxxxxxxxx -- npx -y outline-smart-mcp

Or add to ~/.claude.json (global) or .mcp.json (project-local):

{
  "mcpServers": {
    "outline": {
      "command": "npx",
      "args": ["-y", "outline-smart-mcp"],
      "env": {
        "OUTLINE_URL": "https://your-outline-instance.com",
        "OUTLINE_API_TOKEN": "ol_api_xxxxxxxxxxxxx"
      }
    }
  }
}

Note: The ~/.claude/settings.json file is ignored for MCP servers. Use ~/.claude.json or .mcp.json instead.

</details>

<details> <summary>VS Code GitHub Copilot</summary>

Add to your VS Code settings (.vscode/mcp.json):

{
  "servers": {
    "outline": {
      "command": "npx",
      "args": ["-y", "outline-smart-mcp"],
      "env": {
        "OUTLINE_URL": "https://your-outline-instance.com",
        "OUTLINE_API_TOKEN": "ol_api_xxxxxxxxxxxxx"
      }
    }
  }
}

</details>

<details> <summary>Cursor</summary>

Add to Cursor MCP settings (~/.cursor/mcp.json):

{
  "mcpServers": {
    "outline": {
      "command": "npx",
      "args": ["-y", "outline-smart-mcp"],
      "env": {
        "OUTLINE_URL": "https://your-outline-instance.com",
        "OUTLINE_API_TOKEN": "ol_api_xxxxxxxxxxxxx"
      }
    }
  }
}

</details>

<details> <summary>Windsurf</summary>

Add to Windsurf MCP settings (~/.codeium/windsurf/mcp_config.json):

{
  "mcpServers": {
    "outline": {
      "command": "npx",
      "args": ["-y", "outline-smart-mcp"],
      "env": {
        "OUTLINE_URL": "https://your-outline-instance.com",
        "OUTLINE_API_TOKEN": "ol_api_xxxxxxxxxxxxx"
      }
    }
  }
}

</details>

<details> <summary>ChatGPT Desktop</summary>

ChatGPT supports MCP through its desktop app. Add the server in Settings → MCP Servers with:

  • Command: npx
  • Arguments: -y outline-smart-mcp
  • Environment variables as shown above

</details>

Configuration

Environment Variables

| Variable | Description | Required | Default |
| --- | --- | --- | --- |
| OUTLINE_URL | Your Outline instance URL | Yes | https://app.getoutline.com |
| OUTLINE_API_TOKEN | Your Outline API token | Yes | - |
| READ_ONLY | Enable read-only mode | No | false |
| DISABLE_DELETE | Disable delete operations | No | false |
| MAX_RETRIES | API retry attempts | No | 3 |
| RETRY_DELAY_MS | Retry delay (ms) | No | 1000 |
| ENABLE_SMART_FEATURES | Enable AI features | No | false |
| OPENAI_API_KEY | OpenAI API key | No* | - |

* Required when ENABLE_SMART_FEATURES=true
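
For a quick smoke test outside any MCP client, the same environment variables can be passed on the command line (placeholder values shown); the server then waits for an MCP client on stdio:

OUTLINE_URL=https://your-outline-instance.com \
OUTLINE_API_TOKEN=ol_api_xxxxxxxxxxxxx \
npx -y outline-smart-mcp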

Smart Features Configuration

To enable AI-powered features (RAG Q&A, summarization, etc.), add these to your config:

{
  "mcpServers": {
    "outline": {
      "command": "npx",
      "args": ["-y", "outline-smart-mcp"],
      "env": {
        "OUTLINE_URL": "https://your-outline-instance.com",
        "OUTLINE_API_TOKEN": "ol_api_xxxxxxxxxxxxx",
        "ENABLE_SMART_FEATURES": "true",
        "OPENAI_API_KEY": "sk-xxxxxxxxxxxxx"
      }
    }
  }
}

Tools

Search & Discovery

| Tool | Description |
| --- | --- |
| search_documents | Search documents by keyword with pagination |
| get_document_id_from_title | Find document ID by title |
| list_collections | Get all collections |
| get_collection_structure | Get document hierarchy in a collection |
| list_recent_documents | Get recently modified documents |
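
As a sketch of how these are typically invoked (the parameter names below are illustrative and may differ from the exact schema the server exposes):

# Keyword search with pagination
search_documents: { query: "deployment", limit: 10 }

# Resolve a title to a document ID for follow-up calls
get_document_id_from_title: { title: "Onboarding Guide" }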

Document Operations

| Tool | Description |
| --- | --- |
| get_document | Get full document content by ID |
| export_document | Export document in Markdown |
| create_document | Create a new document |
| update_document | Update document (supports append) |
| move_document | Move document to another location |
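
A typical create-then-append flow might look like this (again, field names are illustrative and mirror Outline's documents API; check the tool schema for the exact shape):

# Create a document in a collection
create_document: { title: "Release Checklist", collectionId: "collection-id", text: "..." }

# Append to an existing document
update_document: { documentId: "doc-id", text: "Additional notes", append: true }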

Document Lifecycle

| Tool | Description |
| --- | --- |
| archive_document | Archive a document |
| unarchive_document | Restore archived document |
| delete_document | Delete document (soft/permanent) |
| restore_document | Restore from trash |
| list_archived_documents | List archived documents |
| list_trash | List trashed documents |

Comments & Collaboration

| Tool | Description |
| --- | --- |
| add_comment | Add comment (supports replies) |
| list_document_comments | Get document comments |
| get_comment | Get specific comment |
| get_document_backlinks | Find linking documents |

Collection Management

| Tool | Description |
| --- | --- |
| create_collection | Create collection |
| update_collection | Update collection |
| delete_collection | Delete collection |
| export_collection | Export collection |
| export_all_collections | Export all collections |

Batch Operations

| Tool | Description |
| --- | --- |
| batch_create_documents | Create multiple documents |
| batch_update_documents | Update multiple documents |
| batch_move_documents | Move multiple documents |
| batch_archive_documents | Archive multiple documents |
| batch_delete_documents | Delete multiple documents |
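
Batch tools take a list of targets, so routine housekeeping stays a single call; a sketch with illustrative parameter names:

# Archive several documents at once
batch_archive_documents: { documentIds: ["doc-1", "doc-2", "doc-3"] }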

Smart Features (AI-Powered)

Requires ENABLE_SMART_FEATURES=true and OPENAI_API_KEY.

| Tool | Description |
| --- | --- |
| smart_status | Check status and indexed count |
| sync_knowledge | Sync docs to vector database |
| ask_wiki | RAG-based Q&A on wiki content |
| summarize_document | Generate AI summary |
| suggest_tags | AI-suggested tags |
| find_related | Find semantically related docs |
| generate_diagram | Generate Mermaid diagrams |

Smart Features Usage

# 1. First, sync your wiki documents
sync_knowledge

# 2. Ask questions about your wiki
ask_wiki: "What is our deployment process?"

# 3. Summarize long documents
summarize_document: { documentId: "doc-id", language: "Korean" }

# 4. Find related content
find_related: { documentId: "doc-id", limit: 5 }

Technology Stack

| Component | Technology |
| --- | --- |
| Vector Database | LanceDB (embedded) |
| Embeddings | OpenAI text-embedding-3-small |
| LLM | GPT-4o-mini |
| Text Chunking | LangChain |

Safety Features

Read-Only Mode

READ_ONLY=true

Restricts the server to read operations: search, get, export, and list tools, as well as all smart features.

Disable Delete

DISABLE_DELETE=true

Blocks: delete_document, delete_collection, batch_delete_documents
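
For example, to allow document edits while blocking every destructive operation, add the flag to the same env block used in the installation examples:

{
  "mcpServers": {
    "outline": {
      "command": "npx",
      "args": ["-y", "outline-smart-mcp"],
      "env": {
        "OUTLINE_URL": "https://your-outline-instance.com",
        "OUTLINE_API_TOKEN": "ol_api_xxxxxxxxxxxxx",
        "DISABLE_DELETE": "true"
      }
    }
  }
}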

Development

# Clone repository
git clone https://github.com/huiseo/outline-wiki-mcp.git
cd outline-wiki-mcp

# Install dependencies
npm install

# Build
npm run build

# Run tests
npm test

# Type check
npm run typecheck

License

MIT License - see LICENSE for details.
