# TiddlyWiki MCP Server

A Model Context Protocol (MCP) server that provides AI assistants with access to TiddlyWiki wikis via the HTTP API. Supports tiddler management (create, update, delete, search) and semantic search using Ollama embeddings for natural language queries.
## Features

### MCP Tools

- `search_tiddlers` - Search tiddlers using TiddlyWiki filter syntax, semantic similarity, or hybrid (both combined)
- `create_tiddler` - Create new tiddlers with custom fields
- `update_tiddler` - Update existing tiddlers with diff preview
- `delete_tiddler` - Delete tiddlers with content preview

### MCP Resources

- `filter-reference://syntax` - Complete TiddlyWiki filter syntax reference
### Semantic Search

When Ollama is available, the server provides semantic search capabilities:

- Natural language queries find conceptually related tiddlers
- Uses the `nomic-embed-text` embedding model
- SQLite-vec for efficient vector similarity search
- Background sync keeps embeddings up to date
- Hybrid mode combines filter results with semantic reranking
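The reranking step in hybrid mode can be sketched as follows. This is an illustrative sketch only: the function names and data shapes are assumptions, and the real server delegates similarity search to SQLite-vec rather than computing it in JavaScript.

```typescript
// Sketch: rerank filter-matched tiddlers by cosine similarity between
// the query embedding and each tiddler's stored embedding.
// All names here are hypothetical, not the server's actual internals.
type Scored = { title: string; score: number };

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

function rerank(
  queryVec: number[],
  candidates: { title: string; vec: number[] }[],
): Scored[] {
  return candidates
    .map((c) => ({ title: c.title, score: cosine(queryVec, c.vec) }))
    .sort((x, y) => y.score - x.score); // most similar first
}

const ranked = rerank([1, 0], [
  { title: "A", vec: [0, 1] }, // orthogonal to the query
  { title: "B", vec: [1, 0] }, // identical direction to the query
]);
console.log(ranked[0].title); // "B"
```

The filter narrows the candidate set cheaply; the embedding comparison then orders those candidates by conceptual relevance.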
## Requirements

- Node.js 22+
- TiddlyWiki with the HTTP API enabled (e.g., TiddlyWiki on Node.js with the `listen` command)
- Ollama (optional, for semantic search)
## Build Prerequisites

This project uses native SQLite modules that require compilation. You'll need:

- Linux: `build-essential`, Python 3
- macOS: Xcode Command Line Tools (`xcode-select --install`)
- Windows: Visual Studio Build Tools, Python 3
## Installation

### From npm (recommended)

```bash
TIDDLYWIKI_URL=http://localhost:8080 npx tiddlywiki-mcp-server
```

Or install globally:

```bash
npm install -g tiddlywiki-mcp-server
TIDDLYWIKI_URL=http://localhost:8080 tiddlywiki-mcp-server
```

### From source

```bash
git clone https://github.com/ppetru/tiddlywiki-mcp.git
cd tiddlywiki-mcp
npm install
npm run build
```
## Quick Start

### 1. Start TiddlyWiki with HTTP API

```bash
# Install TiddlyWiki if you haven't already
npm install -g tiddlywiki

# Create a new wiki and start it with the HTTP API
tiddlywiki mywiki --init server
tiddlywiki mywiki --listen port=8080
```

### 2. (Optional) Set up Ollama for Semantic Search

```bash
# Install Ollama from https://ollama.ai
# Then pull the embedding model:
ollama pull nomic-embed-text
```

### 3. Start the MCP Server

```bash
TIDDLYWIKI_URL=http://localhost:8080 npx tiddlywiki-mcp-server
```
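To sanity-check that the wiki side is reachable, you can hit TiddlyWiki's web server API directly; it serves tiddler metadata at `/recipes/default/tiddlers.json`. A minimal sketch, assuming the default port from step 1:

```typescript
// Build the tiddler-listing URL against the configured wiki base.
// With a wiki running, fetching this URL returns a JSON array of
// tiddler metadata (the fetch itself is left commented out).
const base = process.env.TIDDLYWIKI_URL ?? "http://localhost:8080";
const url = new URL("/recipes/default/tiddlers.json", base).toString();

// const res = await fetch(url);
// const tiddlers = await res.json(); // array of { title, tags, ... }
console.log(url);
```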
## Configuration

All configuration is via environment variables. See `.env.example` for a complete reference.

### Required

| Variable | Description |
|---|---|
| `TIDDLYWIKI_URL` | URL of your TiddlyWiki server (e.g., `http://localhost:8080`) |
### Optional

| Variable | Default | Description |
|---|---|---|
| `MCP_TRANSPORT` | `stdio` | Transport mode: `stdio` or `http` |
| `MCP_PORT` | `3000` | HTTP server port (when using `http` transport) |
| `OLLAMA_URL` | `http://localhost:11434` | Ollama API URL |
| `OLLAMA_MODEL` | `nomic-embed-text` | Embedding model name |
| `EMBEDDINGS_ENABLED` | `true` | Enable/disable semantic search |
| `EMBEDDINGS_DB_PATH` | `./embeddings.db` | SQLite database path for embeddings |
| `AUTH_HEADER` | `X-Oidc-Username` | HTTP header for authentication (can be any header your TiddlyWiki expects) |
| `AUTH_USER` | `mcp-user` | Username for TiddlyWiki API requests |
## Usage

### stdio Mode (Claude Desktop)

Add to your Claude Desktop configuration (`claude_desktop_config.json`):

```json
{
  "mcpServers": {
    "tiddlywiki": {
      "command": "npx",
      "args": ["tiddlywiki-mcp-server"],
      "env": {
        "TIDDLYWIKI_URL": "http://localhost:8080"
      }
    }
  }
}
```
### HTTP Mode

Start the server:

```bash
TIDDLYWIKI_URL=http://localhost:8080 MCP_TRANSPORT=http MCP_PORT=3000 npx tiddlywiki-mcp-server
```

The server exposes:

- `GET /health` - Health check endpoint
- `POST /mcp` - MCP JSON-RPC endpoint (stateless mode)
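Requests to `/mcp` are JSON-RPC 2.0 messages following MCP's `tools/call` convention. A sketch of a client-side call, assuming the server is running with the settings above (the `fetch` is left commented out so the shape can be inspected without a live server):

```typescript
// A stateless JSON-RPC 2.0 request invoking the search_tiddlers tool.
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "search_tiddlers",
    arguments: { filter: "[tag[Journal]]" },
  },
};

// To send it against a running server:
// await fetch("http://localhost:3000/mcp", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(request),
// });
console.log(request.method); // "tools/call"
```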
## Example Tool Usage

Filter search (TiddlyWiki filter syntax):

```json
{
  "name": "search_tiddlers",
  "arguments": {
    "filter": "[tag[Journal]prefix[2025-01]]",
    "includeText": true
  }
}
```

Semantic search (natural language):

```json
{
  "name": "search_tiddlers",
  "arguments": {
    "semantic": "times I felt anxious about work",
    "limit": 10
  }
}
```

Hybrid search (filter + semantic reranking):

```json
{
  "name": "search_tiddlers",
  "arguments": {
    "filter": "[tag[Journal]]",
    "semantic": "productivity tips",
    "limit": 20
  }
}
```
## Development

### Setup

```bash
npm install
```

### Running Tests

```bash
npm test
```

Tests run quickly (~1s) and include unit tests for all tool handlers.

### Linting

```bash
npm run lint          # Check for issues
npm run format        # Fix formatting
npm run format:check  # Check formatting only
```

### Type Checking

```bash
npm run typecheck
```

### Pre-commit Hooks

Pre-commit hooks are configured with lefthook and run automatically:

- Format check (Prettier)
- Lint (ESLint)
- Tests (Vitest)
- Type check (TypeScript)

### Building

```bash
npm run build
```
## Architecture

```
src/
├── index.ts               # Entry point, transport setup, server lifecycle
├── tiddlywiki-http.ts     # TiddlyWiki HTTP API client
├── service-discovery.ts   # URL resolution (direct URLs, Consul SRV, hostname:port)
├── filter-reference.ts    # Filter syntax documentation
├── logger.ts              # Structured logging
├── tools/                 # MCP tool handlers
│   ├── types.ts           # Shared types and Zod schemas
│   ├── search-tiddlers.ts
│   ├── create-tiddler.ts
│   ├── update-tiddler.ts
│   └── delete-tiddler.ts
└── embeddings/            # Semantic search infrastructure
    ├── database.ts        # SQLite-vec database
    ├── ollama-client.ts   # Ollama API client
    └── sync-worker.ts     # Background embedding sync
```
## Key Design Decisions

- **Stateless HTTP mode**: Each request gets its own Server/Transport instance to prevent request ID collisions with concurrent clients
- **Graceful degradation**: Semantic search is optional; the server works without Ollama
- **Token-aware responses**: Search results are validated against token limits with pagination suggestions
- **Background sync**: Embeddings are updated periodically without blocking requests
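The token-aware response idea can be sketched as a simple budget check. Everything here is an assumption for illustration: the 4-characters-per-token heuristic, the function name, and the hint wording are not taken from the server's code.

```typescript
// Sketch: trim a result list to an approximate token budget and attach
// a pagination hint when results were cut off. Hypothetical names.
function fitToBudget(results: string[], maxTokens: number) {
  const approxTokens = (s: string) => Math.ceil(s.length / 4); // rough heuristic
  const kept: string[] = [];
  let used = 0;
  for (const r of results) {
    const t = approxTokens(r);
    if (used + t > maxTokens) break; // budget exceeded: stop here
    kept.push(r);
    used += t;
  }
  const truncated = kept.length < results.length;
  return {
    results: kept,
    truncated,
    hint: truncated
      ? `Showing ${kept.length} of ${results.length}; use "limit" to page through the rest.`
      : undefined,
  };
}

const page = fitToBudget(["aaaa", "bbbb", "cccc"], 2);
console.log(page.results.length); // 2
```

Returning a hint instead of silently truncating lets the calling assistant decide whether to page through the remainder.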
## License

MIT