Wiki Analytics Specification MCP Server
A system that maintains analytics event specifications in a Wiki and transforms them into formats that AI coding tools can query efficiently via the Model Context Protocol (MCP).
Overview
Analytics specs often live in scattered documentation that's hard for developers to consume, or in technical formats that PMs and data scientists can't easily maintain. This project bridges that gap: non-technical stakeholders author specs in familiar Wiki markdown tables, while developers get structured, queryable data through AI coding tools.
This project enables a Wiki-based workflow for managing analytics specifications:
- Author in Wiki - Define events, properties, and property groups using markdown tables
- Build automatically - Convert Wiki markdown → CSV → JavaScript modules
- Query with Claude - MCP server provides tools for Claude to search and validate specs
Note: This project uses GitHub/GitLab wiki conventions, where wikis are stored as markdown files in a separate git repository (e.g., repo.wiki.git). This allows the wiki content to be cloned and processed programmatically.
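As a rough illustration of what the build produces, a generated JavaScript spec module could look something like the sketch below, based on the example tables later in this README; the actual module layout and field names are defined by the builder and may differ.
// specs/javascript/events.js — illustrative shape only, not the real generated module
export const events = {
  user_registered: {
    table: "Registration",
    description: "User completed registration",
    propertyGroups: ["user_context", "device_info"],
    additionalProperties: ["registration_method", "referral_code"],
    notes: "Fire after successful registration",
  },
};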
Features
- Wiki-based authoring - Human-friendly markdown tables with version control
- Property reuse - Define properties once, reference everywhere via property groups
- Compact responses - MCP tools return structured JSON, reducing token usage by ~66%
- Validation support - Validate tracking implementations against specs
- Local execution - Runs locally with Claude Desktop, no cloud hosting required
Installation
Note:
- This project is currently optimized for GitHub (template repositories, GitHub Actions, GitHub wikis). The concepts translate to other platforms like GitLab, but implementation details differ.
- On GitHub, adding a wiki to a private repo requires a paid plan.
Use as Template (Recommended)
This project is designed as a template for your own analytics specifications.
On GitHub:
- Click "Use this template" → "Create a new repository" on GitHub
- Clone your new repository locally and install dependencies:
git clone https://github.com/yourusername/your-repo-name.git
cd your-repo-name
npm install  # Automatically sets up git hooks
- Set up your wiki with example content:
- Go to your repository's Wiki tab on GitHub
- Create pages: Events.md, Property-Groups.md, Properties.md
- Copy content from this project's wiki-examples/ directory
- Trigger the build workflow:
- Go to Actions tab → "Transform Wiki to Specs" → "Run workflow"
- Pull the generated specs:
git pull
- Configure with your AI tool (see "Configure with AI Coding Tools" below)
Optional enhancements:
- Enable automated sync by uncommenting the cron job in .github/workflows/transform-wiki.yml
- Add GitHub branch protection rules for additional server-side protection (Husky hooks already block local commits)
Requirements:
- Node.js 20+
- Git
- GitHub account (for template and CI/CD workflow)
Quick Test (Using Example Data)
To test the MCP server without setting up a wiki:
# Clone the repository
git clone https://github.com/username/wiki-mcp-analytics.git
cd wiki-mcp-analytics
# Install dependencies
npm install
# Build from example data
npm run build:example
# Start the MCP server
npm start
This uses the wiki-examples/ directory to generate test specs.
Usage
Build Specs from Wiki
# Run full build pipeline (Wiki markdown → CSV → JavaScript)
npm run build
# Run individual steps
npm run build:csv # Wiki markdown → CSV only
npm run build:js # CSV → JavaScript only
npm run build:example # Use wiki-examples/ for testing
Note: Developers typically don't need to run build commands. The CI/CD workflow automatically generates and commits specs when the wiki changes. Just git pull to get the latest.
Start the MCP Server
npm start
Configure with AI Coding Tools
Claude Code
claude mcp add wiki-analytics node /path/to/wiki-mcp-analytics/src/mcp-server/index.js
Other MCP-compatible tools
Add to your MCP configuration file:
{
"mcpServers": {
"wiki-analytics": {
"command": "node",
"args": ["/path/to/wiki-mcp-analytics/src/mcp-server/index.js"]
}
}
}
Wiki Format
The Wiki uses three markdown pages with tables where each row represents one item.
Events.md
| Event Name | Event Table | Event Description | Property Groups | Additional Properties | Notes |
|---|---|---|---|---|---|
| user_registered | Registration | User completed registration | user_context<br>device_info | registration_method<br>referral_code | Fire after successful registration |
Property-Groups.md
| Group Name | Description | Properties |
|---|---|---|
| user_context | Common user identification properties | user_id<br>email<br>account_created_at |
Properties.md
| Property Name | Type | Constraints | Description | Usage |
|---|---|---|---|---|
| user_id | string | regex: ^[0-9a-f-]{36}$ | Unique user identifier | Include in all authenticated events |
Key conventions:
- Use <br> for line breaks in multi-value cells
- All properties must be defined in Properties.md
- Events and property groups reference properties by name only
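As an illustration of the <br> convention, a parser could split multi-value cells into arrays along these lines (a sketch only, not the actual wiki-to-csv.js code):
// Split a multi-value cell such as "user_context<br>device_info" into its parts
function splitCell(cell) {
  return cell
    .split(/<br\s*\/?>/i)            // accept <br>, <br/>, <br />
    .map((value) => value.trim())
    .filter(Boolean);
}

splitCell("user_id<br>email<br>account_created_at");
// → ["user_id", "email", "account_created_at"]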
Project Structure
wiki-mcp-analytics/
├── src/
│ ├── builder/ # Build pipeline (Wiki → CSV → JS)
│ │ ├── index.js # Pipeline orchestration
│ │ ├── wiki-to-csv.js # Parse markdown → CSV
│ │ └── csv-to-javascript.js # Generate JS modules
│ └── mcp-server/ # MCP server implementation
│ └── index.js
├── specs/ # Generated specs (committed by CI/CD)
│ ├── csv/ # CSV format for tools
│ │ ├── .gitkeep
│ │ └── *.csv (generated)
│ └── javascript/ # JS modules for runtime
│ ├── .gitkeep
│ └── */ (generated)
├── .husky/ # Git hooks (pre-commit protection)
│ └── pre-commit
├── wiki-examples/ # Example wiki content for testing
│ ├── Events.md
│ ├── Property-Groups.md
│ └── Properties.md
└── package.json
MCP Tools
The server provides developer-focused tools for implementation and validation:
get_event_implementation
Get complete event specification with all properties expanded.
// Returns structured JSON with property groups, constraints, and notes
get_event_implementation("user_registered")
validate_event_payload
Validate a tracking implementation against the spec.
// Returns errors, warnings, and valid fields
validate_event_payload("user_registered", { user_id: "123", ... })
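For illustration only: the user_id above ("123") would fail the regex constraint defined for it in Properties.md, so a result might look roughly like this (hypothetical field names; the server defines the actual shape):
// Hypothetical result shape for the call above
{
  valid: false,
  errors: [
    { property: "user_id", message: 'value "123" does not match regex ^[0-9a-f-]{36}$' }
  ],
  warnings: [],
  validFields: []
}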
search_events
Find events by criteria.
// Search by name, table, or property usage
search_events({ query: "registration", has_property: "user_id" })
get_property_details
Get property definition and usage across events.
// Returns type, constraints, description, and where it's used
get_property_details("user_id")
get_related_events
Find events in the same flow/table.
// Returns related events for funnel analysis
get_related_events("user_registered")
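src/mcp-server/index.js is the source of truth for how these tools behave. As a hedged sketch of how one of them could be registered over stdio using the MCP TypeScript SDK's high-level API (loadEvent is a hypothetical stand-in for a lookup against the generated specs/javascript/ modules):
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Hypothetical stand-in for reading the generated specs/javascript/ modules
const loadEvent = (name) => ({ event: name, properties: [], propertyGroups: [] });

const server = new McpServer({ name: "wiki-analytics", version: "1.0.0" });

// Register a tool roughly equivalent to get_event_implementation
server.tool(
  "get_event_implementation",
  { event_name: z.string() },
  async ({ event_name }) => ({
    content: [{ type: "text", text: JSON.stringify(loadEvent(event_name)) }],
  })
);

// Serve over stdio so Claude Desktop / Claude Code can connect
await server.connect(new StdioServerTransport());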
Architecture
Wiki Repo (separate git repository)
↓ (sync via CI/CD)
Main Repo: wiki-mcp-analytics
↓ (build pipeline)
specs/csv/ + specs/javascript/
↓ (read by)
MCP Server (runs locally)
↓ (stdio)
Claude Desktop / Claude Code
Note: GitHub/GitLab wikis are separate repositories with a .wiki suffix. This project syncs from the wiki repo and builds the specs.
Development
Automated Workflow
When you update your wiki, the GitHub Action automatically:
- Detects wiki changes
- Builds fresh specs (CSV + JavaScript)
- Commits to your repo as github-actions[bot]
- Developers then pull the updated specs
Optional: Enable daily sync by uncommenting the cron schedule in .github/workflows/transform-wiki.yml
Local Development
# Test the builder with example data (no wiki setup needed)
npm run build:example
# Build from your wiki (requires wiki/ directory cloned locally)
git clone https://github.com/yourname/wiki-mcp-analytics.wiki.git wiki
npm run build
# Run the MCP server
npm start
Protection Against Stale Commits
The project includes a pre-commit hook (via Husky) that blocks manual commits to specs/, ensuring that generated specs are committed only by CI/CD.
To bypass (not recommended): git commit --no-verify
For additional protection, consider setting up branch protection rules to restrict specs/ changes.
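The shipped hook lives in .husky/pre-commit; as a sketch of the kind of check it performs (shown here as a Node script for illustration, not the actual hook):
// Illustrative only: fail the commit if any staged file lives under specs/
import { execSync } from "node:child_process";

const staged = execSync("git diff --cached --name-only", { encoding: "utf8" })
  .split("\n")
  .filter(Boolean);

const blocked = staged.filter((file) => file.startsWith("specs/") && !file.endsWith(".gitkeep"));

if (blocked.length > 0) {
  console.error("specs/ is generated by CI/CD; commit blocked for:");
  blocked.forEach((file) => console.error(`  ${file}`));
  process.exit(1);
}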
License
MIT License - see LICENSE for details.
Related
- Model Context Protocol - The protocol this server implements
- Claude Desktop - AI assistant that connects to MCP servers