# WakaTime MCP Server
A Model Context Protocol (MCP) server that provides high-signal coding analytics from your WakaTime data.
- Direct mode: FastMCP serves Streamable HTTP at `http://localhost:8000/mcp`
- Proxy mode: `mcp-proxy` exposes the server over SSE/HTTP and Caddy adds token auth (recommended for self-hosting)
## Tooling / API

| Tool | Purpose | Key arguments |
|---|---|---|
| `get_coding_stats` | Detailed stats for a period | `range` (`last_7_days`, `last_30_days`, `last_6_months`, `last_year`, `all_time`) |
| `get_summary` | Activity breakdown for a date/range | `start_date`, `end_date`, `project` |
| `get_all_time` | Total coding time since account creation | `project` (optional) |
| `get_status_bar` | Current day status (like editor status bar) | (none) |
| `list_projects` | List/search tracked projects | `query` (optional) |
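Under the hood, an MCP client invokes these tools as JSON-RPC 2.0 `tools/call` requests. A minimal sketch of the wire shape (the `make_tool_call` helper is hypothetical, for illustration only — a client library normally handles this framing for you):

```python
import json

def make_tool_call(name: str, arguments: dict, request_id: int = 1) -> str:
    """Serialize an MCP `tools/call` request (JSON-RPC 2.0)."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })

# e.g. the request a client would send for last week's stats:
payload = make_tool_call("get_coding_stats", {"range": "last_7_days"})
```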
## Configuration

Configure via environment variables (or a `.env` file for the self-hosted scripts).

| Variable | Description | Required |
|---|---|---|
| `WAKATIME_API_KEY` | Your API key from https://wakatime.com/settings/api-key | Yes |
| `MCP_AUTH_KEY` | Token for auth proxy (proxy/self-hosted mode) | Proxy mode |
| `PORT` | Direct-mode port (default: 8000) | No |
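The server's actual startup code may differ; a sketch of how these variables could be read and defaulted (the `load_config` helper is illustrative, not part of this repo):

```python
import os

def load_config(env=os.environ) -> dict:
    """Read the variables from the table above, applying the documented defaults."""
    api_key = env.get("WAKATIME_API_KEY")
    if not api_key:
        raise RuntimeError("WAKATIME_API_KEY is required")
    return {
        "api_key": api_key,
        "port": int(env.get("PORT", "8000")),  # direct-mode default
        "auth_key": env.get("MCP_AUTH_KEY"),   # only needed in proxy mode
    }
```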
## Development (Direct mode)

- Install (uv required)

  ```bash
  git clone https://github.com/dpshade/wakatime-mcp.git
  cd wakatime-mcp
  uv sync --no-install-project
  ```

- Run

  ```bash
  WAKATIME_API_KEY="your_wakatime_api_key_here" uv run -- python src/server.py
  ```

- Connect
  - URL: `http://localhost:8000/mcp`
  - Auth: none
## Deployment

### Option 1: Self-hosted with auth (recommended)

This mode runs the MCP server with FastMCP’s default stdio transport and uses `mcp-proxy` to expose it over HTTP:

```
Internet -> (optional Tailscale Funnel) -> Caddy (auth) -> mcp-proxy -> FastMCP (stdio)
                                             :8770          :8767
```
- Configure

  ```bash
  cp .env.example .env
  # Edit .env: set WAKATIME_API_KEY and a strong MCP_AUTH_KEY
  ```

- Download Caddy (auth proxy)

  ```bash
  curl -L https://github.com/caddyserver/caddy/releases/latest/download/caddy_linux_amd64 -o deploy/caddy
  chmod +x deploy/caddy
  ```

- Start

  ```bash
  ./deploy/start.sh
  ```
- Endpoints
  - Auth proxy (recommended):
    - SSE: `http://localhost:8770/sse`
    - Streamable HTTP: `http://localhost:8770/mcp`
  - Internal (no auth; do not expose publicly):
    - SSE: `http://localhost:8767/sse`
    - Streamable HTTP: `http://localhost:8767/mcp`

`mcp-proxy` also exposes a health endpoint at `http://localhost:8767/status` (and, via the auth proxy, at `http://localhost:8770/status`).
### Systemd (persistent)

```bash
./deploy/install-systemd.sh
sudo systemctl enable --now mcp-wakatime mcp-wakatime-auth
```
### Optional: Tailscale Funnel

If you use Tailscale, you can publish the auth proxy port:

```bash
tailscale funnel --bg --set-path=/wakatime localhost:8770
tailscale funnel --bg 443 on
```
### Option 2: Docker

Runs `mcp-proxy` + the server in a container.

```bash
cd deploy
docker-compose up -d
```

- Endpoint (no auth): `http://localhost:8767/sse`
- If you want auth, run Caddy on the host (or add it to your own compose stack) and proxy to `8767`.
### Option 3: Render

This repo includes `render.yaml` for deploying the direct Python server.

- Set environment variable: `WAKATIME_API_KEY`
- Your service endpoint will be: `https://<your-service>/mcp`
## Client setup

### MCP Inspector

```bash
npx @modelcontextprotocol/inspector
```

Then connect using:

- Direct mode: `http://localhost:8000/mcp` (Streamable HTTP)
- Proxy mode: `http://localhost:8770/sse` (SSE)
### Poke / other hosted clients (proxy mode)

Use the auth proxy SSE endpoint and send `MCP_AUTH_KEY` via one of:

- `Authorization: Bearer <MCP_AUTH_KEY>`
- `X-API-Key: <MCP_AUTH_KEY>`
- `Api-Key: <MCP_AUTH_KEY>`
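The exact check performed by the repo's Caddy config is not reproduced here; a sketch of an equivalent check, assuming it accepts any of the three header forms above:

```python
def is_authorized(headers: dict, expected: str) -> bool:
    """Illustrative check: accept Authorization: Bearer, X-API-Key, or Api-Key."""
    # Header names are case-insensitive, so normalize before comparing.
    norm = {k.lower(): v for k, v in headers.items()}
    if norm.get("authorization") == f"Bearer {expected}":
        return True
    return expected in (norm.get("x-api-key"), norm.get("api-key"))
```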
## Security notes

- Generate a strong auth key: `openssl rand -hex 32`
- Never expose the unauthenticated `mcp-proxy` port (8767) to the public internet.
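If `openssl` is unavailable, Python's standard `secrets` module produces an equivalent key (32 random bytes rendered as 64 hex characters):

```python
import secrets

# Same entropy as `openssl rand -hex 32`
auth_key = secrets.token_hex(32)
print(auth_key)
```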
## License

MIT