Wakapi MCP Server
Enables tracking and analyzing development time through the Wakapi API. Provides tools to retrieve coding statistics, project details, leaderboards, and recent activity logs for productivity insights.
This is an MCP (Model Context Protocol) server that provides development time tracking tools by collecting logs from the Wakapi REST API.
This repository is unofficial. Use at your own risk.
Quick Start
Prerequisites
- Python 3.11 or higher
- Wakapi server with API access
MCP Server Configuration Examples
Quickstart with Environment Variables
{
"mcpServers": {
"wakapi": {
"env": {
"WAKAPI_URL": "http://localhost:3000",
"WAKAPI_API_KEY": "your-api-key"
},
"command": "uv",
"args": [
"tool",
"run",
"--from",
"git+https://github.com/impure0xntk/mcp-wakapi",
"wakapi-mcp"
]
}
}
}
Enhanced Security with Configuration File
Passing a configuration file keeps the API key out of the MCP client configuration:
{
"mcpServers": {
"wakapi": {
"command": "uv",
"args": [
"tool",
"run",
"--from",
"git+https://github.com/impure0xntk/mcp-wakapi",
"wakapi-mcp",
"--config",
"/path/to/config.toml"
]
}
}
}
Features
- Collects development time logs via the Wakapi API
- Provides MCP tools for retrieving development data
- Fast processing using FastMCP
- Reproducible development environment with Nix flakes
- Modular tool architecture
- Improved testability through dependency injection
- Design based on the single responsibility principle
Provided Tools
This server provides the following tools that can be used by MCP-compatible clients:
Note: {api_path} is configurable and defaults to /compat/wakatime/v1; see the Configuration section.
| Tool Name | Description | API Endpoint |
|---|---|---|
| Get Stats | Retrieve statistics for a given user over a specified time range | GET {api_path}/users/{user}/stats/{range} |
| Get Projects | Retrieve and filter the user's projects | GET {api_path}/users/{user}/projects |
| Get User | Retrieve information about the given user | GET {api_path}/users/{user} |
| Get Leaders | Retrieve leaderboard information | GET {api_path}/leaders |
| Get All Time Since Today | Retrieve the user's all-time tracked total | GET {api_path}/users/{user}/all_time_since_today |
| Get Project Detail | Retrieve detailed information about a specific project | GET {api_path}/users/{user}/projects/{id} |
| Get Recent Logs | Retrieve recent development logs | GET {api_path}/users/{user}/heartbeats |
| Test Connection | Test connection to the Wakapi server | None |
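The README does not list the exact tool identifiers or argument names exposed over MCP, so the sketch below, using the official mcp Python SDK, first lists the registered tools and then calls a hypothetical get_stats tool. Treat the tool name and its arguments as assumptions for illustration only.

```python
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Launch the server the same way the MCP client configuration does.
    server = StdioServerParameters(
        command="uv",
        args=["tool", "run", "--from",
              "git+https://github.com/impure0xntk/mcp-wakapi", "wakapi-mcp"],
        env={"WAKAPI_URL": "http://localhost:3000", "WAKAPI_API_KEY": "your-api-key"},
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover the actual tool names registered by the server.
            tools = await session.list_tools()
            print([t.name for t in tools.tools])
            # "get_stats" and the "range" argument are assumptions for illustration.
            result = await session.call_tool("get_stats", arguments={"range": "last_7_days"})
            print(result)

asyncio.run(main())
```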
Configuration Details
Environment Variables Configuration
The most common way to configure the server is through environment variables:
export WAKAPI_URL="http://your-wakapi-server:3000"
export WAKAPI_API_KEY="your_actual_api_key_here"
export WAKAPI_API_PATH="/compat/wakatime/v1"
Or pass them in the env block of the mcpServers entry:
{
"mcpServers": {
"wakapi": {
"env": {
"WAKAPI_URL": "http://localhost:3000",
"WAKAPI_API_KEY": "your-api-key"
},
...
Configuration Files
You can also use configuration files in TOML or JSON format:
TOML format (config.toml):
[wakapi]
url = "http://your-wakapi-server:3000"
api_key = "your_actual_api_key_here"
api_path = "/compat/wakatime/v1"
timeout = 30
retry_count = 3
[server]
host = "0.0.0.0"
port = 8000
[logging]
level = "INFO"
format = "%(asctime)s - %(name)s - %(levelname)s - %(message)s"
JSON format (config.json):
{
"wakapi": {
"url": "http://your-wakapi-server:3000",
"api_key": "your_actual_api_key_here",
"api_path": "/compat/wakatime/v1",
"timeout": 30,
"retry_count": 3
},
"server": {
"host": "0.0.0.0",
"port": 8000
},
"logging": {
"level": "INFO",
"format": "%(asctime)s - %(name)s - %(levelname)s - %(message)s"
}
}
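For orientation, here is a minimal sketch of how such a file could be parsed into a dictionary with Python 3.11's standard library (tomllib and json). This is purely illustrative; the server's actual configuration loader may differ.

```python
import json
import tomllib  # Python 3.11+, matching the stated prerequisite
from pathlib import Path

def load_config(path: str) -> dict:
    """Parse a TOML or JSON configuration file into a plain dict."""
    p = Path(path)
    if p.suffix == ".toml":
        with p.open("rb") as f:
            return tomllib.load(f)
    if p.suffix == ".json":
        return json.loads(p.read_text())
    raise ValueError(f"Unsupported config format: {p.suffix}")

# Example usage with the path passed via --config:
config = load_config("/path/to/config.toml")
print(config["wakapi"]["url"], config["wakapi"]["api_path"])
```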
For Developers
Setup Development Environment
This project provides a reproducible development environment using Nix flakes:
# Start the development environment
nix develop
# Or start a shell
nix-shell
Alternatively, you can use uv to sync dependencies and run the project.
Starting the MCP Server with the python command
# Set environment variables
export WAKAPI_URL="http://localhost:3000"
export WAKAPI_API_KEY="your_actual_api_key_here"
export WAKAPI_API_PATH="/compat/wakatime/v1"
# Start the server in STDIO mode (default)
python main.py --transport stdio
# Start the server in SSE (HTTP) mode
python main.py --transport sse --port 8001
# Start with a configuration file
python main.py --config /path/to/config.toml
Authentication Method: The API key is automatically base64-encoded and sent as a Bearer token.
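As a concrete illustration of that scheme, here is a minimal sketch assuming the httpx library; the server's actual HTTP client code may differ, and the "current" user segment is an assumption borrowed from the WakaTime-compatible API convention.

```python
import base64
import os
import httpx

base_url = os.environ["WAKAPI_URL"]
api_key = os.environ["WAKAPI_API_KEY"]
api_path = os.environ.get("WAKAPI_API_PATH", "/compat/wakatime/v1")

# Base64-encode the raw API key and send it as a Bearer token, as described above.
token = base64.b64encode(api_key.encode()).decode()
headers = {"Authorization": f"Bearer {token}"}

# "current" as the user is an assumption for illustration.
resp = httpx.get(f"{base_url}{api_path}/users/current/all_time_since_today", headers=headers)
print(resp.status_code, resp.json())
```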
- --transport stdio: Uses STDIO transport (default). Can be used directly with MCP clients such as opencode.
- --transport sse --port 8001: Uses SSE (HTTP) transport. Accessible via a browser or over HTTP.
Testing
You can test the server using pytest:
# Run all tests
pytest
# Run specific tests
pytest tests/test_mcp_server.py -v
License
Apache License 2.0
Contributing
Issues and Pull Requests are welcome.