Honeybadger MCP Server
A Model Context Protocol (MCP) server implementation for interacting with the Honeybadger API. This server allows AI agents to fetch and analyze error data from your Honeybadger projects.
Overview
This MCP server provides a bridge between AI agents and the Honeybadger error monitoring service. It follows the best practices laid out by Anthropic for building MCP servers, allowing seamless integration with any MCP-compatible client.
Features
The server provides two essential tools for interacting with Honeybadger:
- list_faults: List and filter faults from your Honeybadger project
  - Search by text query
  - Filter by creation or occurrence timestamps
  - Sort by frequency or recency
  - Paginate results
- get_fault_details: Get detailed information about specific faults
  - Filter notices by creation time
  - Paginate through notices
  - Results ordered by creation time descending
Prerequisites
- Python 3.10+
- Honeybadger API key and Project ID
- Docker, if running the MCP server as a container (recommended)
Installation
Using uv
- Install uv if you don't have it:
pip install uv
- Clone this repository:
git clone https://github.com/bobtista/honeybadger-mcp.git
cd honeybadger-mcp
- Install dependencies:
uv pip install -e .
- Install development dependencies (optional):
uv pip install -e ".[dev]"
- Create your environment file:
cp .env.example .env # Edit .env with your configuration
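For reference, a filled-in .env might look like the sketch below. The variable names match the Configuration table later in this document; every value here is a placeholder, and the repository's .env.example remains the authoritative template.
# Example .env — placeholder values only
HONEYBADGER_API_KEY=your-api-key
HONEYBADGER_PROJECT_ID=your-project-id
TRANSPORT=sse
HOST=127.0.0.1
PORT=8050
LOG_LEVEL=INFO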
Using Docker (Recommended)
- Build the Docker image:
docker build -t honeybadger/mcp --build-arg PORT=8050 .
- Create a .env file and configure your environment variables
Configuration
You can configure the server using either environment variables or command-line arguments:
| Option | Env Variable | CLI Argument | Default | Description |
|---|---|---|---|---|
| API Key | HONEYBADGER_API_KEY | --api-key | Required | Your Honeybadger API key |
| Project ID | HONEYBADGER_PROJECT_ID | --project-id | Required | Your Honeybadger project ID |
| Transport | TRANSPORT | --transport | sse | Transport protocol (sse or stdio) |
| Host | HOST | --host | 127.0.0.1 | Host to bind to when using SSE transport |
| Port | PORT | --port | 8050 | Port to listen on when using SSE transport |
| Log Level | LOG_LEVEL | --log-level | INFO | Logging level (INFO, DEBUG, etc.) |
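As an illustration, the same settings can be supplied as CLI arguments to the honeybadger-mcp-server entry point described under Running the Server below; all values here are placeholders:
honeybadger-mcp-server \
  --api-key your-key \
  --project-id your-project \
  --transport sse \
  --host 0.0.0.0 \
  --port 8050 \
  --log-level DEBUG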
Running the Server
Running with uv (Development)
SSE Transport (Default)
# Using environment variables:
HONEYBADGER_API_KEY=your-key HONEYBADGER_PROJECT_ID=your-project uv run src/honeybadger_mcp_server/server.py
# Using CLI arguments:
uv run src/honeybadger_mcp_server/server.py --api-key your-key --project-id your-project
Using Stdio
uv run src/honeybadger_mcp_server/server.py --transport stdio --api-key your-key --project-id your-project
Running Installed Package
SSE Transport (Default)
# Using environment variables:
HONEYBADGER_API_KEY=your-key HONEYBADGER_PROJECT_ID=your-project honeybadger-mcp-server
# Using CLI arguments:
honeybadger-mcp-server --api-key your-key --project-id your-project
Using Stdio
honeybadger-mcp-server --transport stdio --api-key your-key --project-id your-project
Using Docker
Run with SSE
docker run --env-file .env -p 8050:8050 honeybadger/mcp
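If you would rather not use an .env file, the same variables can be passed individually with -e. The values below are placeholders; depending on how the image configures HOST, you may need HOST=0.0.0.0 so the SSE endpoint is reachable through the published port:
docker run --rm -p 8050:8050 \
  -e HONEYBADGER_API_KEY=your-key \
  -e HONEYBADGER_PROJECT_ID=your-project \
  -e HOST=0.0.0.0 \
  honeybadger/mcp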
Using Stdio
With stdio, the MCP client itself spins up the MCP server container, so there is nothing to run at this point.
Integration with MCP Clients
SSE Configuration
Once you have the server running with SSE transport, you can connect to it using this configuration:
{
"mcpServers": {
"honeybadger": {
"transport": "sse",
"url": "http://localhost:8050/sse"
}
}
}
Claude Desktop Configuration
Using SSE Transport (Recommended)
First, start the server:
honeybadger-mcp-server --api-key your-key --project-id your-project
Then add to your Claude Desktop config:
{
"mcpServers": {
"honeybadger": {
"transport": "sse",
"url": "http://localhost:8050/sse"
}
}
}
Using Stdio Transport
Add to your Claude Desktop config:
{
"mcpServers": {
"honeybadger": {
"command": "uv",
"args": [
"run",
"--project",
"/path/to/honeybadger-mcp",
"src/honeybadger_mcp_server/server.py",
"--transport",
"stdio",
"--api-key",
"YOUR-API-KEY",
"--project-id",
"YOUR-PROJECT-ID"
]
}
}
}
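Alternatively, if the package is installed (uv pip install -e .) and the honeybadger-mcp-server script is on the PATH that Claude Desktop uses, a stdio configuration can invoke it directly, mirroring the command shown under Running Installed Package:
{
  "mcpServers": {
    "honeybadger": {
      "command": "honeybadger-mcp-server",
      "args": [
        "--transport",
        "stdio",
        "--api-key",
        "YOUR-API-KEY",
        "--project-id",
        "YOUR-PROJECT-ID"
      ]
    }
  }
}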
Docker Configuration
{
"mcpServers": {
"honeybadger": {
"command": "docker",
"args": [
"run",
"--rm",
"-i",
"honeybadger/mcp",
"--transport",
"stdio",
"--api-key",
"YOUR-API-KEY",
"--project-id",
"YOUR-PROJECT-ID"
]
}
}
}
Tool Usage Examples
List Faults
result = await client.call_tool("list_faults", {
"q": "RuntimeError", # Optional search term
"created_after": 1710806400, # Unix timestamp (2024-03-19T00:00:00Z)
"occurred_after": 1710806400, # Filter by occurrence time
"limit": 10, # Max 25 results
"order": "recent" # 'recent' or 'frequent'
})
Get Fault Details
result = await client.call_tool("get_fault_details", {
"fault_id": "abc123",
"created_after": 1710806400, # Unix timestamp
"created_before": 1710892800, # Optional end time
"limit": 5 # Number of notices (max 25)
})
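The snippets above assume an already-initialized MCP client session (called client here). As a rough sketch, and assuming the server is running over SSE with the defaults from the Configuration table, one way to create such a session with the official mcp Python SDK is:
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client


async def main() -> None:
    # Open an SSE connection to the running server (default host/port).
    async with sse_client("http://localhost:8050/sse") as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()  # MCP handshake
            result = await session.call_tool("list_faults", {"limit": 10, "order": "recent"})
            print(result)


asyncio.run(main())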
Development
Running Tests
# Install dev dependencies
uv pip install -e ".[dev]"
# Run tests
pytest
Code Quality
# Run type checker
pyright
# Run linter
ruff check .
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.