
# Grafana MCP Server

A server that enables AI assistants to access and query Grafana dashboards, metrics, logs, and configurations through the Model Context Protocol (MCP).
## Available Tools

The following tools are available via the MCP server:
- `test_connection`: Verify connectivity to your Grafana instance and configuration.
- `grafana_promql_query`: Execute PromQL queries against Grafana's Prometheus datasource. Fetches metrics data using PromQL expressions and optimizes time-series responses to reduce token size.
- `grafana_loki_query`: Query Grafana Loki for log data. Fetches logs for a specified duration (e.g., `5m`, `1h`, `2d`) and converts relative times to absolute timestamps.
- `grafana_get_dashboard_config`: Retrieve dashboard configuration details from the database. Queries the `connectors_connectormetadatamodelstore` table for dashboard metadata.
- `grafana_query_dashboard_panels`: Execute queries for specific dashboard panels. Can query up to 4 panels at once, supports template variables, and optimizes metrics data.
- `grafana_fetch_label_values`: Fetch label values for dashboard variables from the Prometheus datasource. Retrieves available values for specific labels (e.g., `instance`, `job`).
- `grafana_fetch_dashboard_variables`: Fetch all variables and their values from a Grafana dashboard. Retrieves dashboard template variables and their current values.
- `grafana_fetch_all_dashboards`: Fetch all dashboards from Grafana with basic information such as title, UID, folder, and tags.
- `grafana_fetch_datasources`: Fetch all datasources from Grafana with their configuration details.
- `grafana_fetch_folders`: Fetch all folders from Grafana with their metadata and permissions.
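Tool invocations follow the standard MCP JSON-RPC `tools/call` shape. As a sketch of what a `grafana_promql_query` call could look like, here is a request payload built in Python; note that the argument names (`query`, `duration`) are illustrative assumptions, not this server's documented schema — discover the real input schema via `tools/list`:

```python
import json

# A minimal MCP "tools/call" request (JSON-RPC 2.0). The argument names
# below are illustrative, not taken from this server's schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "grafana_promql_query",
        "arguments": {
            "query": "sum(rate(http_requests_total[5m]))",
            "duration": "1h",
        },
    },
}

payload = json.dumps(request)
print(payload)
```

The same payload shape applies to every tool listed above; only `params.name` and `params.arguments` change.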
## 🚀 Usage & Requirements
### 1. Get Your Grafana API Endpoint & API Key

- Ensure you have a running Grafana instance (self-hosted or cloud).
- Generate an API key from the Grafana UI:
  - Go to Configuration → API Keys
  - Create a new API key with appropriate permissions (the Admin role is recommended for full access)
  - Copy the API key (it starts with `glsa_`)
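Before wiring the key into the server, you can sanity-check it against Grafana's HTTP API directly. A minimal sketch using only the Python standard library (the host below is a placeholder for your instance):

```python
import urllib.request

def auth_header(api_key: str) -> dict:
    # Grafana API keys and service account tokens are sent as a Bearer token.
    return {"Authorization": f"Bearer {api_key}"}

def check_key(host: str, api_key: str) -> bool:
    """Return True if GET /api/org answers 200 with this key."""
    req = urllib.request.Request(
        host.rstrip("/") + "/api/org",
        headers=auth_header(api_key),
    )
    try:
        with urllib.request.urlopen(req, timeout=5) as resp:
            return resp.status == 200
    except Exception:
        # Bad key, unreachable host, or TLS failure all read as "not valid".
        return False
```

For example, `check_key("https://your-grafana-instance.com", "glsa_...")` should return `True` for a valid key and `False` otherwise.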
### 2. Installation & Running Options
#### 2A. Install & Run with uv (Recommended for Local Development)

Install dependencies:

```bash
uv venv .venv
source .venv/bin/activate
uv sync
```

Run the server:

```bash
uv run grafana-mcp-server/src/grafana_mcp_server/mcp_server.py
```
- You can also use `uv` to run any other entrypoint scripts as needed.
- Make sure your `config.yaml` is in the same directory as `mcp_server.py`, or set the required environment variables (see the Configuration section).
#### 2B. Run with Docker Compose (Recommended for Production/Containerized Environments)

- Edit `grafana-mcp-server/src/grafana_mcp_server/config.yaml` with your Grafana details (host, API key).
- Start the server:

```bash
docker compose up -d
```

- The server runs in HTTP (SSE) mode on port 8000 by default.
- You can override the configuration with environment variables (see below).
#### 2C. Run with Docker Image (Manual)

Build the image:

```bash
docker build -t grafana-mcp-server .
```

Run the container with the YAML config fallback:

```bash
docker run -d \
  -p 8000:8000 \
  -v $(pwd)/grafana-mcp-server/src/grafana_mcp_server/config.yaml:/app/config.yaml:ro \
  --name grafana-mcp-server \
  grafana-mcp-server
```

Or run with environment variables (recommended for CI/Docker MCP clients):

```bash
docker run -d \
  -p 8000:8000 \
  -e GRAFANA_HOST="https://your-grafana-instance.com" \
  -e GRAFANA_API_KEY="your-grafana-api-key-here" \
  -e GRAFANA_SSL_VERIFY="true" \
  -e MCP_SERVER_PORT=8000 \
  -e MCP_SERVER_DEBUG=true \
  --name grafana-mcp-server \
  grafana-mcp-server
```
### 3. Configuration

The server loads configuration in the following order of precedence:

1. Environment variables (recommended for Docker/CI):
   - `GRAFANA_HOST`: Grafana instance URL (e.g. `https://your-grafana-instance.com`)
   - `GRAFANA_API_KEY`: Grafana API key (required)
   - `GRAFANA_SSL_VERIFY`: `true` or `false` (default: `true`)
   - `MCP_SERVER_PORT`: Port to run the server on (default: `8000`)
   - `MCP_SERVER_DEBUG`: `true` or `false` (default: `true`)
2. YAML file fallback (`config.yaml`):

```yaml
grafana:
  host: "https://your-grafana-instance.com"
  api_key: "your-grafana-api-key-here"
  ssl_verify: "true"
server:
  port: 8000
  debug: true
```
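The precedence above amounts to: environment variable first, then the YAML value, then a built-in default. A standalone sketch of that lookup (a plain dict stands in for the parsed `config.yaml`; the real server would load it with a YAML parser):

```python
import os

# Stand-in for the parsed config.yaml; keys mirror the file above.
yaml_config = {
    "grafana": {"host": "https://your-grafana-instance.com", "ssl_verify": "true"},
    "server": {"port": 8000, "debug": True},
}

def setting(env_var: str, yaml_value, default):
    """Resolve one setting: environment variable wins, then YAML, then default."""
    if env_var in os.environ:
        return os.environ[env_var]
    if yaml_value is not None:
        return yaml_value
    return default

host = setting("GRAFANA_HOST", yaml_config["grafana"].get("host"), None)
port = int(setting("MCP_SERVER_PORT", yaml_config["server"].get("port"), 8000))
```

Note that values read from the environment arrive as strings, so numeric settings such as the port need an explicit conversion.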
### 4. Integration with AI Assistants (e.g., Claude Desktop, Cursor)

You can integrate this MCP server with any tool that supports the MCP protocol. Here are the main options:

#### 4A. Using Local Setup (with uv)

Before running the server locally, install dependencies with uv:

```bash
uv sync
```

Then add the server to your client configuration (e.g., `claude-desktop.json`):

```json
{
  "mcpServers": {
    "grafana": {
      "command": "uv",
      "args": [
        "run",
        "/full/path/to/grafana-mcp-server/src/grafana_mcp_server/mcp_server.py"
      ],
      "env": {
        "GRAFANA_HOST": "https://your-grafana-instance.com",
        "GRAFANA_API_KEY": "your-grafana-api-key-here",
        "GRAFANA_SSL_VERIFY": "true"
      }
    }
  }
}
```
- Ensure your `config.yaml` is in the same directory as `mcp_server.py`, or update the path accordingly.
#### 4B. Using Docker Compose or Docker (with environment variables)

```json
{
  "mcpServers": {
    "grafana": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "-i",
        "-e",
        "GRAFANA_HOST",
        "-e",
        "GRAFANA_API_KEY",
        "-e",
        "GRAFANA_SSL_VERIFY",
        "grafana-mcp-server",
        "-t",
        "stdio"
      ],
      "env": {
        "GRAFANA_HOST": "https://your-grafana-instance.com",
        "GRAFANA_API_KEY": "your-grafana-api-key-here",
        "GRAFANA_SSL_VERIFY": "true"
      }
    }
  }
}
```
- The `-t stdio` argument is supported for compatibility with Docker MCP clients (it forces stdio handshake mode).
- Adjust the volume path or environment variables as needed for your deployment.
#### 4C. Connecting to an Already Running MCP Server (HTTP/SSE)

If you already have an MCP server running (e.g., on a remote host, cloud VM, or Kubernetes), you can connect your AI assistant or tool directly to its HTTP endpoint.

Example: Claude Desktop or a similar tool:

```json
{
  "mcpServers": {
    "grafana": {
      "url": "http://your-server-host:8000/mcp"
    }
  }
}
```
- Replace `your-server-host` with the actual host where your MCP server is running.
- For a local setup, use `localhost` as the server host (i.e., `http://localhost:8000/mcp`).
- Use `http` for local or unsecured deployments, and `https` for production or secured deployments.
- Make sure the server is accessible from your client machine (check firewalls, security groups, etc.).
Example: MCP config YAML:

```yaml
mcp:
  endpoint: "http://your-server-host:8000/mcp"
  protocolVersion: "2025-06-18"
```
- Replace `your-server-host` with the actual host where your MCP server is running.
- For a local setup, use `localhost` as the server host (i.e., `http://localhost:8000/mcp`).
- Use `http` or `https` in the URL scheme depending on how you've deployed the MCP server.
- There is no need to specify `command` or `args`; just point to the HTTP endpoint.
- This works for any tool or assistant that supports MCP over HTTP.
- The server must be running in HTTP (SSE) mode (the default for this implementation).
#### Health Check

```bash
curl http://localhost:8000/health
```

The server runs on port 8000 by default.
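The same check from Python, e.g. for a readiness probe in your own tooling. A minimal sketch using only the standard library, assuming the default port and that any 200 response from `/health` means healthy:

```python
import urllib.request

def is_healthy(base_url: str = "http://localhost:8000") -> bool:
    """Return True if GET /health on the MCP server answers 200."""
    try:
        with urllib.request.urlopen(base_url.rstrip("/") + "/health",
                                    timeout=3) as resp:
            return resp.status == 200
    except Exception:
        # Connection refused, timeout, or non-2xx all count as unhealthy.
        return False
```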
### 5. Project Structure

```
grafana-mcp-server/
├── grafana-mcp-server/
│   └── src/
│       └── grafana_mcp_server/
│           ├── __init__.py
│           ├── config.yaml           # Configuration file
│           ├── mcp_server.py         # Main MCP server implementation
│           ├── stdio_server.py       # STDIO server for MCP
│           └── processor/
│               ├── __init__.py
│               ├── grafana_processor.py  # Grafana API processor
│               └── processor.py          # Base processor interface
├── tests/
├── Dockerfile
├── docker-compose.yml
├── pyproject.toml
└── README.md
```
### 6. Troubleshooting

#### Common Issues

- **Connection failed:**
  - Verify your Grafana instance is running and accessible.
  - Check that your API key has the proper permissions.
  - Ensure the SSL verification setting matches your setup.
- **Authentication errors:**
  - Verify your API key is correct and not expired.
  - Check whether your Grafana instance requires additional authentication.
- **Query failures:**
  - Ensure datasource UIDs are correct.
  - Verify the PromQL/Loki query syntax.
  - Check that the datasource is accessible with your API key.

#### Debug Mode

Enable debug mode to get more detailed logs:

```bash
export MCP_SERVER_DEBUG=true
```
### 7. Contributing

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add some amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
### 8. License

This project is licensed under the MIT License; see the LICENSE file for details.
### 9. Support

- Need help? Join our Slack community and message us on the #mcp channel.
- Want a 1-click MCP server? Join the same community and let us know.
- For issues and questions, open an issue on GitHub or contact the maintainers.