Loki MCP Server
An MCP (Model Context Protocol) server for semantic log querying via Loki. Designed to help AI models answer natural language questions about your cluster logs.
Features
- Error Summary: Aggregate errors across your cluster with breakdowns by type
- Pod Restart Detection: Find crashing/restarting pods
- Log Search: Regex-based log search across your cluster
- Namespace/Pod Discovery: List available namespaces and query specific pods
- Semantic Tool Design: Tool names and parameters match natural language questions
Tools
get_error_summary
Get a summary of errors happening in your cluster.
- namespace: Filter to a specific namespace (empty = all)
- hours: Look back this many hours (default: 1)
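The README doesn't show the queries behind this tool. As an illustration only, an error summary broken down by namespace could be produced with a LogQL aggregation along these lines (the case-insensitive `error` regex and the `namespace` label are assumptions about how the tool is implemented):

```logql
sum by (namespace) (
  count_over_time({namespace=~".+"} |~ "(?i)error" [1h])
)
```

The `[1h]` range would be derived from the tool's `hours` parameter.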
find_pod_restarts
Find pods that have restarted or crashed recently.
- namespace: Filter to a specific namespace (empty = all)
- hours: Look back this many hours (default: 1)
search_logs
Search logs with a regex pattern.
- query: Regex pattern to search for
- namespace: Filter to a specific namespace (empty = all)
- hours: Look back this many hours (default: 1)
- limit: Maximum log lines to return (default: 100)
list_namespaces
List all namespaces that have logs in Loki.
get_pod_logs
Get logs for a specific pod.
- pod_name: Pod name (supports wildcards like ollama*)
- namespace: Namespace of the pod (empty = search all)
- hours: Look back this many hours (default: 1)
- limit: Maximum log lines to return (default: 100)
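To make the tool parameters concrete, here is a minimal sketch of how search_logs might translate its parameters into a request against Loki's `/loki/api/v1/query_range` endpoint. The function name `build_search_request` and the default Loki URL are hypothetical; only the Loki endpoint, its query parameters, and the LogQL `|~` line-filter syntax are real.

```python
import time
from urllib.parse import urlencode


def build_search_request(query, namespace="", hours=1, limit=100,
                         loki_url="http://loki:3100"):
    """Translate search_logs parameters into a Loki query_range URL.

    The stream selector is scoped to one namespace when given; otherwise
    it matches any non-empty namespace. The regex is applied as a |~
    line filter. Loki expects start/end as nanosecond Unix timestamps.
    """
    selector = f'{{namespace="{namespace}"}}' if namespace else '{namespace=~".+"}'
    logql = f'{selector} |~ "{query}"'
    end = int(time.time())
    start = end - hours * 3600
    params = {
        "query": logql,
        "start": start * 10**9,
        "end": end * 10**9,
        "limit": limit,
    }
    return f"{loki_url}/loki/api/v1/query_range?{urlencode(params)}"
```

For example, `build_search_request("OOMKilled", namespace="kube-system", hours=2)` yields a query over the last two hours scoped to the kube-system streams.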
Development
# Install dependencies
uv sync
# Run the server
uv run server.py
The server exposes FastMCP over streamable HTTP at /mcp (default port 8000).
Deployment (In-Cluster)
Add this server to your local-k8s-apps Helm values with:
- A Deployment running the MCP server
- Exposed via HTTP on port 8000
- An environment variable LOKI_URL pointing to the Loki service
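The points above might look like the following values fragment. All key names here are illustrative guesses; the actual schema of the local-k8s-apps chart will differ, and the Loki service address depends on where Loki is installed:

```yaml
# Hypothetical local-k8s-apps values fragment (key names are assumptions)
loki-mcp:
  image: loki-mcp-server:latest
  port: 8000
  env:
    LOKI_URL: http://loki.monitoring.svc.cluster.local:3100
```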
The Ollama MCP bridge will load this server's tools and expose them to the model.