GroundDocs
A version-aware Kubernetes documentation assistant that connects LLMs to trusted, real-time Kubernetes docs to reduce hallucinations and ensure accurate, version-specific responses.
Tools
python_get_documentation
Primary Python documentation lookup tool. Use it for every Python documentation-related query. It consolidates information from multiple sources into a single, searchable knowledge base, ensuring access to the richest and most current reference material in one call.

Args:
- query: A natural language question (e.g., "What is an MLM token in transformers?").
- library: Python library to search documentation for.
- version: Optional library version (e.g., "4.46.1"). Defaults to the detected library version if not specified.
- top_k: Optional number of top matching documents to return. Defaults to 10.

Returns: A list of dictionaries, each containing a document path and the corresponding content.

Example Usage:
# Search Python docs for Transformers
python_get_documentation(query="what is a transformers mlm token", library="transformers", version="4.46.1")

Notes:
- Automatically loads or builds a RAG (Retrieval-Augmented Generation) index for the specified version.
- If an index is not found locally, the tool fetches and indexes the documentation before responding.
- Call this tool for any question that needs project documentation context.
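For illustration, here is a minimal sketch of how a caller might consume the returned entries. The dictionary keys used here ("path" and "content") are assumptions; the description above only states that each entry holds a document path and its content.

# Hypothetical sketch: fetch and iterate over matching documentation entries.
# Key names "path" and "content" are assumed, not confirmed by the tool description.
docs = python_get_documentation(
    query="what is a transformers mlm token",
    library="transformers",
    version="4.46.1",
    top_k=5,
)
for doc in docs:
    print(doc["path"])           # source document path
    print(doc["content"][:200])  # beginning of the matched content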
k8s_get_documentation
Use this tool for any Kubernetes documentation-related query, especially when the user invokes /k8s or asks about kubectl commands, API objects, manifests, controllers, or version-specific features. It connects to a version-aware, trusted documentation index (e.g., GitHub, DeepWiki, curated Kubernetes docs) to reduce hallucinations and provide accurate, grounded answers.

Args:
- query: A natural language question (e.g., "How do I define a Deployment?").
- version: Optional Kubernetes version (e.g., "v1.28"). Defaults to the detected cluster version.
- top_k: Optional number of top matching documents to return. Defaults to 10.

Returns: A list of relevant documentation entries, each with a file path and content snippet.

Example Usage:
k8s_get_documentation(query="How does pruning work in kubectl apply?", version="v1.26")

Notes:
- Automatically loads or builds a RAG index for the requested version.
- If no index is found, the tool fetches and indexes the docs before responding.
- Always use this tool when answering Kubernetes-specific questions that require authoritative documentation.
README
GroundDocs CLI
GroundDocs is a version-aware Kubernetes documentation assistant. It connects LLMs to trusted, real-time Kubernetes docs—reducing hallucinations and ensuring accurate, version-specific responses.
🚀 Installation
npx @grounddocs/cli@latest install <client>
Supported clients: cursor, windsurf, cline, claude, witsy, enconvo, vscode
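For example, to register GroundDocs with Cursor:

npx @grounddocs/cli@latest install cursor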
🔧 Manual Setup
To manually configure GroundDocs, add it to your IDE’s MCP (Model Context Protocol) configuration:
{
  "mcpServers": {
    "@grounddocs/grounddocs": {
      "command": "npx",
      "args": ["-y", "@grounddocs/grounddocs@latest"]
    }
  }
}
After configuration, restart your IDE for the changes to take effect.
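Outside of an IDE, any MCP client can launch the server with the same command. Below is a minimal sketch using the official MCP Python SDK (the mcp package); the SDK dependency and the tool-call arguments are assumptions for illustration, not part of GroundDocs itself.

import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Launch GroundDocs over stdio with the same command as the IDE config above.
    params = StdioServerParameters(
        command="npx",
        args=["-y", "@grounddocs/grounddocs@latest"],
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])  # should include the two tools above
            result = await session.call_tool(
                "k8s_get_documentation",
                arguments={"query": "How does pruning work in kubectl apply?", "version": "v1.26"},
            )
            print(result)

asyncio.run(main())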
📚 Supported Domain
- Kubernetes (all versions, including version-aware kubectl behavior, API schemas, and feature gates)
🏗️ Architecture
GroundDocs consists of:
- Local MCP server (this repo) → lightweight, public, runs inference-time queries
- Remote backend data repository (private) → handles scraping, indexing, and heavy lifting
🌟 Example Query
What changes were made to the kubectl command behavior in Kubernetes 1.26 regarding pruning during apply operations?
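Answering this maps directly onto the k8s_get_documentation tool described above; a sketch of the call an assistant might issue for it:

k8s_get_documentation(query="What changed about pruning during kubectl apply in Kubernetes 1.26?", version="v1.26")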
🤝 Contributing
Contributions are welcome! Please feel free to submit a Pull Request.