Cloudscape Docs MCP Server

A Model Context Protocol (MCP) server that provides semantic search over AWS Cloudscape Design System documentation. Built for AI agents and coding assistants to efficiently query component documentation.

Features

  • Semantic Search - Find relevant documentation using natural language queries, powered by the Jina Code Embeddings 0.5B model
  • Token Efficient - Returns concise file lists first, full content on demand
  • Hardware Optimized - Automatic detection of Apple Silicon (MPS), CUDA, or CPU
  • Local Vector Store - Uses LanceDB for fast, file-based vector search
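
The hardware detection mentioned above is typically a small device-selection helper; a minimal sketch assuming PyTorch (the function name detect_device is illustrative, not taken from server.py):

import torch

def detect_device() -> str:
    # Prefer Apple Silicon (MPS), then CUDA, then fall back to CPU
    if torch.backends.mps.is_available():
        return "mps"
    if torch.cuda.is_available():
        return "cuda"
    return "cpu"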

Transport

This server uses the MCP stdio transport protocol.
Streamable HTTP transport coming soon.

Tools

  • cloudscape_search_docs - Search the documentation index. Returns the top 5 relevant files with titles and paths.
  • cloudscape_read_doc - Read the full content of a specific documentation file.
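
For orientation, a hedged sketch of how these two tools could be wired up with the MCP Python SDK's FastMCP and LanceDB; the table name, the title and path fields, and the embed() helper are assumptions for illustration, not the actual server.py:

from pathlib import Path
import lancedb
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("cloudscape-docs")
db = lancedb.connect("./data/lancedb")

@mcp.tool()
def cloudscape_search_docs(query: str) -> str:
    """Search the index and return the top 5 files with titles and paths."""
    table = db.open_table("docs")            # table name is an assumption
    vector = embed(query)                    # embed() assumed: query embedding via the Jina model
    hits = table.search(vector).limit(5).to_list()
    return "\n".join(f"{h['title']} ({h['path']})" for h in hits)

@mcp.tool()
def cloudscape_read_doc(path: str) -> str:
    """Return the full content of a documentation file."""
    return Path(path).read_text(encoding="utf-8")

if __name__ == "__main__":
    mcp.run()  # stdio transport by default, matching the Transport section above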

Cloudscape Docs MCP Tools in Action


Requirements

  • Python 3.13+
  • ~3GB disk space for the embedding model
  • 8GB+ RAM recommended

Installation

# Clone the repository
git clone https://github.com/praveenc/cloudscape-docs-mcp.git
cd cloudscape-docs-mcp

# Create virtual environment and install dependencies
uv sync

# Or with pip
pip install -e .

Setup

1. Add Documentation

Place your Cloudscape documentation files in the docs/ directory. Supported formats:

  • .md (Markdown)
  • .txt (Plain text)
  • .tsx / .ts (TypeScript/React)

2. Build the Index

Run the ingestion script to create the vector database:

uv run ingest.py

This will:

  • Scan all files in docs/
  • Chunk content into ~2000 character segments
  • Generate embeddings using the Jina Code Embeddings 0.5B model
  • Store vectors in data/lancedb/

Note: Running uv run ingest.py multiple times is safe but performs a full re-index each time. The script uses mode="overwrite", which drops and recreates the database table; there is no incremental update or change detection, so every document is re-scanned and re-embedded on each run. This is idempotent (the same docs produce the same result) but computationally expensive for large documentation sets.
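
A rough sketch of that flow, assuming LanceDB's Python API; the fixed-size chunk() helper and the embed() call stand in for whatever ingest.py actually does:

from pathlib import Path
import lancedb

DOCS_DIR = Path("./docs")
SUPPORTED = {".md", ".txt", ".tsx", ".ts"}
CHUNK_SIZE = 2000  # ~2000-character segments

def chunk(text: str, size: int = CHUNK_SIZE) -> list[str]:
    # Naive fixed-size split; the real script may split on headings or sentences
    return [text[i:i + size] for i in range(0, len(text), size)]

rows = []
for f in sorted(DOCS_DIR.rglob("*")):
    if not f.is_file() or f.suffix not in SUPPORTED:
        continue
    for segment in chunk(f.read_text(encoding="utf-8", errors="ignore")):
        rows.append({
            "path": str(f),
            "title": f.stem,
            "text": segment,
            "vector": embed(segment),  # embed() assumed: Jina Code Embeddings 0.5B output
        })

db = lancedb.connect("./data/lancedb")
# mode="overwrite" drops and recreates the table, which is why every run is a full re-index
db.create_table("docs", data=rows, mode="overwrite")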

3. Run the Server

uv run server.py

MCP Client Configuration

Claude Desktop

Add to your mcp.json:

{
  "mcpServers": {
    "cloudscape-docs": {
      "command": "uv",
      "args": ["run", "--directory", "/path/to/cloudscape-docs-mcp", "python", "server.py"]
    }
  }
}

Cursor / VS Code / Windsurf / Kiro

Add to your MCP settings:

{
  "cloudscape-docs": {
    "command": "uv",
    "args": ["run", "--directory", "/path/to/cloudscape-docs-mcp", "python", "server.py"]
  }
}

Zed

Add to your Zed settings (settings.json):

{
  "context_servers": {
    "cloudscape-docs": {
      "command": {
        "path": "uv",
        "args": ["run", "--directory", "/path/to/cloudscape-docs-mcp", "python", "server.py"]
      }
    }
  }
}

Usage Example

Once connected, an AI assistant can:

  1. Search for components:

    User: "How do I use the Table component with sorting?"
    Agent: [calls cloudscape_search_docs("table sorting")]
    
  2. Read specific documentation:

    Agent: [calls cloudscape_read_doc("docs/components/table/sorting.md")]
    
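For reference, the same flow can be driven programmatically with a minimal Python MCP client over stdio; the tool argument names (query, path) are assumptions about the server's input schema:

import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    params = StdioServerParameters(
        command="uv",
        args=["run", "--directory", "/path/to/cloudscape-docs-mcp", "python", "server.py"],
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            hits = await session.call_tool("cloudscape_search_docs", {"query": "table sorting"})
            doc = await session.call_tool("cloudscape_read_doc", {"path": "docs/components/table/sorting.md"})
            print(hits, doc)

asyncio.run(main())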

Project Structure

cloudscape-docs-mcp/
├── server.py          # MCP server with search/read tools
├── ingest.py          # Documentation indexing script
├── pyproject.toml     # Project dependencies
├── docs/              # Documentation files (partially curated)
│   ├── components/    # Component documentation
│   ├── foundations/   # Design foundations
│   └── genai_patterns/# GenAI UI patterns
└── data/              # Generated vector database (gitignored)
    └── lancedb/

Configuration

Key settings in server.py and ingest.py:

  • MODEL_NAME (default: jinaai/jina-code-embeddings-0.5b) - Embedding model
  • VECTOR_DIM (default: 1536) - Vector dimensions
  • MAX_UNIQUE_RESULTS (default: 5) - Max search results returned
  • DOCS_DIR (default: ./docs) - Documentation source directory
  • DB_URI (default: ./data/lancedb) - Vector database location
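
These read as plain module-level constants; a sketch of how they might appear near the top of server.py and ingest.py, using the defaults listed above:

MODEL_NAME = "jinaai/jina-code-embeddings-0.5b"  # embedding model
VECTOR_DIM = 1536                                # vector dimensions
MAX_UNIQUE_RESULTS = 5                           # max search results returned
DOCS_DIR = "./docs"                              # documentation source directory
DB_URI = "./data/lancedb"                        # vector database location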

Development

# Install dev dependencies
uv sync --group dev

# Run with MCP inspector
npx @modelcontextprotocol/inspector uv --directory /path/to/cloudscape-docs-mcp run server.py
# Alternatively, use mcp cli to launch the server
mcp dev server.py

License

MIT License - See LICENSE for details.

Acknowledgments
