# Cartesi Knowledge MCP Server
Enables AI agents to query curated Cartesi developer resources, documentation, and repository metadata through the Model Context Protocol. Provides read-only knowledge access and host-side workflow guidance for building Cartesi blockchain applications via streamable HTTP.
Production-minded Model Context Protocol server that exposes curated Cartesi developer resources from PostgreSQL to AI agents over streamable HTTP.
## Current capabilities

- FastMCP (`mcp[cli]` 1.26.x) with `streamable_http_app()` — use FastMCP's Starlette app directly in production so session lifespan runs correctly (see `create_app()` in `src/main.py`).
- Async SQLAlchemy + asyncpg for read-only access to the knowledge database.
- Layered layout: config and logging (`src/core/`), DB session and models (`src/db/`), repositories, domain service (`src/domain/resource_service.py`), schemas, formatters, and server modules under `src/server/`.
- Transport security: DNS rebinding protection and configurable `allowed_hosts`/`allowed_origins` in `src/server/server.py` (extend for your deployment hostname).
- Plain HTTP health: `GET /healthz` returns `{"status":"ok"}` alongside the MCP route.
Knowledge responses are metadata and links (titles, URIs, `canonical_url`, doc routes). They do not include full fetched page bodies; agents should fetch external URLs when they need raw HTML or markdown.

Workflow tools (`prepare_cartesi_*`, `send_input_to_application`, `prepare_*_deposit_instructions`, `get_cartesi_app_logic_guidance`) only return instructions and command templates for the user's machine. They do not run the Cartesi CLI, `cast`, or chain RPC from this server.
## Requirements

- Python ≥ 3.11 (see `pyproject.toml`; the included `Dockerfile` uses Python 3.12).
- A PostgreSQL database populated with the curated resource schema expected by `src/db/models.py` and `ResourceService`.
## Environment variables

Copy `.env.example` to `.env` and adjust. Defaults and field names are defined in `src/core/config.py` (notably `DATABASE_URL`, `APP_HOST`, `APP_PORT`, `MCP_BASE_URL`, and pagination limits).
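For illustration, a `.env` might look like this. All values are placeholders; only the variable names called out above come from `src/core/config.py`, and the database URL/credentials must be replaced with your own:

```env
DATABASE_URL=postgresql+asyncpg://user:password@localhost:5432/cartesi_knowledge
APP_HOST=0.0.0.0
APP_PORT=8000
MCP_BASE_URL=http://localhost:8000
```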
## Install

Using uv (recommended):

```shell
uv sync
```

Using pip:

```shell
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```
## Run

```shell
# Directly:
python -m src.main

# Or via uv:
uv run python -m src.main

# Or with uvicorn and the app factory:
uv run uvicorn src.main:create_app --factory --host 0.0.0.0 --port 8000
```
The MCP endpoint is streamable HTTP at `http://<host>:<port>/mcp` (default: `http://0.0.0.0:8000/mcp`).
## Docker

The repository includes a multi-stage `Dockerfile` that installs dependencies with uv and runs `python -m src.main`. Set `DATABASE_URL` and other env vars at runtime (for example via `-e` or your orchestrator).
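A build-and-run sketch under those assumptions (the image tag is arbitrary, and the `DATABASE_URL` value is a placeholder for your own database):

```shell
docker build -t cartesi-knowledge-mcp .
docker run --rm -p 8000:8000 \
  -e DATABASE_URL=postgresql+asyncpg://user:password@db-host:5432/cartesi_knowledge \
  cartesi-knowledge-mcp
```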
## Suggested client test

Use MCP Inspector or any MCP-compatible client and connect to `http://localhost:8000/mcp`.
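Before attaching a full MCP client, it can be useful to confirm the server is reachable via the plain HTTP health route. A small stdlib-only sketch, assuming the default endpoint layout described above (`healthz_url` and `check_health` are hypothetical helpers, not part of this project):

```python
import json
import urllib.request
from urllib.parse import urlsplit, urlunsplit


def healthz_url(mcp_endpoint: str) -> str:
    # Derive the plain-HTTP health route from the MCP endpoint URL,
    # e.g. http://localhost:8000/mcp -> http://localhost:8000/healthz
    parts = urlsplit(mcp_endpoint)
    return urlunsplit((parts.scheme, parts.netloc, "/healthz", "", ""))


def check_health(mcp_endpoint: str) -> bool:
    # Expects the server's documented {"status":"ok"} payload.
    with urllib.request.urlopen(healthz_url(mcp_endpoint), timeout=5) as resp:
        return json.load(resp).get("status") == "ok"


if __name__ == "__main__":
    print(check_health("http://localhost:8000/mcp"))
```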
## MCP resources

| URI | Purpose |
|---|---|
| `cartesi://health` | Server name, environment, MCP_BASE_URL, read-only flag, capabilities, content policy |
| `cartesi://resources` | Catalog: index of resource URIs, tool names, prompts, and suggested agent flow |
| `cartesi://resources/{resource_id}` | Normalized resource metadata |
| `cartesi://docs/{resource_id}` | Documentation resource view (same shape; non-doc IDs error) |
| `cartesi://docs/routes/{route_id}` | Single doc route with parent context |
| `cartesi://repositories/{resource_id}` | Repository sync / freshness metadata |
| `cartesi://collections/tag/{tag}` | Resources grouped by tag |
| `cartesi://collections/source/{source}` | Resources grouped by source |
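A client reads any of these with a standard MCP `resources/read` request over the streamable HTTP endpoint. For example (the tag value is a made-up illustration):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "resources/read",
  "params": { "uri": "cartesi://collections/tag/rollups" }
}
```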
## MCP tools (registered names)

These are the `name=` values clients see (Python handler names may differ).
### Knowledge

- `summarize_knowledge_base` — coverage, counts, orientation
- `get_knowledge_taxonomy` — known tag and source titles
- `search_knowledge_resources` — search by query, tag, source, kind
- `get_resource_detail` — one resource by ID, optional routes
- `list_resource_doc_routes` — routes for a documentation resource
- `search_documentation_routes` — search routes across resources
- `list_resources_for_tag` / `list_resources_for_source`
- `get_repository_sync_status`
- `build_debugging_context` — issue-focused bundle of resources and routes
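A `tools/call` against one of these, sketched as a JSON-RPC message (the argument names follow the "query, tag, source, kind" description above but are assumptions, as are the values):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "search_knowledge_resources",
    "arguments": { "query": "voucher", "kind": "documentation" }
  }
}
```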
### Host-side Cartesi workflow (instructions only)

- `prepare_cartesi_create_command` — stable v1.5.x vs alpha v2.0 create guidance
- `prepare_cartesi_build_command`
- `prepare_cartesi_run_command`
- `send_input_to_application` — InputBox + `cast` templates
- `prepare_erc20_deposit_instructions` — ERC20Portal flow
- `prepare_erc721_deposit_instructions` — ERC721Portal flow
- `prepare_erc1155_deposit_instructions` — ERC1155SinglePortal flow
- `get_cartesi_app_logic_guidance` — address-book, portals, vouchers, notices, reports
## MCP prompts

- `debug_cartesi_issue` — structured debugging using curated knowledge
- `find_cartesi_docs` — doc route discovery for a topic
- `explain_repository_context` — repository resource + status summary