Cartesi Knowledge MCP Server

Production-minded Model Context Protocol server that exposes curated Cartesi developer resources from PostgreSQL to AI agents over streamable HTTP.

Current capabilities

  • FastMCP (mcp[cli] 1.26.x) with streamable_http_app() — use FastMCP’s Starlette app directly in production so session lifespan runs correctly (see create_app() in src/main.py).
  • Async SQLAlchemy + asyncpg for read-only access to the knowledge database.
  • Layered layout: config and logging (src/core/), DB session and models (src/db/), repositories, domain service (src/domain/resource_service.py), schemas, formatters, and server modules under src/server/.
  • Transport security: DNS rebinding protection and configurable allowed_hosts / allowed_origins in src/server/server.py (extend for your deployment hostname).
  • Plain HTTP health: GET /healthz returns {"status":"ok"} alongside the MCP route.

Knowledge responses are metadata and links (titles, URIs, canonical_url, doc routes). They do not include full fetched page bodies; agents should fetch external URLs when they need raw HTML or markdown.
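For instance, an agent consuming such a response might collect the link fields before fetching them itself. This is an illustrative sketch only; the exact payload shape is defined by the server's formatters, and the field names below (uri, canonical_url, routes) are assumptions based on the description above.

```python
# Illustrative only: gather fetchable URLs from a knowledge response.
# The precise payload shape is defined by the server's formatters; this
# assumes the link fields named above (uri, canonical_url, routes).
def collect_urls(resource: dict) -> list[str]:
    urls = []
    for key in ("uri", "canonical_url"):
        value = resource.get(key)
        if value:
            urls.append(value)
    for route in resource.get("routes", []):
        if route.get("url"):
            urls.append(route["url"])
    # De-duplicate while preserving order.
    return list(dict.fromkeys(urls))
```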

Workflow tools (prepare_cartesi_*, send_input_to_application, prepare_*_deposit_instructions, get_cartesi_app_logic_guidance) only return instructions and command templates for the user’s machine. They do not run the Cartesi CLI, cast, or chain RPC from this server.

Requirements

  • Python ≥ 3.11 (see pyproject.toml; the included Dockerfile uses Python 3.12).
  • A PostgreSQL database populated with the curated resource schema expected by src/db/models.py and ResourceService.

Environment variables

Copy .env.example to .env and adjust. Defaults and field names are defined in src/core/config.py (notably DATABASE_URL, APP_HOST, APP_PORT, MCP_BASE_URL, pagination limits).
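A stdlib-only sketch of those fields follows; the authoritative definitions (including pagination limits and any validation) live in src/core/config.py, and the defaults below are illustrative assumptions, not the project's actual values.

```python
# Illustrative sketch of the settings fields named above. The real
# definitions live in src/core/config.py; defaults here are assumptions.
import os
from dataclasses import dataclass

@dataclass(frozen=True)
class Settings:
    database_url: str
    app_host: str
    app_port: int
    mcp_base_url: str

def load_settings() -> Settings:
    return Settings(
        database_url=os.environ.get(
            "DATABASE_URL", "postgresql+asyncpg://localhost/knowledge"
        ),
        app_host=os.environ.get("APP_HOST", "0.0.0.0"),
        app_port=int(os.environ.get("APP_PORT", "8000")),
        mcp_base_url=os.environ.get("MCP_BASE_URL", "http://localhost:8000"),
    )
```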

Install

Using uv (recommended):

uv sync

Using pip:

python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt

Run

Any of the following starts the server:

python -m src.main
uv run python -m src.main
uv run uvicorn src.main:create_app --factory --host 0.0.0.0 --port 8000

The MCP endpoint is streamable HTTP at:

  • http://<host>:<port>/mcp (default: http://0.0.0.0:8000/mcp)

Docker

The repository includes a multi-stage Dockerfile that installs dependencies with uv and runs python -m src.main. Set DATABASE_URL and other env vars at runtime (for example via -e or your orchestrator).

Suggested client test

Use MCP Inspector or any MCP-compatible client and connect to:

http://localhost:8000/mcp
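As a quick smoke test without a full client, you can POST an initialize request to that endpoint. The body below follows the JSON-RPC shape MCP uses; the protocol version string, client info, and required headers (Accept: application/json, text/event-stream) are assumptions to adjust for your client and server versions.

```python
# Sketch of the JSON-RPC initialize body an MCP client POSTs to /mcp.
# Protocol version and client info are illustrative placeholders.
import json

initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "smoke-test", "version": "0.1.0"},
    },
}
body = json.dumps(initialize_request)
```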

MCP resources

  • cartesi://health — Server name, environment, MCP_BASE_URL, read-only flag, capabilities, content policy
  • cartesi://resources — Catalog: index of resource URIs, tool names, prompts, and suggested agent flow
  • cartesi://resources/{resource_id} — Normalized resource metadata
  • cartesi://docs/{resource_id} — Documentation resource view (same shape; non-doc IDs error)
  • cartesi://docs/routes/{route_id} — Single doc route with parent context
  • cartesi://repositories/{resource_id} — Repository sync / freshness metadata
  • cartesi://collections/tag/{tag} — Resources grouped by tag
  • cartesi://collections/source/{source} — Resources grouped by source
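The URI templates above share a simple segment structure, so a small helper like the following (illustrative, not part of the server) lets an agent split a cartesi:// URI before matching it against a template:

```python
# Illustrative helper: split a cartesi:// URI into path segments.
# urlparse treats the first segment as the netloc because of the "//".
from urllib.parse import urlparse

def cartesi_uri_segments(uri: str) -> list[str]:
    parsed = urlparse(uri)
    if parsed.scheme != "cartesi":
        raise ValueError(f"not a cartesi URI: {uri}")
    return [parsed.netloc, *[p for p in parsed.path.split("/") if p]]
```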

MCP tools (registered names)

These are the name= values clients see (Python handler names may differ).

Knowledge

  • summarize_knowledge_base — coverage, counts, orientation
  • get_knowledge_taxonomy — known tag and source titles
  • search_knowledge_resources — search by query, tag, source, kind
  • get_resource_detail — one resource by ID, optional routes
  • list_resource_doc_routes — routes for a documentation resource
  • search_documentation_routes — search routes across resources
  • list_resources_for_tag / list_resources_for_source
  • get_repository_sync_status
  • build_debugging_context — issue-focused bundle of resources and routes
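Once connected, a client invokes any of these through the standard tools/call method. A sketch of the request body for search_knowledge_resources follows; the argument names here are assumptions inferred from the tool description, not the server's actual input schema.

```python
# Illustrative tools/call body; argument names are assumed, not verified
# against the server's actual input schema.
import json

tool_call = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "search_knowledge_resources",
        "arguments": {"query": "vouchers", "kind": "documentation"},
    },
}
payload = json.dumps(tool_call)
```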

Host-side Cartesi workflow (instructions only)

  • prepare_cartesi_create_command — stable v1.5.x vs alpha v2.0 create guidance
  • prepare_cartesi_build_command
  • prepare_cartesi_run_command
  • send_input_to_application — InputBox + cast templates
  • prepare_erc20_deposit_instructions — ERC20Portal flow
  • prepare_erc721_deposit_instructions — ERC721Portal flow
  • prepare_erc1155_deposit_instructions — ERC1155SinglePortal flow
  • get_cartesi_app_logic_guidance — address-book, portals, vouchers, notices, reports
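To illustrate the "instructions only" contract: a tool like send_input_to_application returns a command template that the user runs locally with Foundry's cast, roughly along these lines. The function below is a hypothetical sketch, not the server's implementation, and <INPUT_BOX_ADDRESS> is a placeholder to fill in from the Cartesi address book.

```python
# Hypothetical sketch of the kind of template these tools return; the
# server never executes it. <INPUT_BOX_ADDRESS> is a placeholder for the
# deployment-specific InputBox contract from the Cartesi address book.
def render_add_input_command(app_address: str, payload_hex: str) -> str:
    return (
        'cast send <INPUT_BOX_ADDRESS> "addInput(address,bytes)" '
        f"{app_address} {payload_hex} "
        "--rpc-url $RPC_URL --private-key $PRIVATE_KEY"
    )
```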

MCP prompts

  • debug_cartesi_issue — structured debugging using curated knowledge
  • find_cartesi_docs — doc route discovery for a topic
  • explain_repository_context — repository resource + status summary
