Backend Architect MCP Server
An expert MCP toolchain designed to act as a Backend Architect for AI agents. This server enforces a strict "Atomic Development" workflow for building Python FastAPI + Supabase backends.
🚀 Overview
The Backend Architect server guides an agent through a Plan -> Prompt -> Write loop, ensuring that database models, API routes, and tests are built in the correct dependency order.
Key Features
- Atomic Development: Focuses on one component at a time.
- Workflow Enforcement: Models → Routes → Tests (respects model dependencies).
- Auto-Imports: Automatically updates `__init__.py` files for models and routes.
- State Persistence: Maintains `.mcp_state.json` to track build progress (see the sketch after this list).
- Contextual Prompts: Generates specialized system prompts for each component.
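To illustrate the kind of state that file might track, here is a minimal Pydantic sketch. The field names and statuses below are hypothetical; the real schema lives in `server.py` and may differ.

```python
# Hypothetical sketch only: the server defines the actual persisted schema.
from typing import Literal

from pydantic import BaseModel


class TaskState(BaseModel):
    """One pending or completed build task (e.g. a model, route, or test)."""
    type: Literal["model", "route", "test"]
    name: str
    status: Literal["pending", "done"] = "pending"


class ProjectState(BaseModel):
    """Shape of the data a file like .mcp_state.json could persist."""
    root_path: str = "."
    tasks: list[TaskState] = []
```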
🛠️ Tech Stack
- Python 3.12
- MCP SDK (FastMCP)
- UV (Dependency Manager)
- Pydantic (State Validation)
📦 Installation
Ensure you have uv installed. Then, clone the repository and install dependencies:
```bash
# Clone the repository
cd mcp_fastapi

# Install dependencies and run the server
uv run server.py
```
🛠️ Tools Reference
1. Initialization
- `initialize_project(root_path: str = ".")`: Scaffolds the FastAPI project structure and `pyproject.toml`. Defaults to the current working directory (a plausible scaffold layout is sketched below).
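Purely as an illustration (the exact scaffold is defined by the server and may differ), the generated structure could resemble:

```
your-backend/
├── pyproject.toml
├── app/
│   ├── main.py
│   ├── models/
│   │   └── __init__.py
│   └── routes/
│       └── __init__.py
└── tests/
```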
2. Planning
- `save_roles_plan(roles: list)`: Define user roles and permissions.
- `save_database_plan(models: list)`: Define SQLModel schemas and relationships.
- `save_route_plan(routes: list)`: Define API endpoints and methods.
- `save_test_plan(tests: list)`: Define simulation scenarios.
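The exact payload schemas are defined by the server; as a rough sketch of the level of detail expected, calls to these planning tools might pass data like the following (all field names here are hypothetical):

```python
# Hypothetical payload shapes for the planning tools; treat as illustration only.
roles = [
    {"name": "admin", "permissions": ["read", "write", "delete"]},
    {"name": "member", "permissions": ["read"]},
]

models = [
    {
        "name": "Task",
        "fields": {"id": "int", "title": "str", "owner_id": "int"},
        "relationships": ["User"],  # built after its dependency, per the workflow order
    },
]

routes = [
    {"path": "/tasks", "method": "GET", "model": "Task", "roles": ["admin", "member"]},
]

tests = [
    {"name": "member_cannot_delete_task", "route": "/tasks", "expect_status": 403},
]
```

An agent would pass these lists as the `roles`, `models`, `routes`, and `tests` arguments of the corresponding `save_*_plan` tools.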
3. Execution
- `get_next_pending_task()`: The "Traffic Cop" that tells you exactly what to build next.
- `get_file_instruction(task_type: str, task_name: str)`: Returns a strict system prompt for the AI to follow.
- `write_component_file(type: str, name: str, content: str)`: Writes the code and marks the task as "done".
🔄 The Loop
- Initialize: Set up your project root.
- Plan: Feed the architect your schemas and endpoints.
- Draft: Ask `get_next_pending_task()` for the current objective.
- Learn: Get instructions via `get_file_instruction()`.
- Write: Submit code via `write_component_file()`.
- Repeat: Until the entire backend is architected (see the agent-side sketch below).
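For concreteness, here is a minimal sketch of this loop from the agent's side, using the official MCP Python SDK's stdio client. The completion check and `generate_code` helper are hypothetical placeholders for your agent's own logic, and the tool response formats are defined by `server.py`.

```python
# Sketch of the Plan -> Prompt -> Write loop, assuming the MCP Python SDK's
# stdio client. Response parsing and code generation are stand-ins for your
# agent's own logic.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


def generate_code(prompt: str) -> str:
    """Hypothetical stand-in for your agent's LLM call."""
    return "# ...code produced by the agent from the prompt..."


async def build_backend() -> None:
    params = StdioServerParameters(command="uv", args=["run", "server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            await session.call_tool("initialize_project", {"root_path": "."})
            # Planning calls (save_roles_plan, save_database_plan, save_route_plan,
            # save_test_plan) would go here before entering the build loop.

            while True:
                task = await session.call_tool("get_next_pending_task", {})
                task_text = task.content[0].text  # response format is server-defined
                if "no pending" in task_text.lower():  # hypothetical completion signal
                    break

                # Illustrative arguments; a real agent would parse them out of task_text.
                prompt = await session.call_tool(
                    "get_file_instruction", {"task_type": "model", "task_name": "User"}
                )
                code = generate_code(prompt.content[0].text)
                await session.call_tool(
                    "write_component_file",
                    {"type": "model", "name": "User", "content": code},
                )


if __name__ == "__main__":
    asyncio.run(build_backend())
```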
⚙️ MCP Configuration
Add this to your MCP settings file (e.g., mcp_config.json or your IDE's MCP settings):
```json
{
  "mcpServers": {
    "backend-architect": {
      "command": "uv",
      "args": [
        "run",
        "--project",
        "/path/to/server/directory",
        "python",
        "server.py"
      ]
    }
  }
}
```
> [!TIP]
> Use the absolute path to the directory where you cloned this repository for the `--project` argument. This ensures the server can find its dependencies regardless of where your AI agent is currently working.
Built with ❤️ for the AI-First Developer.