mcp-base

A bare-bones FastMCP server template designed to serve as a starting point for building custom Model Context Protocol servers. It provides a foundational structure for implementing tools over HTTP and includes a built-in health check utility.


mcp-filesystem-readonly

License: GPL v3 · Python 3.13+

A read-only filesystem FastMCP server. Configure a root directory and let AI assistants browse its contents via MCP tools.

MCP (Model Context Protocol) is an open standard that lets AI assistants call external tools and services. This server implements MCP over HTTP so any MCP-compatible AI application can reach it.


Prerequisites

  • Docker — for the Docker Compose deployment path
  • uv — for the source deployment path (see Installing uv)
  • Node.js — required for the git commit hooks, which use commitlint to enforce Conventional Commits

Customising the Template

1. Copy the template

On GitHub — click Use this template → Create a new repository. This creates a clean copy with no fork relationship and no template history.

Without GitHub — clone, strip the history, and reinitialise:

git clone https://github.com/sesopenko/mcp-base.git my-project
cd my-project
rm -rf .git
git init
git add .
git commit -m "chore: bootstrap from mcp-base template"

2. Customise identity values

Edit project.env to set your own values (Docker image name, package name, project name, description), then run the setup script to substitute them throughout the repository:

bash scripts/apply-project-config.sh

The script is idempotent — safe to run multiple times.
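
The identity values live in project.env as plain shell-style assignments. A hypothetical example — the variable names below are illustrative placeholders, not necessarily the template's actual keys:

```sh
# project.env — illustrative only; check the template for the real variable names
PROJECT_NAME="mcp-filesystem-readonly"
PACKAGE_NAME="mcp_filesystem_readonly"
DOCKER_IMAGE="sesopenko/mcp-filesystem-readonly"
PROJECT_DESCRIPTION="A read-only filesystem FastMCP server."
```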


Quick Start

Option A — Docker Compose

  1. Create a docker-compose.yml:

    services:
      mcp-filesystem-readonly:
        image: sesopenko/mcp-filesystem-readonly:latest
        ports:
          - "8080:8080"
        volumes:
          - ./config.toml:/config/config.toml:ro
          - /mnt/video:/mnt/video:ro
        restart: unless-stopped
    
  2. Copy the example config and edit it:

    cp config.toml.example config.toml
    
  3. Start the server:

    docker compose up -d
    

Option B — Run from Source

  1. Install uv if you haven't already.

  2. Install dependencies:

    uv sync
    
  3. Copy the example config and edit it:

    cp config.toml.example config.toml
    
  4. Start the server:

    uv run python -m mcp_base
    

Security

This server has no authentication on its MCP endpoint. It is designed for LAN use only.

Do not expose this server directly to the internet.

If you need to access it remotely, place it behind a reverse proxy that handles TLS termination and access control. Configuring a reverse proxy is outside the scope of this project.


Configuration

Create a config.toml in the working directory (or pass --config <path>):

[server]
host = "0.0.0.0"
port = 8080

[logging]
level = "info"

[filesystem]
roots = "/mnt/video"

[server]

| Key | Default | Description |
|-----|---------|-------------|
| `host` | `"0.0.0.0"` | Address the MCP server listens on. `0.0.0.0` binds all interfaces. |
| `port` | `8080` | Port the MCP server listens on. |

[logging]

| Key | Default | Description |
|-----|---------|-------------|
| `level` | `"info"` | Log verbosity. One of: `debug`, `info`, `warning`, `error`. |

[filesystem]

| Key | Required | Description |
|-----|----------|-------------|
| `roots` | yes | Comma-separated list of absolute paths exposed via `list_folder`. Only paths within one of these roots can be listed. |

Connecting an AI Application

This server uses the Streamable HTTP MCP transport. Clients communicate via HTTP POST with streaming responses — opening the endpoint in a browser will return a Not Acceptable error, which is expected.

Point your MCP-compatible AI application at the server's MCP endpoint:

http://<host>:<port>/mcp

For example, if the server is running on 192.168.1.10 with the default port:

http://192.168.1.10:8080/mcp

Consult your AI application's documentation for how to register an MCP server. Ensure it supports the Streamable HTTP transport (most modern MCP clients do).
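
As a rough illustration of what a Streamable HTTP client does, its first message is a JSON-RPC 2.0 `initialize` request POSTed to the `/mcp` endpoint, with both `application/json` and `text/event-stream` in the `Accept` header. A sketch — the field values are illustrative, and the MCP specification is the authoritative reference for the handshake:

```python
import json


def mcp_endpoint(host: str, port: int) -> str:
    """Build the server's MCP endpoint URL."""
    return f"http://{host}:{port}/mcp"


def initialize_request(client_name: str) -> dict:
    """A JSON-RPC 2.0 initialize message, the first step of the MCP handshake."""
    return {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "initialize",
        "params": {
            "protocolVersion": "2025-03-26",  # illustrative version string
            "capabilities": {},
            "clientInfo": {"name": client_name, "version": "0.1.0"},
        },
    }


# A client would POST this body to the endpoint with headers:
#   Content-Type: application/json
#   Accept: application/json, text/event-stream
url = mcp_endpoint("192.168.1.10", 8080)
body = json.dumps(initialize_request("example-client"))
```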


Example System Prompt

XML is preferred over markdown for system prompts because explicit named tags give unambiguous semantic meaning — the AI always knows exactly what each block contains. Markdown headings require inference and are more likely to be misinterpreted.

Copy and adapt this prompt to give your AI assistant clear guidance on using the tools.

Tip — let an LLM write this for you. XML-structured system prompts are effective but unfamiliar to most developers and tedious to write by hand. A quick conversation with any capable LLM (describe your tools, what they do, and how you want the assistant to behave) will produce a well-structured prompt you can drop straight in. The results are often better than anything written manually as plain text or markdown.

  • XML tags act like labeled folders — the model knows exactly where each piece of information starts and stops
  • Training data is full of structured markup, so models already "think" in tags naturally
  • Tags prevent the model from confusing your instructions with the content it's working on
<system>
  <role>
    You are a helpful assistant with access to a read-only filesystem MCP server.
    Use the available tools to browse and describe files at the user's request.
  </role>
  <tools>
    <tool name="health_check">Check that the MCP server is running and reachable.</tool>
    <tool name="list_root_paths">Return the configured root directory paths. Call this first to discover the starting points for file listing.</tool>
    <tool name="list_folder">List the contents of a directory. Requires an absolute path within one of the configured roots. Returns name, size_mb, date_created, date_modified, and is_folder for each entry. Pass folders_only=true to list only subdirectories.</tool>
  </tools>
  <guidelines>
    <item>Call health_check if the user asks whether the server is available.</item>
    <item>Call list_root_paths before attempting to list files so you know where to start.</item>
    <item>Use list_folder with a path returned by list_root_paths to browse the filesystem.</item>
    <item>Do not guess paths — only navigate to paths you have discovered through the tools.</item>
  </guidelines>
</system>

Available Tools

| Tool | Description |
|------|-------------|
| `health_check` | Returns `{"status": "ok"}` to confirm the server is running. |
| `list_root_paths` | Returns the configured root directory paths. Call this first to discover where to start listing. |
| `list_folder` | Lists the contents of a directory within one of the configured roots. Returns name, size (MB), dates, and type for each entry. |

Architecture

The template follows a clean three-layer separation:

| File | Purpose |
|------|---------|
| `src/mcp_base/tools.py` | Pure Python functions — one function per tool, no framework coupling |
| `src/mcp_base/server.py` | FastMCP wiring — registers tool functions with `@mcp.tool()` and runs the server |
| `src/mcp_base/config.py` | TOML config loading — typed dataclasses for `[server]`, `[logging]`, and `[filesystem]` sections |
| `src/mcp_base/logging.py` | Structured logger factory |

Adding a tool

  1. Add a function to src/mcp_base/tools.py with a Google-style docstring and full type annotations.
  2. Import the function in src/mcp_base/server.py and register it with @mcp.tool().
  3. Add a unit test in tests/unit/.
  4. Add a row to the Available Tools table in this README.
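
Step 1 might look like the following — a hypothetical pure tool function that describes a filesystem entry, echoing the fields `list_folder` reports. The function name is illustrative, and the actual `tools.py` may differ:

```python
import os
from datetime import datetime, timezone


def folder_entry(path: str) -> dict:
    """Describe a single filesystem entry.

    Args:
        path: Absolute path to an existing file or directory.

    Returns:
        A dict with name, size_mb, date_modified, and is_folder keys.
    """
    st = os.stat(path)
    return {
        "name": os.path.basename(path),
        "size_mb": round(st.st_size / (1024 * 1024), 2),
        "date_modified": datetime.fromtimestamp(
            st.st_mtime, tz=timezone.utc
        ).isoformat(),
        "is_folder": os.path.isdir(path),
    }


# In src/mcp_base/server.py the function would then be registered, e.g.:
#   mcp.tool()(folder_entry)
```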

Running Tests

uv run pytest tests/unit/

Contributing / Maintaining

See MAINTAINERS.md for setup, development commands, AI agent rails, and how to run tests.


License

Copyright (c) Sean Esopenko 2026

This project is licensed under the GNU General Public License v3.0.


Acknowledgement: Riding on the Backs of Giants

This project was built with the assistance of Claude Code, an AI coding assistant developed by Anthropic.

AI assistants like Claude are trained on enormous amounts of data — much of it written by the open-source community: the libraries, tools, documentation, and decades of shared knowledge that developers have contributed freely. Without that foundation, tools like this would not be possible.

In recognition of that debt, this project is released under the GNU General Public License v3.0. The GPL ensures that this code — and any derivative work — remains open source. It is a small act of reciprocity: giving back to the commons that made it possible.

To every developer who ever pushed a commit to a public repo, wrote a Stack Overflow answer, or published a package under an open license — thank you.
