Mock Store MCP Server

Enables AI agents to explore and query a mock e-commerce store's data, including customers, products, inventory, and orders, through conversational interactions backed by PostgreSQL.

MCP Mock Store Example

This repository contains an end-to-end example of a fastMCP server that exposes a mock e-commerce store backed by a FastAPI application and a PostgreSQL database. It demonstrates how to share the same data source between a REST API and Model Context Protocol (MCP) tools so that conversational AI agents can explore store information such as customers, inventory, and orders.

Project layout

.
├── app/                  # FastAPI application (SQLAlchemy models, schemas, CRUD helpers)
├── mcp_server/           # fastMCP server exposing store data as tools
├── sql/                  # SQL scripts for schema and seed data
├── docker-compose.yml    # Local PostgreSQL instance with preloaded data
├── requirements.txt      # Python dependencies for both servers
└── .env.example          # Example environment variables

Prerequisites

  • Python 3.11+
  • Docker and Docker Compose (v2 or newer)
  • pip for installing Python dependencies

1. Start the database

docker compose up -d

The PostgreSQL container mounts the sql/ directory into /docker-entrypoint-initdb.d, so the schema (create_tables.sql) and sample data (seed_data.sql) are loaded automatically the first time the container starts.
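
To confirm that the schema and seed data loaded, you can run a quick row count against the container. The snippet below is a minimal sketch using SQLAlchemy and the default credentials from the Docker setup; the customers table name is an assumption based on the API endpoints, so adjust it to match sql/create_tables.sql.

    import sqlalchemy as sa

    # Default credentials from the local docker-compose.yml setup.
    engine = sa.create_engine(
        "postgresql+psycopg2://mcp_user:mcp_password@localhost:5432/mcp_store"
    )

    with engine.connect() as conn:
        # "customers" is assumed from the REST endpoints; adjust to the real schema.
        count = conn.execute(sa.text("SELECT count(*) FROM customers")).scalar()
        print(f"customers rows: {count}")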

2. Configure environment variables

Copy the example environment file and adjust it if you changed any credentials or hostnames:

cp .env.example .env

Both the FastAPI service and the fastMCP server read the DATABASE_URL environment variable. The default connection string assumes you are running locally with the docker-compose.yml configuration.
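
In code, both services can derive their SQLAlchemy engine from that variable. The sketch below shows the general pattern; the actual module layout in app/ and names such as SessionLocal are illustrative.

    import os

    from sqlalchemy import create_engine
    from sqlalchemy.orm import sessionmaker

    # Fall back to the docker-compose defaults when DATABASE_URL is not set.
    DATABASE_URL = os.getenv(
        "DATABASE_URL",
        "postgresql+psycopg2://mcp_user:mcp_password@localhost:5432/mcp_store",
    )

    engine = create_engine(DATABASE_URL)
    SessionLocal = sessionmaker(bind=engine, autoflush=False)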

3. Install dependencies

python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt

4. Run the FastAPI backend

uvicorn app.main:app --reload

Example endpoints

  • GET /customers – list all customers
  • GET /customers/{id} – retrieve a single customer
  • GET /products – browse available products
  • GET /inventory – inspect inventory levels
  • GET /orders – view orders including nested line items
  • GET /orders/{id} – fetch a specific order
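
As a rough illustration of how such an endpoint can be wired to the shared database, the sketch below defines a self-contained /customers route. The Customer model and its columns are hypothetical stand-ins for the real code in app/ and sql/create_tables.sql.

    import os

    from fastapi import Depends, FastAPI
    from sqlalchemy import Column, Integer, String, create_engine
    from sqlalchemy.orm import Session, declarative_base, sessionmaker

    engine = create_engine(os.environ["DATABASE_URL"])
    SessionLocal = sessionmaker(bind=engine)
    Base = declarative_base()

    class Customer(Base):
        # Hypothetical model; the real columns live in app/ and the SQL scripts.
        __tablename__ = "customers"
        id = Column(Integer, primary_key=True)
        name = Column(String)
        email = Column(String)

    app = FastAPI(title="Mock Store API")

    def get_db():
        # Request-scoped session, closed after the response is sent.
        db = SessionLocal()
        try:
            yield db
        finally:
            db.close()

    @app.get("/customers")
    def list_customers(db: Session = Depends(get_db)):
        # The real app serializes through Pydantic schemas; plain dicts keep the sketch short.
        return [{"id": c.id, "name": c.name, "email": c.email} for c in db.query(Customer).all()]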

5. Run the fastMCP server

python -m mcp_server

The server registers the following tools:

  • list_customers – Returns all customers and their metadata.
  • list_products – Lists available products.
  • list_inventory – Provides current inventory levels.
  • list_orders – Retrieves orders with customer and line item data.
  • get_order – Returns a single order by ID, or an error if missing.
  • get_store_summary – Aggregates counts and high-level metrics.

Each tool responds with JSON derived from the same SQLAlchemy models used by the FastAPI backend, ensuring consistent representations across HTTP and MCP interfaces.
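
For orientation, registering a tool with fastMCP amounts to decorating a plain Python function on a FastMCP instance. The sketch below is illustrative rather than a copy of mcp_server/: the import path is assumed (some setups use mcp.server.fastmcp), and the raw SQL stands in for the shared SQLAlchemy queries.

    import os

    import sqlalchemy as sa
    from fastmcp import FastMCP  # import path assumed; adjust to your installed package

    engine = sa.create_engine(os.environ["DATABASE_URL"])
    mcp = FastMCP("mock-store")

    @mcp.tool()
    def get_store_summary() -> dict:
        """Aggregate high-level counts for the store (illustrative query)."""
        with engine.connect() as conn:
            # Table names assumed from the REST endpoints.
            customers = conn.execute(sa.text("SELECT count(*) FROM customers")).scalar()
            orders = conn.execute(sa.text("SELECT count(*) FROM orders")).scalar()
        return {"customers": customers, "orders": orders}

    if __name__ == "__main__":
        mcp.run()  # stdio transport by default, which is what the chat clients below expect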

6. Connect from popular AI chatbots

Below are quick-start notes for common MCP-compatible clients. Substitute the path to your virtual environment's Python interpreter if different (e.g., .venv/bin/python).

Anthropic Claude Desktop

  1. Open Claude Desktop and navigate to Settings → Configure MCP Servers.
  2. Add a new server with:
    • Command: python
    • Arguments: -m mcp_server
    • Working directory: the root of this repository.
  3. Ensure the DATABASE_URL environment variable is available to Claude (e.g., by launching Claude from a shell session where it is exported).
  4. Claude can now call tools such as get_store_summary during conversations.

Cursor IDE

  1. Open Cursor and run the command Cursor: Configure MCP Servers.
  2. Create an entry with the command python and arguments -m mcp_server.
  3. Optionally specify environment variables via the configuration panel so the MCP server can reach the PostgreSQL instance.
  4. Use the “Connect MCP Server” command to make tools available in the chat sidebar.

VS Code + Continue

  1. Install the Continue extension (version 0.9.0+).

  2. Open the Continue settings (continue.json) and add:

    {
      "servers": [
        {
          "name": "mock-store",
          "command": "python",
          "args": ["-m", "mcp_server"],
          "cwd": "${workspaceFolder}",
          "env": {
            "DATABASE_URL": "postgresql+psycopg2://mcp_user:mcp_password@localhost:5432/mcp_store"
          }
        }
      ]
    }
    
  3. Restart Continue; the mock store tools will appear in the MCP tool palette.

OpenAI Desktop / ChatGPT Desktop (beta MCP support)

  1. Launch the client from a terminal with the virtual environment activated so the MCP server dependencies are available.
  2. In the MCP configuration UI, add a custom server pointing to python -m mcp_server.
  3. Use the UI to map environment variables or rely on your shell environment.

Tip: If a client requires an absolute path to the interpreter, run which python (Linux/macOS) or where python (Windows) inside the virtual environment and paste that path into the MCP configuration.

Database management

  • Reset data: stop the containers (docker compose down), delete the volume (docker volume rm mcp_postgres-data), and start again.

  • Manual migrations: you can rerun the SQL scripts with psql:

    psql postgresql://mcp_user:mcp_password@localhost:5432/mcp_store -f sql/create_tables.sql
    psql postgresql://mcp_user:mcp_password@localhost:5432/mcp_store -f sql/seed_data.sql
    

Testing the MCP tools manually

Once the server is running, you can issue direct requests with the fastmcp client utilities:

python -m fastmcp.client --command "get_store_summary"

Refer to the fastmcp documentation for more advanced usage such as streaming outputs or structured arguments.
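
Alternatively, you can call the tools from a short Python script. This is a sketch that assumes the fastmcp Client API (Client, list_tools, call_tool) and a hypothetical path to the server entry point; consult the installed fastmcp version's documentation for the exact interface.

    import asyncio

    from fastmcp import Client  # client API assumed; see the fastmcp docs for your version

    async def main():
        # Spawns the server over stdio; the script path is a hypothetical entry point.
        async with Client("mcp_server/__main__.py") as client:
            tools = await client.list_tools()
            print("available tools:", [tool.name for tool in tools])

            result = await client.call_tool("get_store_summary")
            print(result)

    asyncio.run(main())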

License

This example is provided under the MIT license. Use it as a starting point for your own MCP-integrated services.
