Agentic Retail MCP System

A full-stack AI application implementing a Model Context Protocol (MCP) server backed by a Firebase Firestore database, controlled through a Streamlit chat interface and powered by OpenAI's gpt-4o.

System Architecture

This project is built on a decoupled microservice pattern:

  1. MCP Backend Server (src/server.py): A fastmcp Python server exposing the retail-specific tools check_inventory, calculate_elasticity, and get_demand_forecast. It connects directly to Google Cloud Firestore.
  2. Streamlit AI Frontend (src/client.py): An asynchronous web UI that communicates with the backend server over persistent Server-Sent Events (SSE). It exposes the server's tools to GPT-4o as OpenAI function calls and executes each call the model requests.
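
The core of an elasticity tool can be sketched as a pure function (a hedged sketch: the midpoint/arc formula and the signature are assumptions, not taken from src/server.py; in the actual server each function would additionally be registered with fastmcp's @mcp.tool() decorator):

```python
def calculate_elasticity(p1: float, q1: float, p2: float, q2: float) -> float:
    """Arc (midpoint) price elasticity of demand between two observed
    (price, quantity) points: % change in quantity sold divided by
    % change in price."""
    pct_dq = (q2 - q1) / ((q1 + q2) / 2)  # midpoint % change in demand
    pct_dp = (p2 - p1) / ((p1 + p2) / 2)  # midpoint % change in price
    return pct_dq / pct_dp

# E < -1 means demand is elastic: a price increase reduces revenue.
print(round(calculate_elasticity(10.0, 100.0, 12.0, 80.0), 2))  # prints -1.22
```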

Prerequisites & Configuration

Before running any scripts, you must configure your .env file at the root of the project.

Create a .env file and define the following variables:

# Your OpenAI API Key for the GPT-4o Agent
OPENAI_API_KEY="sk-proj-YOUR-KEY-HERE"

# The absolute path to your Firebase Service Account JSON key
FIREBASE_CREDENTIALS_PATH="./firebase-key.json"

(If FIREBASE_CREDENTIALS_PATH is left blank, the system falls back to Google Cloud's Application Default Credentials.)

Installation

This project uses uv as its Python package and dependency manager.

Run the following in the root directory to install dependencies (Python >= 3.11 is required):

uv init
uv add mcp pydantic firebase-admin python-dotenv openai streamlit

Database Initialization (Mock Data)

If you have a fresh Firebase Firestore instance, you can use the built-in seeding tools to populate your database with 100 sample retail products.

  1. Generate the static CSV mock data structures:
    uv run python generate_data.py
    
  2. Load the generated CSV data into your Firestore database:
    uv run python seed_db.py
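
The seeding step can be sketched as a CSV-to-document transform (a hedged sketch: the column names product_id, name, price, and stock are assumptions about generate_data.py's output, the products.csv filename is hypothetical, and the Firestore write is shown only in comments):

```python
import csv
import io

def docs_from_csv(csv_text: str) -> dict[str, dict]:
    """Parse the generated CSV into Firestore-ready documents keyed
    by product ID, coercing numeric fields from their string form."""
    docs = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        docs[row["product_id"]] = {
            "name": row["name"],
            "price": float(row["price"]),
            "stock": int(row["stock"]),
        }
    return docs

# Write sketch (assumes an initialized firebase_admin app):
# from firebase_admin import firestore
# db = firestore.client()
# for pid, doc in docs_from_csv(open("products.csv").read()).items():
#     db.collection("products").document(pid).set(doc)
```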
    

Running the Application

Because the web interface and the tool server communicate over the local loopback network, you must run both processes in parallel, in two separate terminal tabs.

1. Boot up the Backend Tool Server

This starts an ASGI Uvicorn app on port 8000 that handles all tool-execution requests over HTTP Server-Sent Events.

uv run python src/server.py

2. Boot up the Streamlit Client

In a new terminal window, start the Streamlit conversational agent:

uv run streamlit run src/client.py 

A web browser tab will automatically open at http://localhost:8501. You can now ask questions like "What is the stock level of product P1002?" or "Analyze pricing elasticity projections for P1100".
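
The client's tool-call loop can be sketched independently of the network details (a hedged sketch: run_agent, chat, and call_tool are hypothetical names — in src/client.py, chat would wrap the OpenAI chat-completions request and call_tool would forward the request to the MCP server over the SSE session):

```python
import json

def run_agent(chat, call_tool, messages, max_turns=5):
    """Loop until the model answers in plain text: each batch of tool
    calls is executed and fed back as `role: tool` messages."""
    for _ in range(max_turns):
        reply = chat(messages)                 # assistant message (dict)
        messages.append(reply)
        tool_calls = reply.get("tool_calls") or []
        if not tool_calls:
            return reply["content"]            # final answer for the UI
        for tc in tool_calls:
            result = call_tool(tc["name"], json.loads(tc["arguments"]))
            messages.append({"role": "tool", "tool_call_id": tc["id"],
                             "content": json.dumps(result)})
    return None  # give up after max_turns round-trips
```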

Cloud Deployments (Google Cloud Run)

To move the AI agent and backend infrastructure to the cloud, the repository includes Dockerfiles designed for serverless deployment on Google Cloud Run.

Refer to DEPLOYMENT.md for instructions on building and deploying both containers.
