VTION E-Commerce MCP Server
A Model Context Protocol (MCP) server providing secure, read-only access to VTION e-commerce analytics data. Built with FastAPI and PostgreSQL, supporting both MCP native protocol and REST API.
Features
- MCP Protocol Support: Full implementation of Model Context Protocol for AI agent integration
- Multiple Transport Modes:
- FastMCP (stdio) for direct MCP client integration
- HTTP/SSE for web-based clients
- REST API for traditional HTTP clients
- Secure by Design: Read-only access, query validation, connection pooling
- Progressive Context Loading: Efficient data discovery with 4 context levels
- Parallel Query Execution: Multiple queries execute concurrently for optimal performance
- Auto-limiting: Raw queries limited to 5 rows, aggregated queries to 1,000 rows
- Rich Query Tools: Schema inspection, sample data, flexible querying
Architecture
VTION-ECOM/
├── vtion_ecom_mcp.py # Main MCP server with FastMCP
├── server.py # Standalone HTTP/SSE server
├── requirements.txt # Python dependencies
├── .env.example # Configuration template
├── .gitignore # Git ignore rules
└── README.md # This file
Quick Start
1. Installation
# Clone the repository
git clone <your-repo-url>
cd VTION-ECOM
# Install dependencies
pip install -r requirements.txt
2. Configuration
# Copy environment template
cp .env.example .env
# Edit .env with your database credentials
nano .env
Required Environment Variables:
DATASET_1_NAME=vtion_ecom
DATASET_1_DESC=VTION E-commerce platform analytics data
DATASET_1_CONNECTION=postgresql://postgres:PASSWORD@host:port/db?sslmode=require
DATASET_1_DICTIONARY={"table1":"desc","table2":"desc"}
3. Run the Server
Option A: FastMCP Mode (for MCP clients)
python vtion_ecom_mcp.py
Option B: HTTP/SSE Mode (for web clients)
python server.py
# Server runs on http://localhost:10000
Option C: Production Deployment
uvicorn server:app --host 0.0.0.0 --port 10000 --workers 4
Database Configuration
The MCP server connects to your Supabase PostgreSQL database. The connection string is already configured in .env.example:
postgresql://postgres:Vtion%402023%23@db.yjiotntmzaukbmgxeqvq.supabase.co:5432/postgres?sslmode=require
Important: The password is URL-encoded (Vtion@2023# → Vtion%402023%23)
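The encoding shown above can be reproduced with Python's standard library, which is handy when rotating credentials:

```python
from urllib.parse import quote

# URL-encode the password so "@" and "#" survive inside the connection string.
# safe="" forces every reserved character to be percent-encoded.
password = quote("Vtion@2023#", safe="")
print(password)  # Vtion%402023%23
```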
Expected Schema
The server works with any PostgreSQL schema. Common e-commerce tables include:
- products - Product catalog with inventory
- orders - Order history and transactions
- customers - Customer profiles and demographics
- cart_items - Shopping cart data
- user_sessions - User engagement metrics
The server will automatically discover your schema at runtime.
Usage
MCP Tools
The server provides 5 MCP tools:
1. get_context(level, dataset_id?)
Progressive context loading:
- Level 0: Global rules and guidelines
- Level 1: List all datasets
- Level 2: Schema for specific dataset (requires dataset_id)
- Level 3: Full details with sample data (requires dataset_id)
# Get global rules
get_context(level=0)
# List all datasets
get_context(level=1)
# Get schema for dataset 1
get_context(level=2, dataset_id=1)
# Get full details with samples
get_context(level=3, dataset_id=1)
2. list_available_datasets()
List all configured datasets with metadata.
list_available_datasets()
3. get_dataset_schema(dataset_id)
Get complete schema for a dataset (equivalent to get_context(level=2)).
get_dataset_schema(dataset_id=1)
4. query_dataset(dataset_id, query, response_format?)
Execute SQL SELECT queries on a dataset.
# Simple query
query_dataset(
dataset_id=1,
query="SELECT * FROM products WHERE category = 'Electronics' LIMIT 10"
)
# Aggregated query
query_dataset(
dataset_id=1,
query="SELECT category, COUNT(*) as count, AVG(price) as avg_price FROM products GROUP BY category"
)
# JSON response format
query_dataset(
dataset_id=1,
query="SELECT * FROM orders WHERE status = 'completed'",
response_format="json"
)
Parallel Execution: Call query_dataset() multiple times; the calls execute in parallel automatically.
# These three queries execute concurrently:
query_dataset(1, "SELECT category, COUNT(*) FROM products GROUP BY category")
query_dataset(1, "SELECT status, COUNT(*) FROM orders GROUP BY status")
query_dataset(1, "SELECT gender, COUNT(*) FROM customers GROUP BY gender")
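The fan-out behind this can be pictured with asyncio.gather, which awaits independent coroutines concurrently. This is a sketch only; run_query below is a stand-in for the real asyncpg call, which this README does not show:

```python
import asyncio

# Stand-in for the server's real database call (hypothetical helper).
async def run_query(sql):
    await asyncio.sleep(0.1)  # simulate database latency
    return f"rows for: {sql}"

async def main():
    queries = [
        "SELECT category, COUNT(*) FROM products GROUP BY category",
        "SELECT status, COUNT(*) FROM orders GROUP BY status",
        "SELECT gender, COUNT(*) FROM customers GROUP BY gender",
    ]
    # gather() runs all three concurrently: total wall time is ~0.1s, not ~0.3s
    return await asyncio.gather(*(run_query(q) for q in queries))

results = asyncio.run(main())
print(len(results))  # 3
```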
5. get_dataset_sample(dataset_id, table_name, limit?)
Get sample rows from a specific table.
get_dataset_sample(
dataset_id=1,
table_name="products",
limit=20
)
REST API Endpoints
When running server.py, these HTTP endpoints are available:
Health Check
curl http://localhost:10000/
# or
curl http://localhost:10000/health
Response:
{
"status": "ok",
"service": "VTION E-Commerce MCP Server",
"datasets": 1,
"version": "1.0",
"mcp_endpoint": "/mcp",
"mcp_protocol_version": "2025-06-18"
}
List Datasets
curl http://localhost:10000/datasets
Execute Query
curl -X POST http://localhost:10000/query \
-H "Content-Type: application/json" \
-d '{
"dataset_id": 1,
"query": "SELECT * FROM products LIMIT 5"
}'
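The same request can be issued from Python with only the standard library. The base URL is an assumption; adjust it to wherever server.py is listening:

```python
import json
from urllib.request import Request, urlopen

BASE_URL = "http://localhost:10000"  # assumption: server.py running locally
payload = {"dataset_id": 1, "query": "SELECT * FROM products LIMIT 5"}

req = Request(
    f"{BASE_URL}/query",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# Uncomment once the server is running:
# with urlopen(req) as resp:
#     print(json.loads(resp.read()))
```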
MCP Protocol Endpoint
POST /mcp
Implements full MCP protocol over HTTP with JSON-RPC 2.0.
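Assuming the server follows the standard MCP method names (initialize, tools/list, tools/call), a request body for calling query_dataset through this endpoint would look like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "query_dataset",
    "arguments": {
      "dataset_id": 1,
      "query": "SELECT COUNT(*) FROM orders"
    }
  }
}
```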
Security
Query Restrictions
- Only SELECT allowed: INSERT, UPDATE, DELETE, DROP, etc. are blocked
- Automatic limits: Raw queries max 5 rows, aggregated queries max 1,000 rows
- Connection pooling: Prevents resource exhaustion
- Timeout protection: 60-second query timeout
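The guard described above can be sketched as follows. This is a hypothetical illustration of the pattern; the server's actual validator in vtion_ecom_mcp.py may differ in detail:

```python
import re

# Keywords rejected anywhere in the statement (word-boundary match, so
# column names like "updated_at" are not false positives).
BLOCKED = ("insert", "update", "delete", "drop", "alter", "truncate", "grant")

def validate_query(sql):
    normalized = sql.strip().lower()
    if not normalized.startswith("select"):
        raise ValueError("Only SELECT statements are allowed")
    if any(re.search(rf"\b{kw}\b", normalized) for kw in BLOCKED):
        raise ValueError("Query contains a blocked keyword")

def apply_limit(sql, is_aggregated):
    # Auto-limit: 1,000 rows for aggregated queries, 5 for raw ones;
    # an explicit LIMIT in the query is left untouched.
    cap = 1000 if is_aggregated else 5
    if re.search(r"\blimit\b", sql, re.IGNORECASE):
        return sql
    return f"{sql.rstrip().rstrip(';')} LIMIT {cap}"
```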
Authentication
⚠️ Important: This server does not include authentication. For production:
- Add authentication middleware (JWT, API keys, OAuth)
- Use environment-specific credentials
- Enable database row-level security (RLS)
- Run behind a reverse proxy (nginx, Cloudflare)
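A minimal API-key check, as a starting point, could be wired into server.py as a FastAPI dependency or middleware. This sketch uses only the standard library; MCP_API_KEY is a hypothetical variable name:

```python
import hmac
import os

def is_authorized(presented_key):
    # Hypothetical env var holding the expected key.
    expected = os.environ.get("MCP_API_KEY", "")
    # compare_digest avoids leaking information through comparison timing;
    # an unset/empty key means nobody is authorized.
    return bool(expected) and hmac.compare_digest(presented_key, expected)
```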
Development
Testing Connection
# Test database connectivity
python -c "
import asyncio
import asyncpg

async def test():
    conn = await asyncpg.connect('postgresql://...')
    print('Connected!')
    tables = await conn.fetch('SELECT table_name FROM information_schema.tables WHERE table_schema = \\'public\\'')
    print('Tables:', [t['table_name'] for t in tables])
    await conn.close()

asyncio.run(test())
"
Adding Multiple Datasets
Edit .env to add more datasets:
# Dataset 1
DATASET_1_NAME=vtion_ecom
DATASET_1_CONNECTION=postgresql://...
DATASET_1_DESC=Main e-commerce data
DATASET_1_DICTIONARY={"products":"Product catalog"}
# Dataset 2
DATASET_2_NAME=analytics
DATASET_2_CONNECTION=postgresql://...
DATASET_2_DESC=Analytics warehouse
DATASET_2_DICTIONARY={"events":"User events"}
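Numbered DATASET_N_* variables like these can be discovered with a simple loop that stops at the first missing index. This is a hypothetical sketch of the pattern; the real loader lives in vtion_ecom_mcp.py and may differ:

```python
import json
import os

def load_datasets():
    datasets = {}
    i = 1
    # Walk DATASET_1_*, DATASET_2_*, ... until a NAME is missing.
    while os.environ.get(f"DATASET_{i}_NAME"):
        datasets[i] = {
            "name": os.environ[f"DATASET_{i}_NAME"],
            "connection": os.environ.get(f"DATASET_{i}_CONNECTION", ""),
            "description": os.environ.get(f"DATASET_{i}_DESC", ""),
            "dictionary": json.loads(os.environ.get(f"DATASET_{i}_DICTIONARY", "{}")),
        }
        i += 1
    return datasets
```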
Customizing Business Logic
The server inherits business logic from indian-analytics-mcp:
- Query validation: modify query_dataset() in vtion_ecom_mcp.py
- Response formatting: update format_markdown_table() helper
- Add custom tools: use the @mcp.tool() decorator
- Schema customization: edit DATASET_1_DICTIONARY in .env
Deployment
Render
- Create new Web Service
- Connect GitHub repository
- Set build command: pip install -r requirements.txt
- Set start command: python server.py
- Add environment variables from .env
Docker
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
ENV PORT=10000
EXPOSE 10000
CMD ["python", "server.py"]
docker build -t vtion-mcp .
docker run -p 10000:10000 --env-file .env vtion-mcp
Railway / Fly.io
Both support automatic deployment from GitHub with environment variables.
Troubleshooting
Connection Issues
# Test database connection
psql "postgresql://postgres:Vtion%402023%23@db.yjiotntmzaukbmgxeqvq.supabase.co:5432/postgres?sslmode=require"
No Datasets Found
Check environment variables are set:
env | grep DATASET_
Query Errors
- Verify table names with get_dataset_schema()
- Check column names match the schema
- Ensure the query is a valid SQL SELECT statement
Import Errors
pip install --upgrade -r requirements.txt
Credits
Based on indian-analytics-mcp by @adityac7.
License
MIT License - see LICENSE file for details
Support
For issues and questions:
- GitHub Issues: <your-repo-url>/issues
- Email: support@vtion.com