Enterprise MCP Template
A production-ready template for building enterprise-grade MCP (Model Context Protocol) servers with OAuth 2.0 authentication, based on battle-tested patterns from the Luxsant NetSuite MCP project.
What is MCP? MCP is a standard protocol that lets AI assistants (Claude, Copilot, etc.) call "tools" (functions) on remote servers. Think of it as a standardized API that AI models know how to use.
Table of Contents
- Quick Start
- Architecture Overview
- Project Structure
- How to Create a New MCP Server
- OAuth 2.0 Authentication Deep Dive
- Libraries & Dependencies
- Configuration System
- MCP Tools Pattern
- API Client Pattern
- Token Management
- Exception Hierarchy
- Deployment Guide
- Testing
- Best Practices & Gotchas
- Troubleshooting
Quick Start
1. Clone and rename
git clone https://github.com/YOUR_USER/enterprise-mcp-template.git my-cool-mcp
cd my-cool-mcp
2. Rename the package
# Rename the source directory
mv src/my_mcp_server src/my_cool_mcp
# Find and replace all occurrences:
# "my_mcp_server" -> "my_cool_mcp"
# "my-mcp-server" -> "my-cool-mcp"
# "{{PROJECT_NAME}}" -> "My Cool MCP"
# "{{AUTHOR}}" -> "Your Name"
3. Configure environment
cp .env.example .env
# Edit .env with your upstream API credentials
4. Install and run
# Create virtual environment
python -m venv venv
source venv/bin/activate # Linux/Mac
# or: venv\Scripts\activate # Windows
# Install dependencies
pip install -e ".[dev]"
# Run locally (stdio mode for Claude Desktop)
python -m my_cool_mcp
# Run as HTTP server
python -m my_cool_mcp http
# Run tests
pytest
5. Deploy
# Docker build
docker compose up --build
# Or deploy to Azure Web App
az webapp up --name my-cool-mcp --runtime PYTHON:3.11
Architecture Overview
AI Client (Claude Desktop / VS Code / Custom)
|
| MCP Protocol (stdio / SSE / HTTP)
|
+---v----------------------------------------------+
| MCP Server (server.py) |
| +--------------------------------------------+ |
| | OAuth 2.0 Proxy (OAuthProxy) | |
| | - Handles user authentication | |
| | - Manages proxy tokens | |
| | - Token exchange with upstream | |
| +--------------------------------------------+ |
| +--------------------------------------------+ |
| | MCP Tools (@mcp.tool() functions) | |
| | - create_record() | |
| | - get_record() | |
| | - update_record() | |
| | - delete_record() | |
| | - execute_query() | |
| +--------------------------------------------+ |
| +--------------------------------------------+ |
| | HTTP Routes (/health, /debug/*) | |
| +--------------------------------------------+ |
+--------------------------------------------------+
|
| HTTPS + Bearer Token
|
+---v----------------------------------------------+
| API Client (api_client.py) |
| - HTTP requests with retry logic |
| - Response parsing |
| - Error handling |
+--------------------------------------------------+
|
| REST API calls
|
+---v----------------------------------------------+
| Upstream Service (NetSuite, Salesforce, etc.) |
+--------------------------------------------------+
Module Dependency Flow
__main__.py / wsgi.py
-> server.py (main server, tools, OAuth, routes)
-> api_client.py (HTTP client for upstream API)
-> config.py (environment configuration)
-> models.py (Pydantic data models)
-> exceptions.py (error hierarchy)
-> auth.py (token caching & refresh)
-> config.py
-> exceptions.py
-> utils.py (logging, sanitization, helpers)
Project Structure
enterprise-mcp-template/
|-- .env.example # Environment variable template
|-- .gitignore # Git ignore rules
|-- docker-compose.yml # Docker Compose for local dev
|-- Dockerfile # Multi-stage production Docker build
|-- LICENSE # MIT License
|-- main.py # Root smoke test (not the entry point)
|-- pyproject.toml # Python project configuration
|-- README.md # This file
|-- CLAUDE.md # AI agent instructions
|-- requirements.txt # Production dependencies
|-- startup.sh # Azure Web App startup script
|
|-- docs/ # Documentation
| |-- guide.pdf # PDF version of this guide
|
|-- samples/ # Example payloads
| |-- example_payload.json # Sample API request payload
|
|-- src/
| |-- my_mcp_server/ # Main package (RENAME THIS)
| |-- __init__.py # Package init with lazy imports
| |-- __main__.py # CLI entry point (python -m my_mcp_server)
| |-- server.py # *** MAIN FILE *** MCP server + tools + OAuth
| |-- api_client.py # HTTP client for upstream API
| |-- auth.py # Token management (LRU cache + refresh)
| |-- config.py # Environment-based configuration
| |-- models.py # Pydantic data models
| |-- exceptions.py # Exception hierarchy
| |-- utils.py # Utility functions
| |-- wsgi.py # ASGI entry point for production
| |-- static/
| |-- index.html # Browser-friendly status page
|
|-- tests/ # Test suite
|-- __init__.py
|-- test_config.py # Config tests
|-- test_models.py # Model tests
|-- test_auth.py # Auth/token tests
How to Create a New MCP Server
Step 1: Global Find & Replace
| Find | Replace With | Example |
|---|---|---|
| `my_mcp_server` | Your package name (snake_case) | `salesforce_mcp` |
| `my-mcp-server` | Your package name (kebab-case) | `salesforce-mcp` |
| `{{PROJECT_NAME}}` | Display name | Salesforce MCP Enterprise |
| `{{AUTHOR}}` | Your name/org | El Paso Labs |
| `UPSTREAM_` | Your service prefix | `SALESFORCE_` |
| `example.com` | Your API domain | `salesforce.com` |
Step 2: Update OAuth Endpoints (server.py)
In _build_auth_provider(), update:
# BEFORE (template):
auth_endpoint = f"https://{account_id}.app.example.com/oauth2/authorize"
token_endpoint = f"https://{account_id}.api.example.com/oauth2/token"
api_scopes = ["api_access"]
# AFTER (example for NetSuite):
auth_endpoint = f"https://{account_id}.app.netsuite.com/app/login/oauth2/authorize.nl"
token_endpoint = f"https://{account_id}.suitetalk.api.netsuite.com/services/rest/auth/oauth2/v1/token"
api_scopes = ["rest_webservices"]
Step 3: Update API URL Patterns (config.py, api_client.py)
In config.py UpstreamAPIConfig.build_api_base_url():
# BEFORE:
return f"https://{self.account_id}.api.example.com/v1"
# AFTER (NetSuite):
return f"https://{self.account_id}.suitetalk.api.netsuite.com/services/rest/record/v1"
Step 4: Define Your MCP Tools (server.py)
Replace the generic CRUD tools with domain-specific ones:
@mcp.tool()
async def create_customer(
customer_data: Dict[str, Any],
account_id: Optional[str] = None,
) -> Dict[str, Any]:
"""
Create a new customer in Salesforce.
Args:
customer_data: Customer fields (Name, Email, Phone, etc.)
account_id: Salesforce org ID
Returns:
Structured response with the created customer's ID.
"""
token = _get_oauth_token()
async with _get_client(account_id=account_id) as client:
response = await client.create_record(
access_token=token,
record_type="customer",
payload=customer_data,
)
return _serialize_response(response)
Step 5: Update Models (models.py)
Replace example models with your domain entities:
class CustomerPayload(BaseModel):
name: str = Field(..., description="Customer name")
email: Optional[str] = Field(default=None)
phone: Optional[str] = Field(default=None)
# ... your fields
Step 6: Test and Deploy
# Run tests
pytest
# Local HTTP test
python -m your_package http
# Visit http://localhost:8000/health
# Docker
docker compose up --build
OAuth 2.0 Authentication Deep Dive
How OAuth Works in This Template
1. AI Client connects to MCP server
|
2. MCP server redirects user to upstream login page
| (via OAuthProxy)
|
3. User logs in at upstream service (NetSuite, Salesforce, etc.)
|
4. Upstream redirects back with authorization code
| -> https://your-server.com/auth/callback?code=ABC123
|
5. OAuthProxy exchanges code for access token (server-to-server)
| POST to token endpoint with client_id + client_secret
|
6. OAuthProxy stores the real token, gives client a proxy token
|
7. Client sends proxy token with each MCP tool call
|
8. OAuthProxy looks up real token, passes to tool function
|
9. Tool function uses real token to call upstream API
Critical OAuth Configuration
auth = OAuthProxy(
# WHERE users log in
upstream_authorization_endpoint=auth_endpoint,
# WHERE we exchange codes for tokens
upstream_token_endpoint=token_endpoint,
# OUR app's credentials
upstream_client_id=client_id,
upstream_client_secret=client_secret,
# HOW we verify proxy tokens
token_verifier=token_verifier,
# PUBLIC URL for callbacks
base_url=base_url,
# HOW we send credentials to token endpoint
# "client_secret_basic" = Authorization header (most APIs)
# "client_secret_post" = POST body parameters
token_endpoint_auth_method="client_secret_basic",
# PKCE handling - CRITICAL!
# Set to False if upstream handles PKCE with browser directly
# Set to True if you need to forward PKCE params
forward_pkce=False,
# OAuth scopes
valid_scopes=api_scopes,
# Accept any MCP client redirect URI
allowed_client_redirect_uris=None,
# Sign proxy JWTs with a stable key (set MCP_JWT_SIGNING_KEY in prod!)
jwt_signing_key=jwt_signing_key,
# Skip our consent screen (upstream has its own)
require_authorization_consent=False,
# In-memory client storage (resets on restart - intentional)
client_storage=client_storage,
)
OAuth Gotchas (Lessons Learned)
- `forward_pkce=False`: If your upstream API handles PKCE between itself and the browser, do NOT forward your own PKCE parameters. Your server's `code_verifier` won't match the browser's `code_challenge`, causing `invalid_grant` errors.
- `required_scopes` on `DebugTokenVerifier`: Without this, clients registered via DCR get `scope=""` and ALL scope requests are rejected with `invalid_scope` before reaching the upstream.
- `MCP_JWT_SIGNING_KEY`: Without a stable key, the OAuthProxy generates a random key on each startup. Container restarts invalidate ALL proxy tokens. Always set it in production.
- `MemoryStore` for client storage: Resets on restart. This is actually GOOD - it prevents stale client registrations from previous deployments.
- `token_endpoint_auth_method`: Test both "client_secret_basic" and "client_secret_post" using the `/debug/token-test` endpoint. The wrong method gives `invalid_client` instead of `invalid_grant`.
Libraries & Dependencies
| Library | Version | Purpose | Why This Library |
|---|---|---|---|
| fastmcp | >=3.0.0b2 | MCP framework | Only production-grade MCP framework. Handles protocol, OAuth, transport. |
| httpx | >=0.27.0 | HTTP client | Async HTTP client with connection pooling. Superior to requests for async. |
| pydantic | >=2.0.0 | Data validation | Industry standard. Auto-validation, serialization, IDE support. |
| pydantic-settings | >=2.1.0 | Settings management | Pydantic extension for env var parsing. |
| python-dotenv | >=1.0.0 | .env file loading | Loads .env files for local development. |
| loguru | >=0.7.2 | Logging | Enhanced logging (optional, can use stdlib). |
| gunicorn | >=21.2.0 | Process manager | Production WSGI/ASGI server. Multi-worker, graceful restarts. |
| uvicorn | >=0.27.0 | ASGI server | High-performance async HTTP server. Used as gunicorn worker class. |
Why FastMCP 3.0?
FastMCP 3.0 is the only production-grade MCP framework available. Key features:
- Native `host`/`port` support in `.run()`
- Built-in `OAuthProxy` for OAuth 2.0 authentication
- `DebugTokenVerifier` for development/testing
- `get_access_token()` dependency injection
- Support for three transports: stdio, SSE, HTTP
- `@mcp.tool()` decorator for registering tools
- `@mcp.custom_route()` for HTTP endpoints
- Stateless HTTP mode for cloud load balancers
Why httpx over requests?
- Async support: `httpx.AsyncClient` works natively with `async`/`await`
- Connection pooling: Reuses TCP connections automatically
- Timeout control: Granular timeout settings per request
- HTTP/2 support: Optional HTTP/2 for better performance
- requests-compatible API: Easy to migrate from `requests`
Configuration System
All configuration uses environment variables following the 12-Factor App methodology.
Configuration Hierarchy
AppConfig
├── UpstreamAPIConfig (API connection: URL, credentials, timeouts)
├── TokenStoreConfig (Token caching: LRU size, expiry buffer)
└── ServerConfig (Server: name, transport, host, port)
Key Environment Variables
| Variable | Required | Default | Description |
|---|---|---|---|
| `UPSTREAM_ACCOUNT_ID` | Yes* | - | Account/tenant identifier |
| `UPSTREAM_OAUTH_CLIENT_ID` | Yes* | - | OAuth client ID |
| `UPSTREAM_OAUTH_CLIENT_SECRET` | Yes* | - | OAuth client secret |
| `MCP_SERVER_BASE_URL` | Yes* | - | Public URL for OAuth callbacks |
| `MCP_TRANSPORT` | No | `stdio` | Transport: stdio/sse/http |
| `MCP_PORT` | No | `8000` | Server port |
| `MCP_HOST` | No | `0.0.0.0` | Server host binding |
| `TOKEN_CACHE_ENABLED` | No | `true` | Enable token LRU cache |
| `TOKEN_EXPIRY_BUFFER_SECS` | No | `300` | Refresh buffer (seconds) |
| `MCP_JWT_SIGNING_KEY` | No | random | Stable JWT key for production |
| `LOG_LEVEL` | No | `INFO` | DEBUG/INFO/WARNING/ERROR |
| `DEBUG` | No | `false` | Enable debug mode |
*Required for OAuth authentication. The server runs without authentication if these are missing.
Singleton Pattern
from config import get_config, set_config, reset_config
# Normal usage (reads env vars once, caches globally)
config = get_config()
base_url = config.upstream.build_api_base_url()
# Testing (override with custom config)
set_config(AppConfig(server=ServerConfig(port=9999)))
# Reset (force re-read from env)
reset_config()
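A minimal sketch of how such a singleton can be wired, using stdlib dataclasses rather than the template's pydantic-settings classes (names mirror the API above, but the implementation here is illustrative):

```python
import os
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class ServerConfig:
    port: int = 8000


@dataclass
class AppConfig:
    server: ServerConfig = field(default_factory=ServerConfig)


_config: Optional[AppConfig] = None


def get_config() -> AppConfig:
    """Read env vars on first call, then return the cached instance."""
    global _config
    if _config is None:
        _config = AppConfig(server=ServerConfig(port=int(os.getenv("MCP_PORT", "8000"))))
    return _config


def set_config(config: AppConfig) -> None:
    """Override the singleton (useful in tests)."""
    global _config
    _config = config


def reset_config() -> None:
    """Drop the cached instance so the next get_config() re-reads the environment."""
    global _config
    _config = None
```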
MCP Tools Pattern
Every MCP tool follows this exact pattern:
@mcp.tool()
async def my_tool(
required_param: str,
optional_param: Optional[str] = None,
account_id: Optional[str] = None,
base_url: Optional[str] = None,
) -> Dict[str, Any]:
"""
Tool description (AI reads this to decide when to use the tool).
Args:
required_param: Description for AI
optional_param: Description for AI
account_id: Account ID (if not preconfigured)
base_url: Override API URL
Returns:
Structured response dict with ok, status_code, data, errors.
"""
# 1. Get OAuth token from MCP session
token = _get_oauth_token()
# 2. Create API client (async context manager for cleanup)
async with _get_client(base_url, account_id) as client:
# 3. Call the appropriate client method
response = await client.some_method(
access_token=token,
...
)
# 4. Serialize and return
return _serialize_response(response)
Rules for MCP Tools
- Return simple Python objects (dict, list, str, number). They're serialized to JSON.
- Docstrings matter: AI reads them to decide when/how to use the tool.
- Parameter types matter: FastMCP generates JSON Schema from type hints.
- Always use `_serialize_response()`: Provides consistent response format.
- Always use `async with`: Ensures HTTP client cleanup on error.
- Add `account_id` and `base_url` params: Lets AI clients specify targets dynamically.
API Client Pattern
The API client (api_client.py) handles all HTTP communication:
async with APIClient(base_url="https://api.example.com/v1") as client:
# Generic CRUD
response = await client.create_record(token, "customer", payload)
response = await client.get_record(token, "customer", "123")
response = await client.update_record(token, "customer", "123", updates)
response = await client.delete_record(token, "customer", "123")
# Query (if your API supports it)
response = await client.execute_query(token, "SELECT * FROM Customer")
Retry Logic
Attempt 1: Immediate
Attempt 2: Wait 0.5s (backoff_factor * 2^0)
Attempt 3: Wait 1.0s (backoff_factor * 2^1)
Attempt 4: Wait 2.0s (backoff_factor * 2^2)
Retries on: 429, 500, 502, 503, 504, timeouts, connection errors.
Does NOT retry: 400, 401, 403, 404.
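The retry schedule above can be sketched as follows. This is an illustrative loop, not the template's `api_client.py`; the `send` callable and its `.status_code` attribute are assumptions standing in for an `httpx` request:

```python
import asyncio

RETRYABLE_STATUS = {429, 500, 502, 503, 504}


def backoff_delay(attempt: int, backoff_factor: float = 0.5) -> float:
    """Delay before retry N (attempt 0 = first retry): backoff_factor * 2**attempt."""
    return backoff_factor * (2 ** attempt)


async def request_with_retries(send, max_retries: int = 3, backoff_factor: float = 0.5):
    """Retry loop: retries 429/5xx, timeouts, and connection errors;
    returns immediately on success or a non-retryable 4xx."""
    response = None
    for attempt in range(max_retries + 1):
        try:
            response = await send()
        except (asyncio.TimeoutError, ConnectionError):
            response = None  # treat timeouts/connection errors as retryable
        if response is not None and response.status_code not in RETRYABLE_STATUS:
            return response
        if attempt < max_retries:
            await asyncio.sleep(backoff_delay(attempt, backoff_factor))
    return response  # may still be an error (or None) after exhausting retries
```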
Token Management
LRU Token Cache
Token Cache (max 100 entries)
+---------+------------------+-----------+
| Key | Token | Expires |
+---------+------------------+-----------+
| sha256 | eyJhbG... | 1hr | <- Most recently used
| sha256 | eyJxyz... | 45min |
| sha256 | eyJabc... | 30min |
| ... | ... | ... |
| sha256 | eyJold... | 10min | <- Least recently used (evicted first)
+---------+------------------+-----------+
Token Lifecycle
1. User authenticates -> access_token + refresh_token
2. Token cached with SHA-256 key
3. On each API call: check if cached token is still valid
4. If expired (with 5-min buffer): attempt refresh
5. If refresh succeeds: cache new token
6. If refresh fails: user must re-authenticate
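The cache and lifecycle above combine into an expiry-aware LRU. A minimal sketch (class and method names are assumptions, not the template's `auth.py` API) using `OrderedDict` for recency ordering:

```python
import hashlib
import time
from collections import OrderedDict
from typing import Optional


class TokenCache:
    """LRU token cache with an expiry buffer, as described above."""

    def __init__(self, max_size: int = 100, expiry_buffer_secs: int = 300):
        self.max_size = max_size
        self.expiry_buffer_secs = expiry_buffer_secs
        self._cache: "OrderedDict[str, tuple]" = OrderedDict()

    @staticmethod
    def _key(account_id: str, client_id: str) -> str:
        # SHA-256 of the identifying fields, so raw credentials never appear as keys
        return hashlib.sha256(f"{account_id}:{client_id}".encode()).hexdigest()

    def put(self, account_id: str, client_id: str, token: str, expires_at: float) -> None:
        key = self._key(account_id, client_id)
        self._cache[key] = (token, expires_at)
        self._cache.move_to_end(key)              # mark as most recently used
        if len(self._cache) > self.max_size:
            self._cache.popitem(last=False)       # evict least recently used

    def get(self, account_id: str, client_id: str) -> Optional[str]:
        key = self._key(account_id, client_id)
        entry = self._cache.get(key)
        if entry is None:
            return None
        token, expires_at = entry
        # Treat tokens as expired `expiry_buffer_secs` early to avoid mid-request expiry
        if time.time() >= expires_at - self.expiry_buffer_secs:
            del self._cache[key]
            return None
        self._cache.move_to_end(key)
        return token
```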
Exception Hierarchy
MCPServerError (catch-all)
├── ConfigurationError
│ ├── MissingConfigurationError
│ └── InvalidConfigurationError
├── AuthenticationError
│ ├── TokenError
│ │ ├── TokenExpiredError
│ │ ├── TokenRefreshError
│ │ └── TokenValidationError
│ └── InvalidCredentialsError
├── APIError
│ ├── ConnectionError
│ ├── TimeoutError
│ ├── RateLimitError
│ ├── NotFoundError
│ ├── ValidationError
│ ├── PermissionError
│ └── ServerError
└── RecordError
├── RecordNotFoundError
├── RecordValidationError
└── DuplicateRecordError
Every exception has `to_dict()` for JSON serialization and a machine-readable `code` field.
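A sketch of that base-class contract (the `code`/`to_dict()` field names follow the description above; the exact signatures in `exceptions.py` may differ):

```python
from typing import Optional


class MCPServerError(Exception):
    """Catch-all base with a machine-readable code and JSON serialization."""

    code = "mcp_server_error"

    def __init__(self, message: str, details: Optional[dict] = None):
        super().__init__(message)
        self.message = message
        self.details = details or {}

    def to_dict(self) -> dict:
        """JSON-serializable form returned to MCP clients."""
        return {"error": self.code, "message": self.message, "details": self.details}


class AuthenticationError(MCPServerError):
    code = "authentication_error"


class TokenExpiredError(AuthenticationError):
    code = "token_expired"
```

Because subclasses only override `code`, callers can catch `MCPServerError` once and serialize any error in the hierarchy the same way.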
Deployment Guide
Local Development (stdio)
python -m my_mcp_server
# Communicates via stdin/stdout - used by Claude Desktop
Local HTTP Server
python -m my_mcp_server http
# Available at http://localhost:8000
# Health: http://localhost:8000/health
# MCP: http://localhost:8000/mcp
Docker
# Build and run
docker compose up --build
# Or standalone
docker build -t my-mcp .
docker run -p 8000:8000 --env-file .env my-mcp
Azure Web App
# Option 1: Container deployment
az webapp create --name my-mcp --plan my-plan --deployment-container-image-name my-mcp:latest
# Option 2: Source deployment
az webapp up --name my-mcp --runtime PYTHON:3.11
# Set environment variables in Azure Portal:
# Settings -> Configuration -> Application settings
Required Azure settings:
- All `UPSTREAM_*` env vars
- `MCP_SERVER_BASE_URL=https://my-mcp.azurewebsites.net`
- `MCP_TRANSPORT=http`
- `MCP_JWT_SIGNING_KEY=<generate with: python -c "import secrets; print(secrets.token_hex(32))">`
Claude Desktop Configuration
Add to claude_desktop_config.json:
{
"mcpServers": {
"my-mcp": {
"url": "https://my-mcp.azurewebsites.net/mcp"
}
}
}
Testing
# Run all tests
pytest
# With coverage
pytest --cov=my_mcp_server --cov-report=html
# Specific test file
pytest tests/test_config.py -v
# Run with verbose output
pytest -v -s
Test Structure
- `test_config.py` - Environment parsing, config validation, singleton
- `test_models.py` - Pydantic model validation, serialization, factories
- `test_auth.py` - Token caching, expiry checking, LRU eviction
Best Practices & Gotchas
DO
- Always use `async with` for API clients - ensures HTTP connection cleanup
- Always sanitize sensitive data before logging - use `sanitize_for_logging()`
- Always return `APIResponse` from tools - consistent interface for AI clients
- Set `MCP_JWT_SIGNING_KEY` in production - prevents token invalidation on restart
- Log to stderr, not stdout - stdout is reserved for MCP protocol in stdio mode
- Use UTC for all timestamps - `datetime.now(timezone.utc)`
- Add `account_id` parameter to tools - lets AI specify targets dynamically
- Write descriptive docstrings - AI reads them to decide tool usage
- Use environment variables for ALL config - never hardcode credentials
DON'T
- Don't log raw tokens - use the `mask_token()` helper
- Don't hardcode API URLs - use config.py and env vars
- Don't catch bare `Exception` - use the exception hierarchy
- Don't use the `requests` library - use `httpx` for async support
- Don't write to stdout in stdio mode - it corrupts the MCP protocol
- Don't skip the token expiry buffer - tokens can expire mid-request
- Don't use `functools.lru_cache` for tokens - you need expiry-aware eviction
- Don't forward PKCE if upstream handles it - causes `invalid_grant`
Troubleshooting
OAuth Issues
- Visit `/health` - shows if OAuth is configured and which env vars are set
- Visit `/debug/logs?filter=oauth` - shows OAuth flow logs
- Visit `/debug/token-test` - tests both auth methods against upstream
- Visit `/debug/server-info` - shows if the container restarted (lost OAuth state)
Common Errors
| Error | Cause | Fix |
|---|---|---|
| `invalid_grant` | PKCE mismatch or expired code | Set `forward_pkce=False` |
| `invalid_client` | Wrong auth method or credentials | Try both auth methods via `/debug/token-test` |
| `invalid_scope` | Missing `required_scopes` on verifier | Add `required_scopes` to `DebugTokenVerifier` |
| No authenticated session | User not logged in | Connect via MCP client with OAuth support |
| Token invalidated on restart | No stable JWT key | Set `MCP_JWT_SIGNING_KEY` env var |
Debug Endpoints
| Endpoint | Purpose |
|---|---|
| `GET /health` | Server status, config, OAuth info |
| `GET /debug/logs` | Recent server logs (in-memory buffer) |
| `GET /debug/logs?filter=oauth` | OAuth-specific logs |
| `GET /debug/server-info` | Instance ID, uptime, OAuth state counts |
| `GET /debug/token-test` | Test token exchange with upstream |
License
MIT License - See LICENSE for details.