Marketing Connect MCP Services

A Model Context Protocol (MCP) server for Marketing Connect AI integrations that provides tools, resources, and prompts for AI models to interact with marketing systems and data.

What is MCP?

The Model Context Protocol (MCP) is an open standard from Anthropic that enables AI models to securely interact with external tools and data sources. This server exposes:

  • Tools: Functions the AI can invoke (like API endpoints)
  • Resources: Data loaded into AI context (like configuration or schemas)
  • Prompts: Reusable interaction templates
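
The sketch below shows how these three primitives typically look in code, using FastMCP from the official mcp Python SDK. Every name in it is illustrative rather than part of this project; the project's own starting points are the example modules under tools/, resources/, and prompts/.

# Illustrative sketch only: greet, app://settings, and summarize are made-up names
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("example-server")

@mcp.tool()
async def greet(name: str) -> str:
    """A function the AI can invoke."""
    return f"Hello, {name}!"

@mcp.resource("app://settings")
async def settings() -> str:
    """Data the AI can load into context."""
    return '{"region": "us-east-1"}'

@mcp.prompt()
async def summarize(topic: str) -> str:
    """A reusable interaction template."""
    return f"Summarize the latest data about {topic}."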

Quick Start

Prerequisites

Install the following from Devshell:

  • Python 3.11+ (3.13 recommended)
  • make
  • buildi-cli
  • tfl
  • httpie

Installation

# Install uv package manager
make ci-prebuild

# Install all dependencies (creates .venv automatically)
make build

Running the Server

# Start the server (default: 0.0.0.0:8000)
make run

# Or with debug mode
make run-debug

# Or directly with uv
uv run marketing-connect-mcp --port 3000

Verify Deployment

The server exposes health check endpoints for deployment verification:

Endpoint      Description
GET /         Service overview
GET /health   Health check (returns {"status": "UP"})
GET /info     Server metadata (version, config, uptime)
POST /mcp     MCP protocol endpoint (for AI clients)

# Check health
curl http://localhost:8000/health

# Get server info
curl http://localhost:8000/info

# Service overview
curl http://localhost:8000/

Testing the MCP Protocol

The MCP endpoint uses the Streamable HTTP transport and requires specific headers:

# Initialize MCP session
curl -X POST http://localhost:8000/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
      "protocolVersion": "2024-11-05",
      "capabilities": {},
      "clientInfo": {"name": "test-client", "version": "1.0"}
    }
  }'

Expected response (SSE format):

event: message
data: {"jsonrpc":"2.0","id":1,"result":{"protocolVersion":"2024-11-05","capabilities":{...},"serverInfo":{"name":"marketing-connect-mcp-services","version":"..."}}}

# List available tools
curl -X POST http://localhost:8000/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/list",
    "params": {}
  }'

Note: The MCP protocol is stateful. The initialize request works without a session, but subsequent requests such as tools/list and tools/call require the Mcp-Session-Id header returned in the initialization response. For full protocol testing, use an MCP client library.
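
The sketch below shows such a client; it assumes the official mcp Python SDK is available in the environment (for example, added with uv add mcp as a dev dependency). The SDK's Streamable HTTP transport carries the Mcp-Session-Id header across requests automatically.

# mcp_smoke_test.py: minimal stateful client sketch (assumes the mcp SDK is installed)
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def main() -> None:
    # The transport context manager keeps the session ID for follow-up requests
    async with streamablehttp_client("http://localhost:8000/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())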

Project Structure

marketing-connect-mcp-services/
├── src/marketing_connect_mcp_services/
│   ├── __init__.py          # Package exports
│   ├── server.py            # FastMCP server setup
│   ├── config.py            # Pydantic settings
│   ├── cli.py               # CLI entry point
│   ├── tools/               # MCP tools (AI-invokable functions)
│   │   ├── __init__.py
│   │   └── example.py       # Example tool patterns
│   ├── resources/           # MCP resources (context data)
│   │   ├── __init__.py
│   │   └── example.py       # Example resource patterns
│   └── prompts/             # MCP prompts (interaction templates)
│       ├── __init__.py
│       └── example.py       # Example prompt patterns
├── tests/                   # Test suite
├── pyproject.toml           # Hatchling build config + dependencies
├── uv.lock                  # Dependency lock file
├── Makefile                 # Build commands
└── .env.example             # Environment template

Build System

This project uses modern Python tooling:

Tool        Purpose
Hatchling   Build backend (PEP 517)
uv          Fast package manager (10-100x faster than pip)

Why uv?

  • Fast: Written in Rust, installs packages 10-100x faster than pip
  • Lock files: uv.lock ensures reproducible builds
  • Compatible: Works with standard pyproject.toml
  • Simple: Single binary, no plugins needed

Configuration

Configuration is managed via environment variables (prefix: MCP_).

Copy .env.example to .env and customize:

# Server identity
MCP_SERVER_NAME=marketing-connect-mcp-services
MCP_SERVER_VERSION=1.0.0

# HTTP server
MCP_HOST=0.0.0.0
MCP_PORT=8000

# Logging
MCP_DEBUG=false
MCP_LOG_LEVEL=INFO

# Application settings
MCP_BASE_URL=https://your-app-url.com
MCP_REGION=us-east-1
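
Internally these variables map onto a Pydantic Settings class in config.py. The sketch below shows the typical shape of that mapping; the field names and defaults here are assumptions, so check the real config.py before relying on them.

# config.py (sketch): field names and defaults below are assumptions
from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    # Each field is read from the matching MCP_-prefixed variable or from .env
    model_config = SettingsConfigDict(env_prefix="MCP_", env_file=".env")

    server_name: str = "marketing-connect-mcp-services"
    server_version: str = "1.0.0"
    host: str = "0.0.0.0"
    port: int = 8000
    debug: bool = False
    log_level: str = "INFO"
    base_url: str = ""
    region: str = "us-east-1"

settings = Settings()  # e.g. MCP_PORT=9000 in the environment overrides port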

JPMC Artifact Repository

The PyPI index is configured in pyproject.toml:

[tool.uv]
index-url = "https://artifacts-read.gkp.jpmchase.net/artifactory/api/pypi/pypi/simple"
extra-index-url = ["https://pypi.org/simple"]

You can also override via environment variable:

export UV_INDEX_URL=https://your-pypi-mirror.com/simple

Development

Testing

# Run tests
make test

# Run tests with coverage
make cover

# Verbose output
make test-verbose

Code Quality

# Format code
make format

# Lint code
make lint

# Auto-fix lint issues
make lint-fix

# Type check
make typecheck

# Run all checks
make check

Pre-commit Hooks

make precommit

Dependency Management

# Update lock file
make lock

# Update all dependencies to latest
make update

# Install production deps only
make build-prod

Adding Custom Integrations

Adding a Tool

Create a new file in tools/ and register it:

# tools/my_tools.py
from marketing_connect_mcp_services.server import mcp

@mcp.tool()
async def my_custom_tool(param: str) -> str:
    """Description the AI will see."""
    return f"Result: {param}"

Then import in server.py:

from marketing_connect_mcp_services.tools import my_tools  # noqa: F401

Adding a Resource

# resources/my_resources.py
from marketing_connect_mcp_services.server import mcp

@mcp.resource("myapp://config")
async def get_config() -> str:
    """Returns configuration data."""
    return "config data"

Adding a Prompt

# prompts/my_prompts.py
from marketing_connect_mcp_services.server import mcp

@mcp.prompt()
async def analysis_prompt(topic: str) -> str:
    """Generate an analysis prompt."""
    return f"Please analyze: {topic}"

CI/CD

# Full CI pipeline (clean, build, test, package)
make ci

# Generate SSAP reports
make ssap

# Build wheel package
make package

Make Targets

Target           Description
make run         Start the MCP server
make run-debug   Start with debug logging
make build       Install all dependencies
make build-prod  Install production deps only
make test        Run tests
make cover       Run tests with coverage
make format      Format code
make lint        Lint code
make typecheck   Run mypy type checking
make check       Run lint + typecheck + test
make lock        Update uv.lock
make update      Update all dependencies
make ci          Full CI pipeline
make ssap        Generate SSAP reports
make package     Build wheel
make help        Show all targets
