CData Sync MCP Server
A comprehensive Model Context Protocol (MCP) server for CData Sync with dual transport support. This server exposes CData Sync's REST API as MCP tools, enabling AI assistants like Claude to manage data synchronization jobs, connections, and ETL operations.
Transport Options:
- stdio - For desktop usage with Claude Desktop app
- HTTP - For remote server deployments and API access
Features
- 20 Consolidated MCP Tools - Streamlined read/write operations for all entity types
- Dual Transport Support - Both stdio (Claude Desktop) and Streamable HTTP (web clients)
- Real-time Notifications - Live monitoring of job executions and API calls via Server-Sent Events
- Production-Ready Architecture - TypeScript, error handling, logging, and comprehensive type safety
- Multiple Auth Methods - Support for API tokens and basic authentication
- Web Client Support - RESTful HTTP API with streaming capabilities
- Job Management - Execute, monitor, and control data sync jobs
- Connection Management - Test, create, and manage data connections
- User Management - Handle user accounts and permissions
- History & Logging - Access execution history and detailed logs
Quick Start
Prerequisites
- Node.js 18+
- A running CData Sync instance
- Claude Desktop (for stdio transport) or web browser (for HTTP transport)
Installation
- Clone the repository
  git clone https://github.com/CDataSoftware/cdata-sync-mcp-server.git
  cd cdata-sync-mcp-server

- Install dependencies
  npm install

- Build the project
  npm run build

- Configure environment variables
  # Copy the example environment file
  cp .env.example .env

  # Edit with your CData Sync details
  CDATA_BASE_URL="http://localhost:8181/api.rsc"
  CDATA_AUTH_TOKEN="your-auth-token"
  CDATA_WORKSPACE="your-workspace-uuid"  # Optional: scope operations to specific workspace
  MCP_TRANSPORT_MODE="both"              # stdio, http, or both
Transport Options
Desktop Usage: Stdio Transport (Claude Desktop)
The stdio transport is designed for local desktop usage with the Claude Desktop app. This is the recommended approach for individual developers.
Configuration for Claude Desktop:
{
"mcpServers": {
"cdata-sync-server": {
"command": "node",
"args": ["/absolute/path/to/cdata-sync-mcp-server/dist/index.js"],
"env": {
"MCP_TRANSPORT_MODE": "stdio",
"CDATA_AUTH_TOKEN": "your-token-here",
"CDATA_BASE_URL": "http://localhost:8181/api.rsc",
"CDATA_WORKSPACE": "your-workspace-uuid-here",
"DISABLE_SSE": "true"
}
}
}
}
Start stdio-only server:
npm run start:stdio
Server Usage: HTTP Transport (Remote Deployments)
The HTTP transport is designed for server deployments where the MCP server runs on a remote machine and accepts API requests. This is ideal for:
- Team deployments
- Docker/Kubernetes environments
- Integration with web applications
- Remote access scenarios
Start HTTP-only server:
npm run start:http
Available endpoints:
- GET /mcp/v1/info - Server and protocol information
- GET /mcp/v1/health - Health check
- POST /mcp/v1/message - Send MCP requests
- GET /mcp/v1/stream - Server-Sent Events for real-time updates
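Before wiring up a client, the endpoints can be spot-checked from the command line with curl (assuming the default port 3000 and base path /mcp/v1):
curl http://localhost:3000/mcp/v1/info
curl http://localhost:3000/mcp/v1/health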
Example HTTP client usage:
// Connect to the server
const client = new MCPStreamableHttpClient('http://your-server:3000/mcp/v1');
await client.connect();
// List available tools
const tools = await client.listTools();
// Call a tool
const connections = await client.callTool('read_connections', {
action: 'list',
top: 5
});
// Set up real-time monitoring
client.onNotification = (method, params) => {
console.log('Notification:', method, params);
};
Development: Dual Transport
For development and testing, you can run both transports simultaneously:
npm run start:both
This is useful for testing both desktop and server scenarios during development.
Available Tools
Connection Management
- read_connections - List, count, get details, or test connections
- write_connections - Create, update, or delete connections
- get_connection_tables - List tables in a connection
- get_table_columns - Get table schema information
Job Management
- read_jobs - List, count, get details, status, history, or logs
- write_jobs - Create, update, or delete jobs
- execute_job - Run a sync job immediately
- cancel_job - Stop a running job
- execute_query - Run custom SQL queries
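For example, running a job immediately is a single tool call; the job name below is hypothetical, and the exact argument name may differ slightly in your version:
{
  "tool": "execute_job",
  "arguments": {
    "jobName": "NightlySalesLoad"
  }
}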
Task Management
- read_tasks - List, count, or get task details
- write_tasks - Create, update, or delete tasks
Transformation Management
- read_transformations - List, count, or get transformation details
- write_transformations - Create, update, or delete transformations
User Management
- read_users - List, count, or get user details
- write_users - Create or update users
Request/Log Management
- read_requests - List, count, or get request log details
- write_requests - Delete request logs
History Management
- read_history - List or count execution history records
Certificate Management
- read_certificates - List certificates
- write_certificates - Create certificates
Configuration Management
- configure_sync_server - Get or update server configuration
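A minimal sketch of reading the current server configuration, assuming configure_sync_server follows the same action-based pattern as the other tools:
{
  "tool": "configure_sync_server",
  "arguments": {
    "action": "get"
  }
}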
Tool Usage Patterns
Action-Based Operations
All read/write tools use an action parameter to specify the operation:
Example: Reading connections
{
"tool": "read_connections",
"arguments": {
"action": "list",
"filter": "contains(Name,'prod')",
"top": 10
}
}
Example: Creating a connection
{
"tool": "write_connections",
"arguments": {
"action": "create",
"name": "MyDatabase",
"providerName": "System.Data.SqlClient",
"connectionString": "Server=localhost;Database=test;"
}
}
Real-time Monitoring
The HTTP transport provides real-time notifications for:
- Tool execution start/completion
- Job execution progress
- Configuration changes
- Error notifications
// Monitor all server events
const eventSource = new EventSource('http://localhost:3000/mcp/v1/stream');
eventSource.onmessage = (event) => {
const message = JSON.parse(event.data);
if (message.method === 'notifications/job_executed') {
console.log('Job completed:', message.params);
}
};
Development
Development Scripts
# Start in development mode with both transports
npm run dev:both
# Start with stdio only
npm run dev:stdio
# Start with HTTP only
npm run dev:http
# Type checking
npm run typecheck
# Linting
npm run lint
npm run lint:fix
# Testing
npm test
npm run test:watch
npm run test:coverage
Environment Variables
| Variable | Description | Default |
|---|---|---|
| CDATA_BASE_URL | CData Sync API base URL | http://localhost:8181/api.rsc |
| CDATA_AUTH_TOKEN | API authentication token | - |
| CDATA_USERNAME | Basic auth username (alternative to token) | - |
| CDATA_PASSWORD | Basic auth password (alternative to token) | - |
| CDATA_WORKSPACE | Workspace UUID to scope all operations (optional) | - |
| MCP_TRANSPORT_MODE | Transport mode: stdio, http, or both | stdio |
| MCP_HTTP_PORT | HTTP transport port | 3000 |
| MCP_HTTP_PATH | HTTP transport base path | /mcp/v1 |
| NODE_ENV | Node environment | production |
| LOG_LEVEL | Logging level | info |
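Putting these together, a typical .env for an HTTP deployment might look like the following sketch (the token and workspace values are placeholders):
CDATA_BASE_URL=http://localhost:8181/api.rsc
CDATA_AUTH_TOKEN=your-auth-token
CDATA_WORKSPACE=your-workspace-uuid
MCP_TRANSPORT_MODE=http
MCP_HTTP_PORT=3000
MCP_HTTP_PATH=/mcp/v1
LOG_LEVEL=info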
Deployment
Docker
# Build image
docker build -t cdata-sync-mcp-server .
# Run with stdio transport
docker run -e CDATA_AUTH_TOKEN=your-token cdata-sync-mcp-server
# Run with HTTP transport
docker run -p 3000:3000 -e MCP_TRANSPORT_MODE=http -e CDATA_AUTH_TOKEN=your-token cdata-sync-mcp-server
Docker Compose
# Start with Docker Compose
docker-compose up -d cdata-sync-mcp-both
Kubernetes
# Deploy to Kubernetes
kubectl apply -f k8s/
Systemd Service
# Install as systemd service
sudo cp cdata-sync-mcp.service /etc/systemd/system/
sudo systemctl enable cdata-sync-mcp
sudo systemctl start cdata-sync-mcp
HTTP API Reference
Protocol Information
GET /mcp/v1/info
{
"protocol": "Model Context Protocol",
"version": "2025-03-26",
"transport": "streamable-http",
"endpoints": {
"message": "http://localhost:3000/mcp/v1/message",
"stream": "http://localhost:3000/mcp/v1/stream"
}
}
Health Check
GET /mcp/v1/health
{
"status": "healthy",
"transport": "streamable-http",
"timestamp": "2024-01-15T10:30:00Z",
"pendingRequests": 0,
"bufferedMessages": 0
}
Send MCP Request
POST /mcp/v1/message
{
"jsonrpc": "2.0",
"id": "1",
"method": "tools/call",
"params": {
"name": "read_connections",
"arguments": {
"action": "list",
"top": 5
}
}
}
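The same request can be sent from the command line with curl (assuming the default host, port, and path):
curl -X POST http://localhost:3000/mcp/v1/message \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","id":"1","method":"tools/call","params":{"name":"read_connections","arguments":{"action":"list","top":5}}}'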
Real-time Events
GET /mcp/v1/stream
Server-Sent Events stream providing real-time notifications:
data: {"jsonrpc":"2.0","method":"notifications/tool_execution","params":{"tool":"read_connections","timestamp":"2024-01-15T10:30:00Z"}}
data: {"jsonrpc":"2.0","method":"notifications/job_executed","params":{"jobName":"TestJob","result":"success","timestamp":"2024-01-15T10:31:00Z"}}
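The stream can also be followed from a terminal; curl's -N flag disables output buffering so events print as they arrive (again assuming the default host and port):
curl -N http://localhost:3000/mcp/v1/stream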
Testing
Running Tests
# Run all tests
npm test
# Run with coverage
npm run test:coverage
# Watch mode for development
npm run test:watch
Test Structure
src/
├── __tests__/
│   ├── services/     # Service unit tests
│   ├── transport/    # Transport tests
│   ├── integration/  # Integration tests
│   └── utils/        # Utility tests
Contributing
- Fork the repository
- Create your feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add some amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
License
This project is licensed under the MIT License - see the LICENSE file for details.
Support
- Documentation: Full API documentation available in the docs directory
- Issues: Report bugs and request features via GitHub Issues
- Discussions: Community support via CData Community
Additional Resources
Built with ❤️ for the MCP ecosystem