DataMaker MCP Server
The Automators DataMaker MCP (Model Context Protocol) server provides a seamless integration between DataMaker and the Model Context Protocol, enabling AI models to interact with DataMaker's powerful data generation capabilities.
🚀 Features
- Generate synthetic data using DataMaker templates
- Fetch and manage DataMaker templates
- Fetch and manage DataMaker connections
- Push data to DataMaker connections
- Large dataset handling: Automatically stores large endpoint datasets to S3 and provides summary with view links
- Execute Python scripts: Dynamically execute Python code by saving scripts to S3 and running them using the DataMaker runner
📦 Installation
Add the following to your mcp.json file:
```json
{
  "mcpServers": {
    "datamaker": {
      "command": "npx",
      "args": ["-y", "@automators/datamaker-mcp"],
      "env": {
        "DATAMAKER_API_KEY": "your-datamaker-api-key"
      }
    }
  }
}
```
📋 Prerequisites
- Node.js (LTS version recommended)
- pnpm package manager (v10.5.2 or later)
- A DataMaker account with API access
- AWS S3 bucket and credentials (for large dataset storage)
🏃‍♂️ Usage
Large Dataset Handling
The `get_endpoints` tool automatically detects when a large dataset is returned (more than 10 endpoints) and:
- Stores the complete dataset to your configured S3 bucket
- Returns a summary showing only the first 5 endpoints
- Provides a secure link to view the complete dataset (expires in 24 hours)
This prevents overwhelming responses while maintaining access to all data.
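The summarization behavior above can be sketched as a pure function. The field names, the 10-endpoint threshold, and the 5-item preview follow the description in this section, but the exact response shape is an illustrative assumption, and the `viewUrl` parameter stands in for the presigned S3 link the real server generates.

```typescript
interface Endpoint {
  id: string;
  name: string;
}

interface EndpointResult {
  endpoints: Endpoint[]; // the full list, or a 5-item preview
  total: number;         // total endpoints in the original dataset
  truncated: boolean;    // true when the full list was offloaded to S3
  viewUrl?: string;      // secure view link (presigned URL in the real server)
}

// Mirrors the documented behavior: datasets over 10 endpoints are
// summarized to the first 5, plus a link to the complete dataset.
function summarizeEndpoints(endpoints: Endpoint[], viewUrl: string): EndpointResult {
  if (endpoints.length <= 10) {
    return { endpoints, total: endpoints.length, truncated: false };
  }
  return {
    endpoints: endpoints.slice(0, 5),
    total: endpoints.length,
    truncated: true,
    viewUrl,
  };
}
```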
Python Script Execution
The `execute_python_script` tool allows you to dynamically execute Python code:
- Saves the script to S3 using the `/upload-text` endpoint
- Executes the script using the DataMaker runner via the `/execute-python` endpoint
- Returns the execution output once the script completes
Usage Example:
```python
# The tool accepts Python script code and a filename
execute_python_script(
    script="print('Hello from DataMaker!')",
    filename="hello.py"
)
```
This enables AI models to write and execute custom Python scripts for data processing, transformation, or any other computational tasks within the DataMaker environment.
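The upload-then-execute flow can be illustrated by building the two requests it implies. The endpoint paths come from the list above; the base URL, payload field names, and the helper itself are illustrative assumptions, not the server's actual wire format.

```typescript
interface RequestSpec {
  url: string;
  body: Record<string, string>;
}

// Hypothetical sketch of the two-step sequence described above.
// Payload field names (`filename`, `content`, `key`) are assumptions.
function buildPythonExecutionRequests(
  baseUrl: string,
  filename: string,
  script: string,
): [RequestSpec, RequestSpec] {
  // Step 1: save the script text to S3 via /upload-text
  const upload: RequestSpec = {
    url: `${baseUrl}/upload-text`,
    body: { filename, content: script },
  };
  // Step 2: run the stored script via /execute-python
  const execute: RequestSpec = {
    url: `${baseUrl}/execute-python`,
    body: { key: filename },
  };
  return [upload, execute];
}
```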
Development Mode
Create a .env file in your project root. You can copy from env.example:
```bash
cp env.example .env
```
Then edit .env with your actual values:
```bash
DATAMAKER_URL="https://dev.datamaker.app"
DATAMAKER_API_KEY="your-datamaker-api-key"

# S3 Configuration (optional, for large dataset storage)
S3_BUCKET="your-s3-bucket-name"
S3_REGION="us-east-1"
S3_ACCESS_KEY_ID="your-aws-access-key"
S3_SECRET_ACCESS_KEY="your-aws-secret-key"
```
Run the server with the MCP Inspector for debugging:
```bash
pnpm dev
```
This will start the MCP server and launch the MCP Inspector interface at http://localhost:5173.
🔧 Available Scripts
- `pnpm build` - Build the TypeScript code
- `pnpm dev` - Start the development server with MCP Inspector
- `pnpm changeset` - Create a new changeset
- `pnpm version` - Update versions and changelogs
- `pnpm release` - Build and publish the package
🚢 Release Process
This project uses Changesets to manage versions, create changelogs, and publish to npm. Here's how to make a change:
- Create a new branch
- Make your changes
- Create a changeset: `pnpm changeset`
- Follow the prompts to describe your changes
- Commit the changeset file along with your changes
- Push to your branch
- Create a PR on GitHub
The GitHub Actions workflow will automatically:
- Create a PR with version updates and changelog
- Publish to npm when the PR is merged
🤝 Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
📄 License
MIT License - See LICENSE for details.