<p align="left"> <img src="primslogo.png" alt="PRIMS Logo" width="200"/> <a href="#"><img src="https://img.shields.io/badge/status-alpha-orange?style=for-the-badge" alt="Status: Alpha"/></a> <a href="LICENSE"><img src="https://img.shields.io/badge/license-MIT-blue?style=for-the-badge" alt="License: MIT"/></a> </p>
# PRIMS – Python Runtime Interpreter MCP Server

PRIMS is a tiny open-source Model Context Protocol (MCP) server that lets LLM agents run arbitrary Python code in a secure, throw-away sandbox.
- **One tool, one job.** Exposes a single MCP tool – `run_code` – that executes user-supplied Python and streams back stdout/stderr.
- **Isolated & reproducible.** Each call spins up a fresh virtual env, installs any requested pip packages, mounts optional read-only files, then nukes the workspace.
- **Zero config.** Works over MCP/stdio, or drop it in Docker.
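The per-call lifecycle described above can be pictured in plain Python. This is an illustrative sketch under the assumption of "fresh venv per call, workspace deleted afterwards" – not PRIMS's actual implementation, and real sandboxing needs more than a throw-away venv:

```python
import subprocess
import sys
import tempfile
import venv
from pathlib import Path

def run_in_fresh_venv(code: str, timeout: int = 30) -> str:
    """Run `code` in a brand-new virtual environment, then destroy the workspace."""
    with tempfile.TemporaryDirectory() as workspace:  # auto-cleanup nukes the workspace
        env_dir = Path(workspace) / ".venv"
        venv.create(env_dir, with_pip=False)          # fresh env per call (pip skipped for brevity)
        bindir = "Scripts" if sys.platform == "win32" else "bin"
        python = env_dir / bindir / "python"
        result = subprocess.run(
            [str(python), "-c", code],
            capture_output=True, text=True, cwd=workspace, timeout=timeout,
        )
        return result.stdout + result.stderr          # stream both back, like run_code
```

Dependency installation would slot in between creating the venv and running the code (a `pip install` subprocess call against the venv's interpreter).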
## Quick-start

### 1. Local development environment

```bash
chmod +x scripts/setup_env.sh   # once, to make the script executable
./scripts/setup_env.sh          # creates .venv & installs deps

# activate the venv in each new shell
source .venv/bin/activate
```
### 2. Launch the server

```bash
python -m server.main           # binds http://0.0.0.0:9000/mcp
```
### 3. Docker

```bash
# Quick one-liner (build + run)
chmod +x scripts/docker_run.sh
./scripts/docker_run.sh         # prints the MCP URL when ready
```
## Examples

### List available tools

You can use the provided script to list all tools exposed by the server:

```bash
python examples/list_tools.py
```
Expected output (tool names and descriptions may vary):

```text
Available tools:
- run_code: Execute Python code in a secure sandbox with optional dependencies & file mounts.
- list_dir: List files/directories in your session workspace.
- preview_file: Preview up to 8 KB of a text file from your session workspace.
- persist_artifact: Upload an output/ file to a presigned URL for permanent storage.
- mount_file: Download a remote file once per session to mounts/<path>.
```
### Run code via the MCP server

```bash
python examples/run_code.py
```
### Mount a dataset once & reuse it

```bash
python examples/mount_and_run.py
```

This mounts a CSV with `mount_file` and then reads it inside `run_code` without re-supplying the URL.
### Inspect your session workspace

```bash
python examples/inspect_workspace.py
```

This shows how to use the `list_dir` and `preview_file` tools to browse files your code created.
### Persist an artifact to permanent storage

The `persist_artifact` tool uploads a file from your `output/` directory to a presigned URL.

Example (Python):

```python
await client.call_tool("persist_artifact", {
    "relative_path": "plots/plot.png",
    "presigned_url": "https://bucket.s3.amazonaws.com/...signature...",
})
```
### Download an artifact

Small artifacts can be fetched directly:

```bash
curl -H "mcp-session-id: <your-session-id>" \
     http://localhost:9000/artifacts/plots/plot.png -o plot.png
```
## Available tools

| Tool | Purpose |
|---|---|
| `run_code` | Execute Python in an isolated sandbox with optional pip deps. |
| `list_dir` | List files/directories inside your session workspace. |
| `preview_file` | Return up to 8 KB of a text file for quick inspection. |
| `persist_artifact` | Upload an `output/` file to a client-provided presigned URL. |
| `mount_file` | Download a remote file once per session to `mounts/<path>`. |
See the `examples/` directory for end-to-end demos.
## Contributing

Contributions are welcome! Feel free to open issues, suggest features, or submit pull requests to help improve PRIMS.

If you find this project useful, please consider leaving a ⭐ to show your support.