Safe Local Python Executor/Interpreter

Stdio MCP server wrapping the custom Python runtime (LocalPythonExecutor) from Hugging Face's smolagents framework. The runtime combines ease of setup (compared to Docker, a VM, or cloud runtimes) with safeguards that limit the operations and imports allowed inside the runtime.


Tools

run_python

Execute Python code in a secure sandbox environment. This tool allows running simple Python code for calculations and data manipulations. The execution environment is restricted for security purposes. Make sure you create a single file that can be executed in one go and that it returns a result.

Default allowed imports: math, random, datetime, time, json, re, string, collections, itertools, functools, operator.

Args:
  • code: The Python code to execute. Must be valid Python 3 code. The result must be stored in a variable called `result`. E.g.:

```python
import math
result = math.sqrt(16)
```

Returns: a dictionary with execution results containing:
  • result: the final value, or None if no value is returned
  • logs: any output from print statements
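For example, a valid payload for this tool could look like the snippet below (the variable names are purely illustrative). It stays within the default import allowlist, writes its output to `result`, and uses `print` for the log stream:

```python
import collections
import math

words = ["red", "green", "red", "blue", "green", "red"]
counts = collections.Counter(words)      # tally occurrences
print(f"distinct words: {len(counts)}")  # captured in `logs`

# The final value must end up in a variable called `result`
result = {
    "most_common": counts.most_common(1)[0],
    "sqrt_total": math.sqrt(sum(counts.values())),
}
```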

README

Safe Local Python Executor

An MCP server (stdio transport) that wraps Hugging Face's LocalPythonExecutor (from the smolagents framework). It is a custom Python runtime that provides basic isolation/security when running Python code generated by LLMs locally, and it does not require Docker or a VM. This package exposes the Python executor via MCP (Model Context Protocol) as a tool for LLM apps such as Claude Desktop, Cursor, or any other MCP-compatible client. In the case of Claude Desktop, this tool is an easy way to add the missing Code Interpreter (which has been available as a plugin in ChatGPT for quite a while).

<img width="1032" alt="image" src="https://github.com/user-attachments/assets/3b820bfc-970a-4315-8f2d-970591c6fdae" />
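Under the hood the server is a thin wrapper: it registers a single MCP tool and delegates execution to LocalPythonExecutor. The sketch below only illustrates that shape, assuming the FastMCP helper from the mcp Python SDK; see mcp_server.py in the repo for the actual implementation, and note that the executor's constructor arguments and return shape vary between smolagents versions:

```python
# Illustrative sketch only -- see mcp_server.py in this repo for the real implementation.
from mcp.server.fastmcp import FastMCP
from smolagents.local_python_executor import LocalPythonExecutor

mcp = FastMCP("safe-local-python-executor")
executor = LocalPythonExecutor(additional_authorized_imports=[])  # default allowlist only
executor.send_tools({})  # no extra tools injected into the runtime

@mcp.tool()
def run_python(code: str) -> dict:
    """Run `code` in the restricted smolagents runtime and return its result and logs."""
    output = executor(code)  # assumption: returns an object exposing .output and .logs
    return {"result": output.output, "logs": output.logs}

if __name__ == "__main__":
    mcp.run(transport="stdio")
```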

Features

  • Exposes the run_python tool
  • Safer execution of Python code compared to direct use of Python eval()
  • Runs via uv in a Python venv
  • No file I/O ops are allowed
  • Restricted list of imports
    • collections
    • datetime
    • itertools
    • math
    • queue
    • random
    • re
    • stat
    • statistics
    • time
    • unicodedata

Security

Be careful with executing code produced by an LLM on your machine; stay away from MCP servers that run Python via the command line or using eval(). The safest option is a VM or a Docker container, though it takes some effort to set up, consumes more resources, and is slower. There are also 3rd-party services providing a Python runtime, though they require registration, API keys, etc.

LocalPythonExecutor provides a good balance between the direct use of a local Python environment (which is easier to set up) and remote execution in a Docker container, a VM, or a 3rd-party service (which is safer). The Hugging Face team has invested time into creating a quick and safe option to run LLM-generated code, used by their code agents. This MCP server builds upon it:

> To add a first layer of security, code execution in smolagents is not performed by the vanilla Python interpreter. We have re-built a more secure LocalPythonExecutor from the ground up.

Read more here.
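As a quick illustration of what the restricted runtime buys you, code along these lines is rejected instead of being executed (the exact error type and message depend on the smolagents version):

```python
# Both of these are refused by LocalPythonExecutor rather than executed:

import os                # `os` is not in the import allowlist
os.system("whoami")

open("notes.txt", "w")   # file I/O is not part of the restricted builtins
```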

Installation and Execution

  1. Install uv (e.g. brew install uv on macOS, or follow the official docs)
  2. Clone the repo and change into its directory: cd mcp_safe_local_python_executor
  3. Start the server via uv run mcp_server.py; the venv will be created automatically and the dependencies (smolagents, mcp) will be installed (see the equivalent shell commands below)
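A minimal sketch of those commands; the clone URL assumes the repo lives under maxim-saplin on GitHub:

```bash
# macOS example; see the official uv docs for other platforms
brew install uv

# assumed repo location
git clone https://github.com/maxim-saplin/mcp_safe_local_python_executor.git
cd mcp_safe_local_python_executor

# first run creates the venv and installs smolagents + mcp
uv run mcp_server.py
```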

Configuring Claude Desktop

  1. Make sure you have Claude for Desktop installed (download from claude.ai)

  2. Edit your Claude for Desktop configuration file:

    • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
    • Windows: %APPDATA%\Claude\claude_desktop_config.json
    • Or open Claude Desktop -> Settings -> Developer -> click "Edit Config" button
  3. Add the following configuration:

```json
{
    "mcpServers": {
        "safe-local-python-executor": {
            "command": "uv",
            "args": [
                "--directory",
                "/path/to/mcp_local_python_executor/",
                "run",
                "mcp_server.py"
            ]
        }
    }
}
```
  4. Restart Claude for Desktop
  5. The Python executor tool will now be available in Claude (you'll see a hammer icon in the message input field)

Example Prompts

Once configured, you can use prompts like:

  • "Calculate the factorial of 5 using Python"
  • "Create a list of prime numbers up to 100"
  • "Solve this equation (use Python): x^2 + 5x + 6 = 0"

Development

Clone the repo, then use uv to create the venv, install the dev dependencies, and run the tests:

```bash
uv venv .venv
uv sync --group dev
python -m pytest tests/
```

<a href="https://glama.ai/mcp/servers/@maxim-saplin/mcp_safe_local_python_executor"> <img width="380" height="200" src="https://glama.ai/mcp/servers/@maxim-saplin/mcp_safe_local_python_executor/badge" /> </a>
