MCP Code Executor
Allows LLMs to execute Python code in a specified Conda environment, giving generated code access to the libraries and dependencies installed in that environment.
bazinga012
Tools
execute_code
Execute Python code in the specified conda environment
README
MCP Code Executor
The MCP Code Executor is an MCP server that allows LLMs to execute Python code within a specified Conda environment. This enables LLMs to run code with access to libraries and dependencies defined in the Conda environment.
<a href="https://glama.ai/mcp/servers/45ix8xode3"><img width="380" height="200" src="https://glama.ai/mcp/servers/45ix8xode3/badge" alt="Code Executor MCP server" /></a>
Features
- Execute Python code from LLM prompts
- Run code within a specified Conda environment
- Configurable code storage directory
Prerequisites
- Node.js installed
- Conda installed
- Desired Conda environment created
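If you have not created the environment yet, standard Conda commands are enough; the environment name, Python version, and packages below are only examples:
conda create -n your-conda-env python=3.11
conda install -n your-conda-env numpy pandas
Install whatever packages the generated code will need into this environment before pointing the server at it.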
Setup
- Clone this repository:
git clone https://github.com/bazinga012/mcp_code_executor.git
- Navigate to the project directory:
cd mcp_code_executor
- Install the Node.js dependencies:
npm install
- Build the project:
npm run build
Configuration
To configure the MCP Code Executor server, add the following to your MCP servers configuration file:
{
  "mcpServers": {
    "mcp-code-executor": {
      "command": "node",
      "args": [
        "/path/to/mcp_code_executor/build/index.js"
      ],
      "env": {
        "CODE_STORAGE_DIR": "/path/to/code/storage",
        "CONDA_ENV_NAME": "your-conda-env"
      }
    }
  }
}
Replace the placeholders:
- /path/to/mcp_code_executor with the absolute path to where you cloned this repository
- /path/to/code/storage with the directory where you want the generated code to be stored
- your-conda-env with the name of the Conda environment you want the code to run in
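For example, a filled-in entry might look like the following; all paths and the environment name here are purely illustrative:
{
  "mcpServers": {
    "mcp-code-executor": {
      "command": "node",
      "args": [
        "/home/alice/mcp_code_executor/build/index.js"
      ],
      "env": {
        "CODE_STORAGE_DIR": "/home/alice/mcp_generated_code",
        "CONDA_ENV_NAME": "data-science"
      }
    }
  }
}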
Usage
Once configured, the MCP Code Executor will allow LLMs to execute Python code by generating a file in the specified CODE_STORAGE_DIR and running it within the Conda environment defined by CONDA_ENV_NAME.
LLMs can generate and execute code by referencing this MCP server in their prompts.
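As a rough sketch of what a client-side call might look like, the snippet below uses the official MCP Python SDK to start the server over stdio and invoke the execute_code tool. The "code" argument name, the paths, and the environment name are assumptions for illustration; check the input schema the server actually reports via list_tools.
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Launch the MCP Code Executor over stdio, mirroring the JSON config above.
    # The env dict is merged with os.environ so PATH (and node) stay resolvable.
    server = StdioServerParameters(
        command="node",
        args=["/path/to/mcp_code_executor/build/index.js"],
        env={
            **os.environ,
            "CODE_STORAGE_DIR": "/path/to/code/storage",
            "CONDA_ENV_NAME": "your-conda-env",
        },
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Confirm execute_code is exposed and inspect its input schema.
            tools = await session.list_tools()
            print([t.name for t in tools.tools])
            # The "code" argument name is an assumption; adjust to the reported schema.
            result = await session.call_tool(
                "execute_code",
                arguments={"code": "import sys; print(sys.version)"},
            )
            print(result.content)

asyncio.run(main())
In practice an MCP-aware client such as Claude Desktop issues this tool call for you; the sketch is only meant to show the request shape.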
Contributing
Contributions are welcome! Please open an issue or submit a pull request.
License
This project is licensed under the MIT License.
Recommended Servers
playwright-mcp
A Model Context Protocol server that enables LLMs to interact with web pages through structured accessibility snapshots without requiring vision models or screenshots.
Magic Component Platform (MCP)
An AI-powered tool that generates modern UI components from natural language descriptions, integrating with popular IDEs to streamline UI development workflow.
MCP Package Docs Server
Facilitates LLMs to efficiently access and fetch structured documentation for packages in Go, Python, and NPM, enhancing software development with multi-language support and performance optimization.
Claude Code MCP
An implementation of Claude Code as a Model Context Protocol server that enables using Claude's software engineering capabilities (code generation, editing, reviewing, and file operations) through the standardized MCP interface.
@kazuph/mcp-taskmanager
Model Context Protocol server for Task Management. This allows Claude Desktop (or any MCP client) to manage and execute tasks in a queue-based system.
Linear MCP Server
Enables interaction with Linear's API for managing issues, teams, and projects programmatically through the Model Context Protocol.
mermaid-mcp-server
A Model Context Protocol (MCP) server that converts Mermaid diagrams to PNG images.
Jira-Context-MCP
MCP server that provides Jira ticket information to AI coding agents like Cursor.

Linear MCP Server
A Model Context Protocol server that integrates with Linear's issue tracking system, allowing LLMs to create, update, search, and comment on Linear issues through natural language interactions.

Sequential Thinking MCP Server
This server facilitates structured problem-solving by breaking down complex issues into sequential steps, supporting revisions, and enabling multiple solution paths through full MCP integration.