E2B Code Interpreter
Python & JS/TS SDK for running AI-generated code/code interpreting in your AI app
<p align="center"> <img width="100" src="https://raw.githubusercontent.com/e2b-dev/E2B/refs/heads/main/readme-assets/logo-circle.png" alt="e2b logo"> </p>
<h4 align="center"> <a href="https://pypi.org/project/e2b/"> <img alt="Last 1 month downloads for the Python SDK" src="https://img.shields.io/pypi/dm/e2b?label=PyPI%20Downloads"> </a> <a href="https://www.npmjs.com/package/e2b"> <img alt="Last 1 month downloads for the JavaScript SDK" src="https://img.shields.io/npm/dm/e2b?label=NPM%20Downloads"> </a> </h4>
<!--- <img width="100%" src="/readme-assets/preview.png" alt="Cover image"> --->
What is E2B?
E2B is an open-source infrastructure that allows you to run AI-generated code in secure, isolated sandboxes in the cloud. To start and control sandboxes, use our JavaScript SDK or Python SDK.
Run your first Sandbox
1. Install SDK
JavaScript / TypeScript
npm i @e2b/code-interpreter
Python
pip install e2b-code-interpreter
2. Get your E2B API key
Sign up at https://e2b.dev, get your API key, and set it as an environment variable:
E2B_API_KEY=e2b_***
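The SDK picks the key up from the E2B_API_KEY environment variable, so exporting it is usually enough. If you prefer to pass it explicitly, a minimal Python sketch could look like the following (assuming the api_key constructor argument; check the SDK reference for your version):

import os
from e2b_code_interpreter import Sandbox

# Assumption: the Sandbox constructor accepts an explicit api_key;
# by default it falls back to the E2B_API_KEY environment variable.
sandbox = Sandbox(api_key=os.environ["E2B_API_KEY"])
sandbox.kill()  # shut the sandbox down when you are done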
3. Execute code with the code interpreter inside the Sandbox
JavaScript / TypeScript
import { Sandbox } from '@e2b/code-interpreter'
const sbx = await Sandbox.create()
await sbx.runCode('x = 1')
const execution = await sbx.runCode('x+=1; x')
console.log(execution.text) // outputs 2
Python
from e2b_code_interpreter import Sandbox
with Sandbox() as sandbox:
    sandbox.run_code("x = 1")
    execution = sandbox.run_code("x+=1; x")
    print(execution.text)  # outputs 2
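Beyond execution.text, the returned execution object also carries captured output and error details. A hedged sketch of inspecting them, assuming the Execution object exposes logs and error fields as in recent SDK versions:

from e2b_code_interpreter import Sandbox

code = """
print('hello from the sandbox')
squares = [n ** 2 for n in range(3)]
squares
"""

with Sandbox() as sandbox:
    execution = sandbox.run_code(code)
    print(execution.logs.stdout)  # captured stdout lines, e.g. ['hello from the sandbox\n']
    print(execution.text)         # text of the last expression: [0, 1, 4]
    if execution.error:           # set when the executed code raises
        print(execution.error.name, execution.error.value)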
4. Check docs
Visit the E2B documentation at https://e2b.dev/docs.
5. E2B cookbook
Visit our Cookbook to get inspired by examples with different LLMs and AI frameworks.