AI Workspace MCP Server

A Model Context Protocol (MCP) server that provides AI with a secure workspace for file management and Python script execution. Designed to run on Vercel as a serverless function.

Features

File Management Tools

  • create_file - Create new files with content
  • read_file - Read file contents
  • update_file - Update existing files
  • delete_file - Delete files
  • list_files - List files and directories
  • create_directory - Create new directories

Code Execution

  • execute_python - Execute Python scripts with arguments (30-second timeout)
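The server's implementation of `execute_python` isn't shown in this README; a minimal sketch of how such a handler might run a script with a timeout, using the standard-library `subprocess` module (the function name and the `args` parameter are assumptions, not the server's actual code):

```python
import subprocess
import sys

def run_python_script(filepath, args=None, timeout=30):
    """Hypothetical handler: run a Python script, capture output, enforce a timeout."""
    cmd = [sys.executable, filepath] + list(args or [])
    try:
        proc = subprocess.run(cmd, capture_output=True, text=True, timeout=timeout)
        return {
            "success": True,
            "exit_code": proc.returncode,
            "stdout": proc.stdout,
            "stderr": proc.stderr,
        }
    except subprocess.TimeoutExpired:
        return {"success": False, "error": f"Execution timed out after {timeout}s"}
```

The dict mirrors the `execute_python` response format documented below, so a real handler shaped like this would plug straight into the `/execute` endpoint.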

Setup on Vercel

1. Install Vercel CLI (Optional)

npm install -g vercel

2. Project Structure

Your project should look like this:

ai-workspace-mcp/
├── api/
│   └── mcp.py          # Serverless function
├── vercel.json         # Vercel configuration
├── requirements.txt    # Python dependencies
└── README.md          # This file

3. Deploy to Vercel

Option A: Deploy via Vercel Dashboard

  1. Go to vercel.com
  2. Click "Add New" → "Project"
  3. Import your Git repository (or upload files)
  4. Vercel will auto-detect Python and deploy

Option B: Deploy via CLI

# Login to Vercel
vercel login

# Deploy
vercel

# Deploy to production
vercel --prod

4. Get Your Deployment URL

After deployment, Vercel will give you a URL like: https://your-project-name.vercel.app

API Endpoints

Once deployed, your server will have these endpoints:

GET /

Returns server information and status

curl https://your-project.vercel.app/

GET /health

Health check endpoint

curl https://your-project.vercel.app/health

GET /tools

List all available tools

curl https://your-project.vercel.app/tools

POST /execute

Execute a tool

curl -X POST https://your-project.vercel.app/execute \
  -H "Content-Type: application/json" \
  -d '{
    "tool": "create_file",
    "arguments": {
      "filepath": "hello.py",
      "content": "print(\"Hello World!\")"
    }
  }'

Using with AI Clients

Claude Desktop Configuration

Add this to your Claude Desktop config:

macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%/Claude/claude_desktop_config.json

{
  "mcpServers": {
    "ai-workspace": {
      "command": "curl",
      "args": [
        "-X", "POST",
        "https://your-project.vercel.app/execute",
        "-H", "Content-Type: application/json",
        "-d", "@-"
      ]
    }
  }
}

Using the API Directly

You can integrate this with any AI that supports HTTP tool calling:

import requests

# Create a file
response = requests.post(
    "https://your-project.vercel.app/execute",
    json={
        "tool": "create_file",
        "arguments": {
            "filepath": "script.py",
            "content": "print('Hello from AI!')"
        }
    }
)
print(response.json())

# Execute the file
response = requests.post(
    "https://your-project.vercel.app/execute",
    json={
        "tool": "execute_python",
        "arguments": {
            "filepath": "script.py"
        }
    }
)
print(response.json())

Security Features

  • Sandboxed Workspace: All file operations are restricted to /tmp/workspace
  • Path Validation: Prevents directory traversal attacks
  • Execution Timeout: Python scripts are limited to 30 seconds
  • CORS Enabled: Allows cross-origin requests
  • Serverless Isolation: Each request runs in an isolated environment
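Path validation of this kind is typically done by resolving the requested path and checking that it stays inside the workspace root; a minimal sketch (the function name is illustrative, not the server's actual code):

```python
import os

def resolve_safe_path(filepath, root="/tmp/workspace"):
    """Resolve a user-supplied path and reject anything escaping the workspace."""
    root = os.path.realpath(root)
    # realpath collapses ".." segments and symlinks, so a traversal
    # attempt like "../../etc/passwd" resolves to a path outside root
    full = os.path.realpath(os.path.join(root, filepath))
    if full != root and not full.startswith(root + os.sep):
        raise ValueError(f"Path escapes workspace: {filepath}")
    return full
```

Comparing resolved paths (rather than scanning the input string for `..`) is the robust way to do this, since it also catches traversal hidden behind symlinks.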

Tool Examples

Create and Execute a Python Script

# Create a file
curl -X POST https://your-project.vercel.app/execute \
  -H "Content-Type: application/json" \
  -d '{
    "tool": "create_file",
    "arguments": {
      "filepath": "hello.py",
      "content": "print(\"Hello from Vercel!\")"
    }
  }'

# Execute it
curl -X POST https://your-project.vercel.app/execute \
  -H "Content-Type: application/json" \
  -d '{
    "tool": "execute_python",
    "arguments": {
      "filepath": "hello.py"
    }
  }'

List Files

curl -X POST https://your-project.vercel.app/execute \
  -H "Content-Type: application/json" \
  -d '{
    "tool": "list_files",
    "arguments": {}
  }'

Create Directory Structure

curl -X POST https://your-project.vercel.app/execute \
  -H "Content-Type: application/json" \
  -d '{
    "tool": "create_directory",
    "arguments": {
      "dirpath": "scripts"
    }
  }'

Response Format

All tool executions return JSON:

Success Response:

{
  "success": true,
  "message": "Successfully created file: hello.py\nSize: 26 bytes"
}

Error Response:

{
  "success": false,
  "error": "File not found: nonexistent.py"
}

Execute Python Response:

{
  "success": true,
  "exit_code": 0,
  "stdout": "Hello from Vercel!\n",
  "stderr": ""
}
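Because every tool call returns the same `success`/`error` envelope, a small client-side helper (hypothetical, not part of the server) can centralize error handling:

```python
def unwrap(response_json):
    """Raise on failed tool calls; return the response payload otherwise."""
    if not response_json.get("success"):
        raise RuntimeError(response_json.get("error", "Unknown tool error"))
    return response_json
```

Calling `unwrap(response.json())` after each request turns error responses into exceptions instead of silently propagating `{"success": false}` dicts.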

Important Notes

Vercel Limitations

  • Temporary Storage: Files in /tmp are ephemeral and cleared between invocations
  • Execution Timeout: Vercel functions time out after 10 seconds on the free (Hobby) plan; paid plans allow longer limits (check Vercel's current documentation for exact figures)
  • Cold Starts: First request may be slower due to cold start
  • No Persistent State: Each function invocation starts fresh

For Persistent Storage

If you need persistent file storage, consider:

  1. Using Vercel KV, Postgres, or Blob storage
  2. Integrating with AWS S3, Google Cloud Storage, etc.
  3. Using a database to store file contents
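As an illustration of option 3, file contents can be mirrored into a database instead of `/tmp`. A toy sketch using the standard-library `sqlite3` (the schema and class name are assumptions; a real deployment would point this at a network-backed database, since an in-memory or local SQLite file is just as ephemeral on serverless):

```python
import sqlite3

class FileStore:
    """Toy persistent store mapping file paths to contents."""

    def __init__(self, db_path=":memory:"):
        self.conn = sqlite3.connect(db_path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS files (path TEXT PRIMARY KEY, content TEXT)"
        )

    def write(self, path, content):
        # Upsert so repeated writes to the same path overwrite the content
        self.conn.execute(
            "INSERT INTO files (path, content) VALUES (?, ?) "
            "ON CONFLICT(path) DO UPDATE SET content = excluded.content",
            (path, content),
        )
        self.conn.commit()

    def read(self, path):
        row = self.conn.execute(
            "SELECT content FROM files WHERE path = ?", (path,)
        ).fetchone()
        return row[0] if row else None
```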

Environment Variables (Optional)

You can set environment variables in Vercel Dashboard:

  • WORKSPACE_PATH - Custom workspace path (default: /tmp/workspace)
  • EXECUTION_TIMEOUT - Python execution timeout in seconds (default: 30)
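A sketch of how the server might pick these up with the documented defaults (the helper name is illustrative; the variable names come from the list above):

```python
import os

def load_config(env=os.environ):
    """Read optional settings, falling back to the documented defaults."""
    return {
        "workspace_path": env.get("WORKSPACE_PATH", "/tmp/workspace"),
        "execution_timeout": int(env.get("EXECUTION_TIMEOUT", "30")),
    }
```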

Local Development

Test locally before deploying:

# Install dependencies
pip install -r requirements.txt

# Run a local emulation of the deployment with the Vercel CLI.
# (Python's built-in http.server only serves static files and
# will not execute the serverless function in api/mcp.py.)
vercel dev

Then test with (vercel dev serves on port 3000 by default):

curl http://localhost:3000/health

Troubleshooting

"Module not found" errors

Ensure requirements.txt is in the project root and contains all dependencies.

Timeout errors

  • Reduce Python script complexity
  • Upgrade to Vercel Pro for longer timeouts
  • Use async operations where possible

File not persisting

Remember: /tmp storage is ephemeral on Vercel. Files won't persist between invocations.

Advanced Usage

Custom MCP Client

import requests

class VercelMCPClient:
    def __init__(self, base_url):
        self.base_url = base_url
    
    def call_tool(self, tool_name, arguments):
        response = requests.post(
            f"{self.base_url}/execute",
            json={"tool": tool_name, "arguments": arguments}
        )
        return response.json()
    
    def list_tools(self):
        response = requests.get(f"{self.base_url}/tools")
        return response.json()

# Usage
client = VercelMCPClient("https://your-project.vercel.app")
result = client.call_tool("create_file", {
    "filepath": "test.py",
    "content": "print('test')"
})

Contributing

Feel free to extend this server with additional tools:

  1. Add tool definition to get_tools()
  2. Implement handler in execute_tool()
  3. Update documentation
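The two-step pattern above (declare in `get_tools()`, dispatch in `execute_tool()`) can be reduced to a single registration step with a small registry. This decorator and the example tool are illustrative, not the server's actual code:

```python
TOOLS = {}

def tool(name, description):
    """Register a handler so it appears in get_tools() and execute_tool()."""
    def register(fn):
        TOOLS[name] = {"description": description, "handler": fn}
        return fn
    return register

@tool("reverse_text", "Reverse a string (example tool)")
def reverse_text(arguments):
    return {"success": True, "result": arguments["text"][::-1]}

def get_tools():
    return [{"name": n, "description": t["description"]} for n, t in TOOLS.items()]

def execute_tool(name, arguments):
    if name not in TOOLS:
        return {"success": False, "error": f"Unknown tool: {name}"}
    return TOOLS[name]["handler"](arguments)
```

With this shape, adding a tool is one decorated function, and the tool list and dispatcher can never drift out of sync.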

License

MIT License - modify and use as needed.
