MCP Server for PCILeech

English | 中文

A Model Context Protocol (MCP) server that provides a standardized interface to PCILeech for DMA-based memory operations. This server enables AI assistants like Claude to perform memory debugging through natural language commands.

Authors: EVAN & MOER
Support: Join our Discord

Features

  • Three MCP Tools:

    • memory_read: Read memory from any address
    • memory_write: Write data to memory
    • memory_format: Multi-view memory formatting (hex dump, ASCII, byte array, DWORD)
  • Low Latency: Direct subprocess calls to PCILeech binary

  • AI-Friendly: Natural language interface through MCP protocol

  • Simple Configuration: Minimal dependencies, easy setup

  • Multiple Formats: View memory in hex, ASCII, byte arrays, and DWORD arrays

Prerequisites

  • Windows 10/11 (x64)
  • Python 3.10+
  • PCILeech hardware properly configured and working
  • PCILeech binary (included in pcileech/ directory)

Quick Start

1. Clone Repository

git clone https://github.com/Evan7198/mcp_server_pcileech
cd mcp_server_pcileech

2. Install Dependencies

Create and activate virtual environment:

python -m venv .venv
.venv\Scripts\activate
pip install -r requirements.txt

3. Verify PCILeech

Test that PCILeech hardware is working:

cd pcileech
pcileech.exe probe

You should see hardware detection output.

4. Configure for Claude Code

Add this configuration to your Claude Code MCP settings:

"mcpServers": {
  "pcileech": {
    "command": "C:\\path\\to\\mcp_server_pcileech\\.venv\\Scripts\\python.exe",
    "args": [
      "C:\\path\\to\\mcp_server_pcileech\\main.py"
    ],
    "cwd": "C:\\path\\to\\mcp_server_pcileech",
    "env": {}
  }
}

Important: Replace C:\\path\\to\\mcp_server_pcileech with your actual project path.

5. Restart Claude Code

After adding the configuration, restart Claude Code to load the MCP server.

Configuration

The server uses config.json for configuration:

{
  "pcileech": {
    "executable_path": "pcileech\\pcileech.exe",
    "timeout_seconds": 30
  },
  "server": {
    "name": "mcp-server-pcileech",
    "version": "0.1.0"
  }
}

Adjust executable_path and timeout_seconds if needed for your setup.
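
The server presumably reads this file at startup. Below is a minimal sketch of that loading step, assuming only the fields shown above; the load_config helper and its defaults are illustrative, not the project's actual code:

import json
from pathlib import Path

def load_config(path: str = "config.json") -> dict:
    """Read config.json and fall back to conservative defaults (illustrative sketch)."""
    cfg = json.loads(Path(path).read_text(encoding="utf-8"))
    pcileech = cfg.get("pcileech", {})
    return {
        "executable_path": pcileech.get("executable_path", "pcileech\\pcileech.exe"),
        "timeout_seconds": int(pcileech.get("timeout_seconds", 30)),
    }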

Usage Examples

Once configured in Claude Code, you can use natural language commands:

Reading Memory

Read 256 bytes from address 0x1000

Writing Memory

Write the hex data 48656c6c6f to address 0x2000

Formatted Memory View

Show me a formatted view of 64 bytes at address 0x1000

This will display the following views; a formatting sketch appears after the list:

  • Hex dump with ASCII sidebar
  • Pure ASCII view
  • Byte array (decimal)
  • DWORD array (little-endian)
  • Raw hex string
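
For reference, here is a minimal sketch of how such views can be built from raw bytes. It is illustrative only; the format_views name and layout details are assumptions, not the project's actual formatter:

import struct

def format_views(data: bytes, base: int = 0x1000) -> dict:
    """Build hexdump, ASCII, byte-array, DWORD, and raw views from raw bytes (sketch)."""
    hex_lines = []
    for off in range(0, len(data), 16):
        chunk = data[off:off + 16]
        hex_part = " ".join(f"{b:02x}" for b in chunk)
        ascii_part = "".join(chr(b) if 32 <= b < 127 else "." for b in chunk)
        hex_lines.append(f"{base + off:016x}  {hex_part:<47}  {ascii_part}")
    n_dwords = len(data) // 4
    return {
        "hexdump": "\n".join(hex_lines),
        "ascii": "".join(chr(b) if 32 <= b < 127 else "." for b in data),
        "bytes": list(data),
        # "<" selects little-endian byte order, "I" an unsigned 32-bit DWORD
        "dwords": [f"0x{v:08x}" for v in struct.unpack(f"<{n_dwords}I", data[:n_dwords * 4])],
        "raw": data.hex(),
    }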

MCP Tools Reference

memory_read

Read raw memory from specified address.

Parameters:

  • address (string): Memory address in hex format (e.g., "0x1000" or "1000")
  • length (integer): Number of bytes to read (1-1048576, max 1MB)

Returns: Hex string of memory data with metadata
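
A minimal sketch of the input validation implied by these limits; the helper names are illustrative, not the server's actual code:

MAX_READ_BYTES = 1024 * 1024  # memory_read cap: 1 MB

def parse_address(address: str) -> int:
    """Accept '0x1000' or '1000' and return an integer address."""
    return int(address, 16)

def check_length(length: int, limit: int = MAX_READ_BYTES) -> int:
    """Reject lengths outside the documented 1..limit range."""
    if not 1 <= length <= limit:
        raise ValueError(f"length must be between 1 and {limit} bytes")
    return length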

memory_write

Write data to memory at specified address.

Parameters:

  • address (string): Memory address in hex format
  • data (string): Hex string of data to write (e.g., "48656c6c6f")

Returns: Success status with confirmation
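
A matching sketch for the write path, converting the hex string into raw bytes and rejecting malformed input (illustrative only):

def parse_write_data(data: str) -> bytes:
    """Turn a hex string such as '48656c6c6f' into raw bytes."""
    cleaned = data.removeprefix("0x")
    if len(cleaned) % 2 != 0:
        raise ValueError("hex data must have an even number of digits")
    return bytes.fromhex(cleaned)  # raises ValueError on non-hex characters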

memory_format

Read memory and format in multiple views for AI analysis.

Parameters:

  • address (string): Memory address in hex format
  • length (integer): Number of bytes to read (1-4096, max 4KB)
  • formats (array, optional): Format types to include - ["hexdump", "ascii", "bytes", "dwords", "raw"]

Returns: Multi-format memory view
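
Outside Claude Code, the same three tools can be exercised programmatically over stdio with the MCP Python SDK. The sketch below assumes the SDK's standard stdio client and reuses the placeholder paths from the configuration step:

import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Placeholder paths, as in the Claude Code configuration above.
server = StdioServerParameters(
    command="C:\\path\\to\\mcp_server_pcileech\\.venv\\Scripts\\python.exe",
    args=["C:\\path\\to\\mcp_server_pcileech\\main.py"],
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "memory_format",
                {"address": "0x1000", "length": 64, "formats": ["hexdump", "dwords"]},
            )
            print(result.content)

asyncio.run(main())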

Architecture

Two-Layer Design

  1. MCP Server Layer (main.py)

    • Handles MCP protocol communication via stdio transport
    • Defines tool schemas and parameter validation
    • Formats output for AI analysis
    • Async tool handlers: handle_memory_read, handle_memory_write, handle_memory_format (a registration sketch follows this list)
  2. PCILeech Wrapper Layer (pcileech_wrapper.py)

    • Manages PCILeech executable subprocess calls
    • Handles address alignment and chunked reads (256-byte blocks, 16-byte alignment)
    • Parses PCILeech output format
    • Timeout and error handling
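
For orientation, here is a minimal sketch of what the server layer's tool registration can look like using the MCP Python SDK's low-level stdio server. The schema is abbreviated to one tool and the handler body is a placeholder; it is not the project's actual main.py:

import asyncio
import mcp.types as types
from mcp.server import Server
from mcp.server.stdio import stdio_server

app = Server("mcp-server-pcileech")

@app.list_tools()
async def list_tools() -> list[types.Tool]:
    # Abbreviated: memory_write and memory_format would be declared the same way.
    return [
        types.Tool(
            name="memory_read",
            description="Read raw memory from a hex address",
            inputSchema={
                "type": "object",
                "properties": {"address": {"type": "string"}, "length": {"type": "integer"}},
                "required": ["address", "length"],
            },
        )
    ]

@app.call_tool()
async def call_tool(name: str, arguments: dict) -> list[types.TextContent]:
    if name == "memory_read":
        # Placeholder: the real handler delegates to the PCILeech wrapper layer.
        return [types.TextContent(type="text", text=f"would read {arguments['length']} bytes at {arguments['address']}")]
    raise ValueError(f"unknown tool: {name}")

async def main() -> None:
    async with stdio_server() as (read, write):
        await app.run(read, write, app.create_initialization_options())

if __name__ == "__main__":
    asyncio.run(main())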

Key Implementation Details

Memory Read Alignment:

  • PCILeech display command always returns 256 bytes aligned to 16-byte boundaries
  • read_memory() automatically handles:
    • Calculating aligned addresses
    • Chunked reading of 256-byte blocks
    • Extracting and concatenating requested byte ranges
    • Supporting arbitrary addresses and lengths (see the sketch below)
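
A minimal sketch of this aligned, chunked read, assuming one blocking pcileech.exe invocation per 256-byte block. The display -min invocation, the output parser, and the function names here are assumptions rather than the project's actual wrapper code:

import subprocess

BLOCK = 256   # bytes returned per PCILeech display call
ALIGN = 16    # display output is aligned to 16-byte boundaries

def read_memory(exe: str, address: int, length: int, timeout: int = 30) -> bytes:
    """Read an arbitrary [address, address + length) range via aligned 256-byte blocks (sketch)."""
    aligned = address - (address % ALIGN)
    buf = bytearray()
    cursor = aligned
    while cursor < address + length:
        # Assumed invocation: 'display -min <addr>' prints a 256-byte hex dump.
        proc = subprocess.run(
            [exe, "display", "-min", hex(cursor)],
            capture_output=True, text=True, timeout=timeout, check=True,
        )
        buf += parse_display_output(proc.stdout)
        cursor += BLOCK
    start = address - aligned
    return bytes(buf[start:start + length])

def parse_display_output(text: str) -> bytes:
    """Hypothetical parser: assumes each data line is '<addr>  <16 hex bytes>  <ascii>'."""
    data = bytearray()
    for line in text.splitlines():
        tokens = line.split()[1:17]
        if len(tokens) == 16 and all(len(t) == 2 for t in tokens):
            try:
                data += bytes.fromhex("".join(tokens))
            except ValueError:
                continue  # skip banner or non-data lines
    return bytes(data)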

Troubleshooting

PCILeech Not Found

Error: PCILeech executable not found

Solution: Verify that the path in config.json points to the correct location of pcileech.exe.

Hardware Not Connected

Warning: PCILeech connection verification failed

Solution:

  • Ensure PCILeech hardware is properly connected
  • Test with pcileech.exe probe directly
  • Check hardware drivers are installed

Memory Read/Write Fails

Error: Memory read/write failed

Possible causes:

  • Invalid memory address
  • Hardware access denied
  • Target system not accessible
  • Insufficient permissions

Solution: Verify the target address is valid and accessible by testing with PCILeech CLI first.

Timeout Errors

Error: PCILeech command timed out

Solution: Increase timeout_seconds in config.json if operations are legitimately slow.

Project Structure

mcp_server_pcileech/
├── main.py                 # MCP server entry point
├── pcileech_wrapper.py     # PCILeech integration layer
├── config.json             # Configuration file
├── requirements.txt        # Python dependencies
├── pyproject.toml          # Project metadata
├── README.md               # This file (English)
├── README_CN.md            # Chinese version
├── CLAUDE.md               # Claude Code guidance
├── docs/
│   └── brief.md            # Project brief
└── pcileech/               # PCILeech binary and dependencies
    └── pcileech.exe

Development

Code Formatting

black main.py pcileech_wrapper.py

Type Checking

mypy main.py pcileech_wrapper.py

Running Tests

pytest

Performance

  • MCP Server overhead: < 100ms per operation
  • PCILeech native performance: Maintained (no additional overhead)
  • End-to-end latency: < 5 seconds (including AI processing)

Limitations

  • Windows only: this server drives the Windows pcileech.exe binary
  • Hardware dependent: Requires PCILeech hardware connection
  • Read size limits:
    • memory_read: Maximum 1MB
    • memory_format: Maximum 4KB (for readable output)
  • Synchronous PCILeech calls: Wrapper uses subprocess.run (blocking), called in async context
  • No concurrent memory operations: Each PCILeech command executes sequentially

Security & Legal

IMPORTANT DISCLAIMER

This tool is designed for:

  • Authorized hardware debugging
  • Security research with proper authorization
  • Educational purposes
  • Personal hardware development

DO NOT use for:

  • Unauthorized access to systems
  • Malicious activities
  • Circumventing security measures without permission

Users are responsible for ensuring their use complies with all applicable laws and regulations.

License

This project wraps PCILeech, which has its own license. See pcileech/LICENSE.txt for PCILeech licensing.

Version

v0.1.0 - Initial MVP Release

Changelog

v0.1.0 (2025-12-10)

  • Initial release
  • Three MCP tools: memory_read, memory_write, memory_format
  • PCILeech subprocess integration
  • Basic error handling and timeout support
  • Claude Code integration support
  • Multi-format memory visualization
