
BinAssistMCP
Comprehensive Model Context Protocol (MCP) server for Binary Ninja with AI-powered reverse engineering capabilities
Summary
BinAssistMCP is a powerful bridge between Binary Ninja and Large Language Models (LLMs) like Claude, providing comprehensive reverse engineering tools through the Model Context Protocol (MCP). It enables AI-assisted binary analysis by exposing Binary Ninja's advanced capabilities through both Server-Sent Events (SSE) and STDIO transports.
Key Features
- Dual Transport Support: Both SSE (web-based) and STDIO (command-line) transports
- 40+ Analysis Tools: Complete Binary Ninja API wrapper with advanced functionality
- Multi-Binary Sessions: Concurrent analysis of multiple binaries with intelligent context management
- Smart Symbol Management: Advanced function searching, renaming, and type management
- Auto-Integration: Seamless Binary Ninja plugin with automatic startup capabilities
- Flexible Configuration: Comprehensive settings management through Binary Ninja's interface
- AI-Ready: Optimized for LLM integration with structured tool responses
Use Cases
- AI-Assisted Reverse Engineering: Leverage LLMs for intelligent code analysis and documentation
- Automated Binary Analysis: Script complex analysis workflows with natural language
- Research and Education: Teach reverse engineering concepts with AI guidance
- Security Analysis: Accelerate vulnerability research and malware analysis
- Code Understanding: Generate comprehensive documentation and explanations
Tool Details
BinAssistMCP provides over 40 specialized tools organized into functional categories:
Binary Management
- list_binaries: List all loaded binary files
- get_binary_status: Check analysis status and metadata
- update_analysis_and_wait: Force analysis update and wait for completion
Code Analysis & Decompilation
- decompile_function: Generate high-level decompiled code
- get_function_pseudo_c: Extract pseudo-C representation
- get_function_high_level_il: Access High-Level Intermediate Language
- get_function_medium_level_il: Access Medium-Level Intermediate Language
- get_disassembly: Retrieve assembly code with annotations
Information Retrieval
- get_functions: List all functions with metadata
- search_functions_by_name: Find functions by name patterns
- get_functions_advanced: Advanced filtering (size, complexity, parameters)
- search_functions_advanced: Multi-target searching (name, comments, calls, variables)
- get_function_statistics: Comprehensive binary statistics
- get_imports: Import table analysis grouped by module
- get_exports: Export table with symbol information
- get_strings: String extraction with context
- get_segments: Memory layout analysis
- get_sections: Binary section information
Symbol & Naming Management
- rename_symbol: Rename functions and data variables
- get_cross_references: Find all references to/from symbols
- analyze_function: Comprehensive function analysis
- get_call_graph: Call relationship mapping
Documentation & Comments
- set_comment: Add comments to specific addresses
- get_comment: Retrieve comments at addresses
- get_all_comments: Export all comments with context
- remove_comment: Delete existing comments
- set_function_comment: Add function-level documentation
Variable Management
- create_variable: Define local variables in functions
- get_variables: List function parameters and locals
- rename_variable: Rename variables for clarity
- set_variable_type: Update variable type information
Type System Management
- create_type: Define custom types and structures
- get_types: List all user-defined types
- create_enum: Create enumeration types
- create_typedef: Create type aliases
- get_type_info: Detailed type information
- get_classes: List classes and structures
- create_class: Define new classes/structures
- add_class_member: Add members to existing types
Data Analysis
- create_data_var: Define data variables at addresses
- get_data_vars: List all defined data variables
- get_data_at_address: Analyze raw data with type inference
Navigation & Context
- get_current_address: Get current cursor position
- get_current_function: Identify function at current address
- get_namespaces: Namespace and symbol organization
Advanced Analysis
- get_triage_summary: Complete binary overview
- get_function_statistics: Statistical analysis of all functions
Each tool is designed for seamless integration with AI workflows, providing structured responses that LLMs can easily interpret and act upon.
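As a hypothetical illustration of what "structured responses" means in practice, a get_functions-style result might come back as JSON like the following, which a client (or an LLM) can filter programmatically. The field names here are illustrative, not BinAssistMCP's actual schema:

```python
import json

# Hypothetical example of a structured tool response; the real
# BinAssistMCP schema may differ (field names are illustrative).
response_text = """
{
  "functions": [
    {"name": "main",        "address": "0x401000", "size": 215},
    {"name": "parse_input", "address": "0x4010e0", "size": 642}
  ]
}
"""

result = json.loads(response_text)

# Structured fields let a client act on results directly, e.g.
# pick the largest function as a starting point for analysis.
largest = max(result["functions"], key=lambda f: f["size"])
print(largest["name"])  # parse_input
```

Because the payload is plain JSON rather than free-form text, an LLM can chain one tool's output (here, an address) into the next tool's arguments without fragile string parsing.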
Installation
Prerequisites
- Binary Ninja: Version 4000 or higher
- Python: 3.8+ (typically bundled with Binary Ninja)
- Platform: Windows, macOS, or Linux
Option 1: Binary Ninja Plugin Manager (Recommended)
- Open Binary Ninja
- Navigate to Tools → Manage Plugins
- Search for "BinAssistMCP"
- Click Install
- Restart Binary Ninja
Option 2: Manual Installation
Step 1: Download and Extract
git clone https://github.com/jtang613/BinAssistMCP.git
cd BinAssistMCP
Step 2: Install Dependencies
# Install Python dependencies
pip install -r requirements.txt
# Or install individually:
pip install anyio>=4.0.0 hypercorn>=0.16.0 mcp>=1.0.0 trio>=0.27.0 pydantic>=2.0.0 pydantic-settings>=2.0.0 click>=8.0.0
Step 3: Copy to Plugin Directory
Windows:
copy BinAssistMCP "%APPDATA%\Binary Ninja\plugins\"
macOS:
cp -r BinAssistMCP ~/Library/Application\ Support/Binary\ Ninja/plugins/
Linux:
cp -r BinAssistMCP ~/.binaryninja/plugins/
Step 4: Verify Installation
- Restart Binary Ninja
- Open any binary file
- Check Tools menu for "BinAssistMCP" submenu
- Look for startup messages in the log panel
Configuration
Basic Setup
- Open Binary Ninja Settings (Edit → Preferences)
- Navigate to the binassistmcp section
- Configure server settings:
  - Host: localhost (default)
  - Port: 9090 (default)
  - Transport: both (SSE + STDIO)
Advanced Configuration
# Environment variables (optional)
export BINASSISTMCP_SERVER__HOST=localhost
export BINASSISTMCP_SERVER__PORT=9090
export BINASSISTMCP_SERVER__TRANSPORT=both
export BINASSISTMCP_BINARY__MAX_BINARIES=10
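The double underscore in these names is a nested-settings delimiter (a convention supported by pydantic-settings, which is among the dependencies): BINASSISTMCP_SERVER__HOST maps to the host field of the server settings group. A minimal stdlib-only sketch of that mapping, for illustration:

```python
def parse_env(environ, prefix="BINASSISTMCP_"):
    """Fold BINASSISTMCP_SECTION__KEY variables into nested dicts.

    Illustrative only: the plugin's actual settings loader is
    pydantic-settings, which implements this same convention.
    """
    config = {}
    for name, value in environ.items():
        if not name.startswith(prefix):
            continue
        # Split "SERVER__HOST" into section "server" and key "host".
        section, _, key = name[len(prefix):].partition("__")
        config.setdefault(section.lower(), {})[key.lower()] = value
    return config

env = {
    "BINASSISTMCP_SERVER__HOST": "localhost",
    "BINASSISTMCP_SERVER__PORT": "9090",
    "BINASSISTMCP_BINARY__MAX_BINARIES": "10",
}
print(parse_env(env))
# {'server': {'host': 'localhost', 'port': '9090'}, 'binary': {'max_binaries': '10'}}
```

Variables without the BINASSISTMCP_ prefix are ignored, so these settings coexist safely with the rest of the environment.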
Usage
Starting the Server
Via Binary Ninja Menu:
- Tools → BinAssistMCP → Start Server
- Check log panel for startup confirmation
- Note the server URL (e.g., http://localhost:9090)
Auto-Startup (Default):
- Server starts automatically when Binary Ninja loads a file
- Configurable via the binassistmcp.plugin.auto_startup setting
Connecting with Claude Desktop
Add to your Claude Desktop MCP configuration:
{
  "mcpServers": {
    "binassist": {
      "command": "python",
      "args": ["/path/to/BinAssistMCP"],
      "env": {
        "BINASSISTMCP_SERVER__TRANSPORT": "stdio"
      }
    }
  }
}
Using with SSE Transport
Connect web-based MCP clients to http://localhost:9090/sse
Integration Examples
Basic Function Analysis
Ask Claude: "Analyze the main function in the loaded binary and explain what it does"
Claude will use tools like:
- get_functions() to find main
- decompile_function() to get readable code
- get_function_pseudo_c() for C representation
- analyze_function() for comprehensive analysis
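In script form, that tool sequence looks roughly like the following. Here call_tool is a hypothetical stand-in for an MCP client session dispatching requests to the server, with canned data in place of a live binary:

```python
# Hypothetical sketch of the analysis workflow; call_tool stands in
# for a real MCP client session, and the returned data is canned.
CANNED = {
    "get_functions": [{"name": "main", "address": "0x401000"}],
    "decompile_function": "int main(int argc, char **argv) { ... }",
}

def call_tool(name, **kwargs):
    # A real client would send an MCP tools/call request here.
    return CANNED[name]

# 1. Find main among the listed functions.
functions = call_tool("get_functions")
main_fn = next(f for f in functions if f["name"] == "main")

# 2. Decompile it so the LLM can read and explain the code.
code = call_tool("decompile_function", address=main_fn["address"])
print(code)
```

The LLM performs exactly this chaining on its own: each structured result feeds the arguments of the next call until it has enough context to answer.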
Vulnerability Research
Ask Claude: "Find all functions that handle user input and check for buffer overflows"
Claude will use:
- search_functions_advanced() to find input handlers
- get_cross_references() to trace data flow
- get_variables() to analyze buffer usage
- set_comment() to document findings
Troubleshooting
Common Issues
Server won't start:
- Check Binary Ninja log panel for error messages
- Verify all dependencies are installed
- Ensure port 9090 is not in use
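To check the port from a script, one option is a quick socket probe (a best-effort heuristic, not part of BinAssistMCP):

```python
import socket

def port_in_use(port, host="localhost"):
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        # connect_ex returns 0 when the connection succeeds,
        # i.e. when a server is already bound to that port.
        return s.connect_ex((host, port)) == 0

if port_in_use(9090):
    print("Port 9090 is taken; pick another port in the settings.")
else:
    print("Port 9090 looks free.")
```

If the default port is occupied, change binassistmcp's port in the Binary Ninja settings (or via BINASSISTMCP_SERVER__PORT) and restart the server.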
Binary Ninja crashes:
- Check Python environment compatibility
- Try reducing the max_binaries setting
- Restart with a single binary file
Tools return errors:
- Ensure binary analysis is complete
- Check if Binary Ninja file is still open
- Verify function/address exists
Support
- Issues: Report bugs on GitHub Issues
- Binary Ninja: Check official Binary Ninja documentation
Contributing
- Fork the repository
- Create a feature branch
- Make your changes following the existing code style
- Test with multiple binary types
- Submit a pull request
License
This project is licensed under the MIT License - see the LICENSE file for details.