AI Assistant Chat with Nmap Tool Integration
An example MCP server with a couple nmap scans as tools.
jarrodcoulter
This project provides a web-based chat interface using Gradio where users can interact with an AI assistant powered by the OpenAI API. The assistant is equipped with tools to interact with the local filesystem and perform network scans using a containerized Nmap server.
Overview
The application uses the OpenAI Agents SDK framework. User requests are processed by an AI agent that can reason about the request and decide whether to use available tools. It features:
- A Gradio frontend for easy interaction.
- An AI agent backend leveraging an OpenAI model (requires API key).
- A Model Context Protocol (MCP) server for filesystem access (using @modelcontextprotocol/server-filesystem).
- A containerized MCP server providing Nmap scanning capabilities (ping, port scans, service discovery, SMB share enumeration).
The Nmap server runs inside a Docker container for easy dependency management and isolation.
Features
- Conversational AI assistant.
- Filesystem access tool (scoped to the application directory).
- Network scanning tools via Nmap:
  - ping_host
  - scan_network (top 100 ports)
  - all_scan_network (-A comprehensive scan)
  - all_ports_scan_network (all 65535 ports)
  - smb_share_enum_scan (SMB share enumeration)
- Web-based UI using Gradio.
- Containerized Nmap tool server using Docker.
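Each scan tool corresponds to a different Nmap flag set. As a rough illustration only (the exact flags in nmap-server.py may differ), a hypothetical helper that assembles each tool's argument list:

```python
# Hypothetical sketch of the flag sets the Nmap tools might use;
# the actual nmap-server.py implementation may differ.
SCAN_FLAGS = {
    "ping_host": ["-sn"],                     # host discovery only, no port scan
    "scan_network": ["--top-ports", "100"],   # top 100 ports
    "all_scan_network": ["-A"],               # OS/version detection, scripts, traceroute
    "all_ports_scan_network": ["-p-"],        # all 65535 TCP ports
    "smb_share_enum_scan": ["-p", "445", "--script", "smb-enum-shares"],
}

def build_nmap_command(tool: str, target: str) -> list[str]:
    """Assemble the argv a tool handler might pass to subprocess.run."""
    if tool not in SCAN_FLAGS:
        raise ValueError(f"unknown tool: {tool}")
    return ["nmap", *SCAN_FLAGS[tool], target]
```

For example, `build_nmap_command("scan_network", "192.168.1.0/24")` yields `["nmap", "--top-ports", "100", "192.168.1.0/24"]`.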
Architecture
- Gradio UI (app.py): Handles user input and displays conversation history.
- Main Application (app.py):
  - Initializes the Gradio interface.
  - Manages conversation state.
  - Sets up and manages MCP servers.
  - Instantiates and runs the OpenAI Agent.
- OpenAI Agent (agents library): Processes user messages, calls tools when needed, and generates responses.
- MCP Servers:
  - Filesystem Server: Runs via npx to provide local file access.
  - Nmap Toolkit Server (nmap-server.py, in Docker): Runs inside a Docker container, exposing Nmap scan functions as tools via MCP. app.py uses docker run to start this server for each request.
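The per-request container launch can be pictured as follows. This is a sketch, not the repository's actual invocation: it assumes the image name from the build step below and the usual flags for a stdio-transport MCP server.

```python
import shlex

def docker_run_command(image: str = "nmap-mcp-server") -> list[str]:
    # Hypothetical helper - app.py's exact invocation may differ.
    # "-i" keeps stdin open so the MCP stdio transport works;
    # "--rm" discards the container when the session ends.
    return ["docker", "run", "-i", "--rm", image]

# The equivalent shell line, for logging or debugging:
print(shlex.join(docker_run_command()))  # docker run -i --rm nmap-mcp-server
```

Starting a fresh container per request keeps scans isolated from each other at the cost of a small container-startup delay.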
Prerequisites
- Python: 3.9+
- Docker: Latest version installed and running.
- Node.js/npm: Required for npx to run the filesystem MCP server.
- OpenAI API Key: Set as the environment variable OPENAI_API_KEY.
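Since a missing API key is the most common startup failure, app.py could fail fast with a clear message. A small sketch of that kind of check (a hypothetical helper, not code from the repo):

```python
import os

def require_api_key(var: str = "OPENAI_API_KEY") -> str:
    """Raise a clear error at startup if the key is missing."""
    key = os.environ.get(var, "").strip()
    if not key:
        raise RuntimeError(
            f"{var} is not set. Export it before running app.py, e.g. "
            f"export {var}='your_api_key_here'"
        )
    return key
```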
Installation & Setup
1. Clone the repository:
   git clone <your-repository-url>
   cd <your-repository-directory>
2. Set OpenAI API Key: Export your API key as an environment variable, replacing your_api_key_here with your actual key.
   - Linux/macOS:
     export OPENAI_API_KEY='your_api_key_here'
   - Windows (Command Prompt):
     set OPENAI_API_KEY=your_api_key_here
   - Windows (PowerShell):
     $env:OPENAI_API_KEY='your_api_key_here'
3. Build the Nmap Docker Image: Navigate to the directory containing nmap-server.py and Dockerfile, then run:
   docker build -t nmap-mcp-server .
   (Ensure the Dockerfile content is correct, especially the MCP package name if it's not modelcontextprotocol.)
4. Install Python Dependencies: It's recommended to use a virtual environment.
   python -m venv venv
   # Activate the virtual environment
   # Linux/macOS:
   source venv/bin/activate
   # Windows:
   .\venv\Scripts\activate
   # Install requirements
   pip install -r requirements.txt
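Step 3 assumes a Dockerfile alongside nmap-server.py. The repository's file is not reproduced here; a minimal sketch of what it might look like (the pip package name, mcp, is an assumption — as the note in step 3 says, adjust it to whatever nmap-server.py actually imports):

```dockerfile
# Hypothetical sketch - the real Dockerfile may differ.
FROM python:3.11-slim
# Nmap itself must be present inside the container.
RUN apt-get update \
    && apt-get install -y --no-install-recommends nmap \
    && rm -rf /var/lib/apt/lists/*
WORKDIR /app
COPY nmap-server.py .
# Assumed package name; see the note in step 3.
RUN pip install --no-cache-dir mcp
CMD ["python", "nmap-server.py"]
```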
Running the Application
Ensure your OpenAI API key is set, Docker is running, and you are in the project's root directory with the virtual environment activated.
python app.py