H3 CLI MCP Server
An MCP server that allows AI assistants and LLMs to interact with the Horizon3.ai API for scheduling pentests, querying results, and automating security workflows through natural language commands.
Tools
fetch_graphql_docs
Fetch GraphQL documentation for a given API element within the H3 GraphQL schema. This tool provides documentation about GraphQL types, queries, mutations, fields, and enums. Use it to explore the H3 GraphQL API and understand the available queries and their parameters.

Args:
- id (str): The API element ID to fetch documentation for. This can be:
  - A type name (e.g., "Query", "Mutation", "Weakness")
  - A field path (e.g., "Query.pentests_page", "Mutation.run_pentest")
  - An enum type (e.g., "AuthzRole", "PortalOpState")
  - An enum value (e.g., "AuthzRole.ORG_ADMIN", "PortalOpState.running")

Returns: Dict with command output and status. The output field contains the documentation from the GraphQL server. The GraphQL type of the result is GQLAPIDoc.

Examples:
- Explore all available queries: fetch_graphql_docs("Query")
- Get details about a specific query: fetch_graphql_docs("Query.pentests_page")
- Learn about a specific type: fetch_graphql_docs("Weakness")
- Explore available enum values: fetch_graphql_docs("PortalOpState")

Tips:
1. Start with "Query" or "Mutation" to discover available operations.
2. When you find a query of interest, get its detailed docs using "Query.<query_name>".
3. For any type mentioned in responses, get its details using the type name directly.
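The id argument follows a simple dotted convention. As a minimal sketch (a hypothetical helper, not part of the server), this is how a client might split such an id into its type and member parts:

```python
def parse_element_id(element_id: str):
    """Split an API element id like "Query.pentests_page" into its parts.

    Returns (type_name, member); member is None for bare type names
    such as "Weakness" or "PortalOpState".
    """
    if "." in element_id:
        type_name, member = element_id.split(".", 1)
        return type_name, member
    return element_id, None
```

For example, parse_element_id("AuthzRole.ORG_ADMIN") yields ("AuthzRole", "ORG_ADMIN"), matching the enum-value form above.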
run_graphql_request
Run a GraphQL request with the given query and variables.

Args:
- graphql_query (str): The GraphQL query to execute. This should be a valid GraphQL query string.
- variables (str, optional): A JSON string containing variables for the GraphQL query. If provided, this must be a valid JSON string, e.g. '{"pageInput": {"page_num": 1, "page_size": 5}, "op_id": "abc123"}'

Example (a query with variables):

    query weaknesses_page($pageInput: PageInput, $op_id: String!) {
      weaknesses_page(pageInput: $pageInput, op_id: $op_id) {
        weaknesses { id title severity }
      }
    }

Pass variables as: '{"pageInput": {"page_num": 1, "page_size": 10}, "op_id": "abc123"}'

Returns: Dict with output and status. The output field contains the GraphQL response.

Notes:
- If variables cannot be passed as a separate parameter due to MCP limitations, you can embed them directly in your query using variable definitions.
- If the variables parameter is not a valid JSON string, a clear error message will be returned.
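Because variables travels as a JSON string, the most common failure mode is malformed JSON. A minimal sketch of the kind of validation the notes describe (a hypothetical helper based on the documented behavior, not the server's actual code):

```python
import json

def parse_variables(variables=None):
    """Validate the optional `variables` string for run_graphql_request.

    Returns (parsed, error): parsed is a dict on success; error is a
    human-readable message when the input is not a JSON object.
    """
    if variables is None:
        return {}, None
    try:
        parsed = json.loads(variables)
    except json.JSONDecodeError as exc:
        return None, f"variables is not valid JSON: {exc}"
    if not isinstance(parsed, dict):
        return None, 'variables must be a JSON object, e.g. \'{"op_id": "abc123"}\''
    return parsed, None
```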
run_h3_command
Execute an H3 CLI command with optional arguments. This tool allows direct execution of any h3-cli command, providing flexible access to all H3 API capabilities from the command-line interface.

Args:
- command (str): The H3 command to execute, without the 'h3' prefix. Common commands include 'whoami', 'pentests', 'pentest', 'hello-world', and 'help'.
- args (List[str], optional): A list of string arguments for the command. These will be passed directly to the command.

Returns: Dict with command output and status. The output field contains the command's response, either as parsed JSON or raw text.

Examples:
- Check the current user identity: run_h3_command("whoami")
- View a specific pentest by ID: run_h3_command("pentest", ["abc123"])
- List all pentests with pagination: run_h3_command("pentests", ["--page-size", "10", "--page", "1"])
- Get help for a specific command: run_h3_command("help", ["pentest"])
- Run a new pentest using a template: run_h3_command("run-pentest", ["my-template-name"])

Notes:
- To see all available commands, use run_h3_command("help").
- For command-specific help, use run_h3_command("help", ["command_name"]).
- Command execution is synchronous and will block until completion.
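The command/args split maps directly onto a subprocess invocation. A rough sketch of what the tool plausibly does under the hood (an assumed implementation, not the server's actual code; the cli parameter is added here purely so the sketch can be exercised with any binary):

```python
import json
import subprocess

def run_h3_command(command, args=None, cli="h3"):
    """Run `h3 <command> [args...]` and return a dict with status and output."""
    proc = subprocess.run([cli, command, *(args or [])],
                          capture_output=True, text=True)
    try:
        # h3-cli typically emits JSON; fall back to raw text otherwise
        output = json.loads(proc.stdout)
    except json.JSONDecodeError:
        output = proc.stdout
    return {"status": "ok" if proc.returncode == 0 else "error",
            "output": output}
```

For example, run_h3_command("pentests", ["--page-size", "10", "--page", "1"]) would invoke h3 pentests --page-size 10 --page 1.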
health_check
Check the health of the MCP server and the h3-cli installation.

This tool verifies that:
1. The h3-cli tool is properly installed and in the system PATH.
2. The H3 API connection is working (by running the 'hello-world' test).

Use this tool to diagnose connectivity issues or to confirm proper setup before running other operations.

Args: None

Returns: Dict containing:
- status: "ok" if everything is working, "error" if there's a problem
- details: A human-readable message describing the status
- output: Raw output from the h3 hello-world command (if available)

Example of a successful response to health_check():

    {
      "status": "ok",
      "details": "h3-cli is installed and API is reachable.",
      "output": "{ \"data\": { \"hello\": \"world!\" } }"
    }

Notes:
- If the h3-cli tool is not installed, the status will be "error".
- If the API key is invalid or there are connection issues, the status will be "error".
- This tool is useful for troubleshooting MCP server configuration problems.
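The two checks above can be sketched in a few lines (illustrative code assuming the documented behavior, not the server's actual implementation; the cli parameter exists only to make the sketch testable with any binary):

```python
import shutil
import subprocess

def health_check(cli="h3"):
    """Verify the CLI is on PATH, then probe the API with 'hello-world'."""
    if shutil.which(cli) is None:
        return {"status": "error", "details": f"{cli} not found in PATH"}
    proc = subprocess.run([cli, "hello-world"], capture_output=True, text=True)
    if proc.returncode != 0:
        return {"status": "error",
                "details": proc.stderr.strip() or "hello-world failed"}
    return {"status": "ok",
            "details": f"{cli} is installed and API is reachable.",
            "output": proc.stdout.strip()}
```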
What is this?
This MCP server exposes the full power of h3-cli to your AI coding assistant (Claude, Cursor, VS Code, etc.). It enables:
- Scheduling and running pentests
- Querying pentest results, weaknesses, impacts, hosts, credentials, and more
- Automating security workflows and reporting
- All via natural language and LLM tools
Note: You must have a working h3-cli installed and authenticated on your system. This server is a thin wrapper and does not manage your API keys or CLI installation.
Quick Copy-Paste: Add to Your MCP Client
Add this to your MCP client configuration (e.g., Cursor, Claude Desktop, Windsurf, etc.):
{
  "mcpServers": {
    "h3": {
      "command": "uvx",
      "args": ["horizon3ai/h3-cli-mcp"]
    }
  }
}
- No need to clone or build this repo manually; uvx will fetch and run the latest version automatically.
- For advanced usage, see below.
Features
- Full h3-cli API access: Everything you can do with the CLI, you can do via LLM tools.
- GraphQL documentation: Fetch up-to-date docs for all available queries and mutations.
- Parameter validation: Clear error messages and examples for all tool inputs.
- Prompt templates: Built-in guidance for pagination, pivots, and common workflows.
- Works with any MCP-compatible client: Claude, Cursor, Windsurf, VS Code, and more.
Tools Provided
| Tool Name | Description |
|---|---|
| run_h3_command | Run any h3-cli command and return the output. |
| fetch_graphql_docs | Fetch GraphQL schema/docs for any query, mutation, or type. |
| run_graphql_request | Run a raw GraphQL query with variables and get the result. |
| health_check | Check h3-cli installation and API connectivity. |
See your client’s tool discovery UI for full parameter details and examples.
Usage with VS Code, Cursor, Claude Desktop, etc.
- VS Code: Add the above config to your .vscode/mcp.json or User Settings (JSON).
- Cursor: Add to ~/.cursor/mcp.json or your project's .cursor/mcp.json.
- Claude Desktop: Add to claude_desktop_config.json.
- Windsurf: Add to your Windsurf MCP config file.
For more details, see your client’s documentation on MCP server configuration.
Troubleshooting
- If you see errors about h3 not found, make sure you have installed and authenticated h3-cli (see below).
- If you see authentication errors, double-check your API key in the CLI.
- For more help, see the official h3-cli setup guide.
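For the "h3 not found" case, a quick way to confirm whether the binary is actually on your PATH (standard shell, nothing h3-specific):

```shell
# Prints the path to h3 if installed; otherwise prints a hint
if command -v h3 >/dev/null 2>&1; then
  command -v h3
else
  echo "h3 not on PATH - check your shell profile and re-run install.sh"
fi
```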
License
<details> <summary><strong>How to Install and Set Up h3-cli (Required)</strong></summary>
1. Install h3-cli
git clone https://github.com/horizon3ai/h3-cli
cd h3-cli
bash install.sh your-api-key-here
- Get your API key from the Horizon3.ai Portal under User → Account Settings.
- The install script will prompt you to update your shell profile. Follow the instructions, then restart your Terminal.
2. Test your h3-cli install
h3
You should see the h3-cli help text.
3. Verify your API connection
h3 hello-world
You should see a response like:
{ "data": { "hello": "world!" } }
</details>