
# jetson-mcp

An MCP (Model Context Protocol) server for using natural language to monitor and remotely control an Nvidia Jetson board from clients on the same network.
<p align="center"> <img src="resources/nvidia_jetson.jpg" alt="Nvidia Jetson"> </p>
This project uses the FastMCP library to create the server.
MCP Server Connected:
<p align="center"> <img src="resources/mcp_server.png" alt="MCP Server Connected"> </p>
## Features
- Provides MCP tools accessible by network clients using the SSE (Server-Sent Events) transport:
  - `get_jetson_hw_info`: Reads `/etc/nv_boot_control.conf` to identify module/carrier board info.
  - `get_jetson_sw_info`: Reads `/etc/nv_tegra_release` (for Jetpack version) and `/proc/version` (for Linux kernel version).
- Includes scripts for easy installation and systemd service setup.
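For context, tools like these need only a few lines of FastMCP code. The sketch below is illustrative rather than a copy of the project's `app/main.py`; it assumes the `FastMCP` class from the official `mcp` Python SDK and simply returns the raw contents of the files mentioned above.

```python
from mcp.server.fastmcp import FastMCP

# Illustrative sketch only; the real app/main.py may structure this differently.
mcp = FastMCP("jetson-mcp")

@mcp.tool()
def get_jetson_hw_info() -> str:
    """Identify the module/carrier board from /etc/nv_boot_control.conf."""
    with open("/etc/nv_boot_control.conf") as f:
        return f.read()

@mcp.tool()
def get_jetson_sw_info() -> str:
    """Report the Jetpack (L4T) release and Linux kernel version."""
    with open("/etc/nv_tegra_release") as f:
        jetpack = f.read().strip()
    with open("/proc/version") as f:
        kernel = f.read().strip()
    return f"{jetpack}\n{kernel}"
```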
## Setup and Installation (on the Jetson)
- Clone the repository:

  ```bash
  git clone https://github.com/Zalmotek/jetson-mcp
  cd jetson-mcp
  ```

- Run the installation script: This script creates a Python virtual environment (`venv/`) and installs dependencies from `requirements.txt`.

  ```bash
  chmod +x install.sh
  ./install.sh
  ```
## Running the Server (on the Jetson)
The recommended way to run the server is as a background service managed by systemd.
- (Optional) Find Jetson IP/Hostname: You'll need the Jetson's IP address or hostname to connect from other devices. Use commands like `ip addr` or `hostname -I`.

- Run the service setup script: This script creates and enables a systemd service file (`/etc/systemd/system/jetson-mcp.service`) configured to run the server as the user who invoked the script, listening on port 8000.

  ```bash
  chmod +x setup_service.sh
  sudo ./setup_service.sh
  ```

- Start the service:

  ```bash
  sudo systemctl start jetson-mcp.service
  ```

- Verify the service:

  ```bash
  sudo systemctl status jetson-mcp.service
  # Check logs for errors
  sudo journalctl -u jetson-mcp.service -f
  ```

- Firewall: Ensure your Jetson's firewall (if active, e.g., `ufw`) allows incoming connections on port 8000 (or your chosen port). Example for `ufw`:

  ```bash
  sudo ufw allow 8000/tcp
  ```
## Running Manually (for testing)
For testing, the server can be run directly with the Python interpreter, which invokes the `mcp.run()` call configured within the script (a rough sketch of that entry point follows the commands below):

```bash
source venv/bin/activate
# The script itself now calls mcp.run() with SSE, host, and port settings
python app/main.py
```
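For reference, the part of the script that makes `python app/main.py` serve over the network might look roughly like this. This is a hedged sketch, not the repository's actual entry point; the `host`/`port` keyword arguments are assumptions based on the port 8000 default described above and on the `FastMCP` settings exposed by the official `mcp` Python SDK.

```python
from mcp.server.fastmcp import FastMCP

# Assumption: bind on all interfaces so clients elsewhere on the LAN can connect.
mcp = FastMCP("jetson-mcp", host="0.0.0.0", port=8000)

# ... @mcp.tool() definitions go here ...

if __name__ == "__main__":
    # Serve over SSE instead of the default stdio transport.
    mcp.run(transport="sse")
```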
## Connecting from a Remote Client
Once the server is running on the Jetson and accessible on the network (port 8000 allowed through firewall):
- Identify the Server Address: Find the Jetson's IP address (e.g., `192.168.1.105`) or its hostname (e.g., `jetson-nano.local`) on your LAN.
- Configure Your Client: In your MCP client application (which could be a custom script, a UI like MCP Inspector, or potentially Cursor/Claude if they support network endpoints), configure it to connect to the MCP server at its network address.
  - The specific connection method depends on the client, but it will likely involve specifying a URL for the SSE endpoint: `http://<jetson_ip_or_hostname>:8000/sse` (a common pattern for SSE).
Note: Cursor's `mcp.json` file is primarily designed for launching local servers via `stdio` transport. Connecting Cursor to this networked SSE server might require different configuration steps or might not be directly supported without a proxy. Consult your specific client's documentation for how to connect to a network MCP SSE endpoint.
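As one concrete client example, the official `mcp` Python SDK can talk to an SSE endpoint roughly as shown below. This is a sketch under that assumption; substitute your Jetson's actual IP or hostname for the placeholder address.

```python
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

# Placeholder address; replace with your Jetson's IP or hostname.
SERVER_URL = "http://192.168.1.105:8000/sse"

async def main() -> None:
    # Open the SSE transport, then run an MCP session over it.
    async with sse_client(SERVER_URL) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])
            result = await session.call_tool("get_jetson_hw_info", {})
            print(result.content)

if __name__ == "__main__":
    asyncio.run(main())
```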
## Examples / Screenshots
Hardware Info Tool:
<p align="center"> <img src="resources/get_hardware_info.png" alt="Hardware Info Tool"> </p>
Software Info Tool:
<p align="center"> <img src="resources/get_software_info.png" alt="Software Info Tool"> </p>