MCP-CAN: Virtual CAN + MCP Server
An MCP server purpose-built to surface vehicle CAN/OBD data to an LLM/SLM. It simulates ECUs on a virtual CAN bus, decodes via a DBC, and exposes MCP tools over SSE—no hardware required by default.
Highlights
- MCP server for CAN/OBD → LLM/SLM (tools + DBC metadata over SSE).
- Virtual CAN backend (python-can) out of the box; optional SocketCAN/vCAN on Linux.
- DBC-driven encoding/decoding via cantools.
- ECU simulator that streams multiple messages plus demo OBD-II responses.
- MCP server (SSE) exposing tools for frames, filtering, monitoring, and DBC info.
- Typer CLI: mcp-can (simulate, server, frames, decode, monitor, obd-request).
- Dockerfile + docker compose for server + simulator.
- Unit tests, type hints, lint config (ruff, mypy).
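To make the DBC-driven decoding concrete, here is a minimal sketch of what a signal decode boils down to. The signal layout used here (a hypothetical ENGINE_SPEED: 16-bit big-endian at byte 0, scale 0.25 rpm/bit, offset 0) is illustrative and not necessarily what vehicle.dbc defines; in the real code, cantools derives all of this from the DBC.

```python
# Illustrative only: a hand-rolled version of what cantools does when it
# decodes one signal. The ENGINE_SPEED layout below (16-bit big-endian at
# byte 0, scale 0.25 rpm/bit, offset 0) is a hypothetical example, not
# necessarily the layout in vehicle.dbc.

def decode_engine_speed(data: bytes) -> float:
    """Extract the raw 16-bit value and apply the DBC scale/offset."""
    raw = (data[0] << 8) | data[1]   # big-endian (Motorola) byte order
    scale, offset = 0.25, 0.0        # physical = raw * scale + offset
    return raw * scale + offset

frame = bytes([0x1F, 0x40, 0, 0, 0, 0, 0, 0])  # raw 0x1F40 = 8000
print(decode_engine_speed(frame))  # 2000.0 rpm
```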
Repository Layout
- src/mcp_can/
  - cli.py – Typer commands
  - bus.py – python-can helpers
  - dbc.py – DBC loading/decoding
  - config.py – env settings (MCP_CAN_*)
  - models.py – simple dataclasses
  - simulator/runner.py – ECU simulator + OBD responder
  - server/fastmcp_server.py – MCP tools (SSE)
  - obd.py – minimal OBD-II request/response helpers
- vehicle.dbc – sample CAN database
- simulate-ecus.py, can-mcp.py – entrypoints
- docker/compose.yml, Dockerfile
- tests/ – unit tests
Prerequisites
- Python 3.10+
- (Optional) Docker / Docker Compose
- (Optional) Ollama if you want a local LLM backend
Install (Python)
From repo root:
pip install -r requirements.txt
pip install -e .
Quickstart (Simulator + MCP Server)
Two terminals:
# Terminal A: start ECU simulator on virtual bus0
mcp-can simulate
# Terminal B: start MCP server (SSE on 6278)
mcp-can server --port 6278
Single-process alternative (useful on Windows, where the virtual backend may not share frames across separate processes):
mcp-can demo --port 6278
Sample interactions:
mcp-can frames --seconds 2
mcp-can decode --id 0x100 --data "01 02 03 04 05 06 07 08"
mcp-can monitor --signal ENGINE_SPEED --seconds 3
mcp-can obd-request --service 0x01 --pid 0x0D
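Under the hood, obd-request builds a standard SAE J1979 frame. The sketch below shows the framing for service 0x01, PID 0x0D (vehicle speed): the request goes out on broadcast ID 0x7DF with an ISO-TP length byte, and the ECU replies on 0x7E8 with the service echoed back plus 0x40. The helper names and zero-padding are assumptions for illustration, not the project's API.

```python
# Sketch of OBD-II framing (SAE J1979 over ISO-TP single frames). Helper
# names are hypothetical; zero-padding is one common convention.

def build_obd_request(service: int, pid: int) -> bytes:
    # Byte 0 is the ISO-TP single-frame length (service + PID = 2 bytes),
    # padded out to the 8-byte CAN payload. Sent on broadcast ID 0x7DF.
    return bytes([0x02, service, pid, 0, 0, 0, 0, 0])

def parse_speed_response(data: bytes) -> int:
    # Positive responses echo the service + 0x40 and the PID (on ID 0x7E8).
    if data[1] != 0x41 or data[2] != 0x0D:
        raise ValueError("not a service-0x01 vehicle-speed reply")
    return data[3]  # PID 0x0D: byte A is speed in km/h

print(build_obd_request(0x01, 0x0D).hex())  # 02010d0000000000
print(parse_speed_response(bytes([0x03, 0x41, 0x0D, 88, 0, 0, 0, 0])))  # 88
```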
MCP Inspector (GUI for your tools)
Use the official Inspector to explore and call your MCP tools without writing a host:
npx @modelcontextprotocol/inspector
When prompted, connect to your server:
- URL: http://localhost:6278/sse
You can then:
- List tools and resources (read_can_frames, decode_can_frame, filter_frames, monitor_signal, dbc_info).
- Call a tool (e.g., monitor ENGINE_SPEED for 5 seconds) and view JSON output live.
Using with Ollama (local LLM)
- Ensure Ollama is running (ollama serve) and pull a model (ollama pull llama3).
- Run simulator + MCP server (see Quickstart).
- Point your MCP-capable host at http://localhost:6278/sse and configure its model endpoint to http://localhost:11434 with your model name (e.g., llama3).
- Prompt the host: “Monitor ENGINE_SPEED for 5 seconds” or “List all DBC messages.”
If you need a minimal host, pair @modelcontextprotocol/sdk with Ollama (see SDK docs) or use Inspector for manual tool calls.
Example host config (OpenAI-compatible endpoint to local Ollama):
{
"model": {
"type": "openai-compatible",
"baseUrl": "http://localhost:11434/v1",
"model": "llama3"
},
"mcpServers": {
"can-mcp-server": {
"serverUrl": "http://localhost:6278/sse"
}
}
}
CLI Reference
- mcp-can simulate – start ECU simulator using vehicle.dbc.
- mcp-can server [--port 6278] – run MCP SSE server.
- mcp-can frames --seconds 1.0 – capture raw frames as JSON.
- mcp-can decode --id <hex|int> --data <bytes> – decode a single frame.
- mcp-can monitor --signal <NAME> --seconds 2.0 – watch one signal.
- mcp-can obd-request --service <hex|int> [--pid <hex|int>] – demo OBD-II request.
Configuration
Env vars (prefix MCP_CAN_):
- CAN_INTERFACE (default: virtual)
- CAN_CHANNEL (default: bus0)
- DBC_PATH (default: vehicle.dbc)
- MCP_PORT (default: 6278)
You can set these in a .env file at repo root.
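For illustration, resolving these settings could look like the stdlib-only sketch below. The field names mirror the list above, but this is a hypothetical re-creation; the project's actual config.py may work differently.

```python
# Hypothetical sketch of MCP_CAN_*-prefixed settings resolution; the real
# implementation lives in src/mcp_can/config.py and may differ.
import os
from dataclasses import dataclass

PREFIX = "MCP_CAN_"

@dataclass
class Settings:
    can_interface: str = "virtual"
    can_channel: str = "bus0"
    dbc_path: str = "vehicle.dbc"
    mcp_port: int = 6278

def load_settings(env=os.environ) -> Settings:
    def get(name, default):
        raw = env.get(PREFIX + name.upper())
        # Coerce to the default's type so MCP_PORT becomes an int.
        return type(default)(raw) if raw is not None else default

    return Settings(
        can_interface=get("can_interface", "virtual"),
        can_channel=get("can_channel", "bus0"),
        dbc_path=get("dbc_path", "vehicle.dbc"),
        mcp_port=get("mcp_port", 6278),
    )

print(load_settings({"MCP_CAN_MCP_PORT": "7000"}))
```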
Docker
Build:
docker build -t mcp-can .
Run (combined server + simulator):
docker run -d --name mcp-can -p 6278:6278 -p 5000:5000 -p 8080:8080 mcp-can
Compose (from docker/):
docker compose up -d --build
Development & Testing
pip install -r requirements.txt
pip install -e .
pip install pytest ruff mypy
ruff check .
mypy src
pytest -q
Troubleshooting
- No frames? Ensure both simulator and server use the same interface/channel (virtual/bus0 by default).
- DBC missing? Set MCP_CAN_DBC_PATH or place vehicle.dbc in the repo root.
- Docker networking: expose port 6278 so your MCP host can reach the SSE endpoint.
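For the Docker networking case, here is a quick stdlib-only check of whether the SSE port is reachable. The host and port below are the defaults from this README; adjust them to your setup.

```python
import socket

def sse_reachable(host: str = "localhost", port: int = 6278,
                  timeout: float = 2.0) -> bool:
    """Return True if the MCP server's SSE port accepts TCP connections."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(sse_reachable())  # False unless the MCP server is listening on 6278
```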
License
MIT (see LICENSE). Educational/prototyping use only—use certified hardware for real automotive work.