Windows MCP Server
This project provides a Windows MCP (Model Context Protocol) server that exposes system information and control tools for Windows environments to your AI applications.
What’s new
- More robust drive and uptime implementations (no brittle PowerShell parsing).
- Structured results for top processes (list of objects with pid, name, cpu_percent, memoryMB).
- Structured results for memory and network information.
- Safer PowerShell usage with JSON parsing for GPU info (see the sketch below).
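For example, rather than parsing formatted PowerShell text, the GPU query can ask PowerShell for JSON and decode it in Python. A minimal sketch, assuming PowerShell and the Win32_VideoController CIM class are available (the helper name is illustrative, not the project's actual code):

import json
import subprocess

def gpu_info() -> list:
    # Ask PowerShell for structured JSON instead of parsing formatted text output.
    command = (
        "Get-CimInstance Win32_VideoController | "
        "Select-Object Name, DriverVersion | ConvertTo-Json"
    )
    completed = subprocess.run(
        ["powershell", "-NoProfile", "-Command", command],
        capture_output=True, text=True, check=True,
    )
    data = json.loads(completed.stdout)
    # ConvertTo-Json emits a bare object for a single GPU and a list for several.
    return data if isinstance(data, list) else [data]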
Features
- System info (OS, release, version, architecture, hostname)
- Uptime and last boot time
- Drives listing and per-drive space usage
- Memory, CPU, GPU, and Network information
- Top processes by memory and CPU (accurate sampling)
Available Tools
| Tool Name | Description | Parameters | Returns |
|---|---|---|---|
| Windows-system-info | Get OS, release, version, architecture, and hostname | None | object: name, system, release, version, architecture, hostname |
| Windows-last-boot-time | Get the last boot time of the system | None | string (timestamp) |
| Windows-uptime | Get system uptime since last boot | None | string: "Uptime: <seconds> seconds" |
| Windows-drives | Get list of all available drives | None | string[] (e.g., ["C", "D"]) |
| Windows-drive-status | Get used and free space for a specific drive | drive: string | DriveInfo { name, used_spaceGB: number, free_spaceGB: number } |
| Windows-drives-status-simple | Get status using comma-separated drive letters | drives_string: string | DriveInfo[] |
| Windows-memory-info | Get RAM usage information | None | object: total_memory, available_memory, used_memory (strings with GB) |
| Windows-network-info | Get network IPv4 addresses per interface | None | object: interface -> IPv4 (or { error }) |
| Windows-cpu-info | Get CPU model, logical core count, and frequency | None | string |
| Windows-gpu-info | Get GPU name(s) and driver versions | None | string (one line per GPU) |
| Windows-top-processes-by-memory | Get the top X processes by memory usage | amount: int = 5 | ProcessInfo[] { pid, name, memoryMB, cpu_percent? } |
| Windows-top-processes-by-cpu | Get the top X processes by CPU usage (sampled for accuracy) | amount: int = 5 | ProcessInfo[] { pid, name, cpu_percent, memoryMB } |
Note: Previously documented tools Windows-name-version, Windows-drives-status, and Windows-all-drives-status are not currently implemented to avoid duplication. If needed, they can be added easily.
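For orientation, here is how a tool such as Windows-system-info could be registered, as a minimal sketch assuming the server uses the FastMCP helper from the MCP Python SDK (illustrative only, not the project's actual main.py):

import platform

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("windows-mcp")

@mcp.tool(name="Windows-system-info")
def system_info() -> dict:
    """Get OS, release, version, architecture, and hostname."""
    # All of these values come from the standard library's platform module.
    return {
        "name": platform.node(),
        "system": platform.system(),
        "release": platform.release(),
        "version": platform.version(),
        "architecture": platform.architecture()[0],
        "hostname": platform.node(),
    }

if __name__ == "__main__":
    mcp.run()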
Requirements
- Python 3.13+
- uv (for fast startup and dependency management)
# Install uv on Windows
powershell -c "irm https://astral.sh/uv/install.ps1 | iex"
Installation
- Clone this repository:
git clone https://github.com/carlosedp/windows-mcp-server.git
cd windows-mcp-server
Running the development server
uv run mcp dev main.py
uv installs the dependencies and runs the server in development mode with the MCP Inspector.
Then in the MCP Inspector browser window:
- Click "Connect" to connect the MCP client.
- Go to the "Tools" tab and click "List Tools" to see the available tools.
- Select a tool and click "Run tool" to execute it.
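Alternatively, the tools can be exercised programmatically with the MCP Python SDK's stdio client. A minimal sketch, assuming it is run from the repository root with the mcp package available:

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the server over stdio, the same way the client configuration below does.
server = StdioServerParameters(command="uv", args=["run", "mcp", "run", "main.py"])

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Tools:", [tool.name for tool in tools.tools])
            result = await session.call_tool("Windows-system-info", arguments={})
            print(result.content)

asyncio.run(main())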
Examples
Windows-system-info
{
  "name": "MY-PC",
  "system": "Windows",
  "release": "10",
  "version": "10.0.19045",
  "architecture": "64bit",
  "hostname": "MY-PC"
}
Windows-drive-status (input: "C")
{
  "name": "C",
  "used_spaceGB": 120.53,
  "free_spaceGB": 380.12
}
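These figures can be derived from the standard library alone; a hypothetical helper using shutil.disk_usage, rounded to two decimals to match the shape above:

import shutil

def drive_status(drive: str) -> dict:
    # Query total/used/free bytes for the drive root (e.g. "C:\\") and convert to GB.
    usage = shutil.disk_usage(f"{drive}:\\")
    gb = 1024 ** 3
    return {
        "name": drive,
        "used_spaceGB": round(usage.used / gb, 2),
        "free_spaceGB": round(usage.free / gb, 2),
    }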
Windows-memory-info
{
  "total_memory": "32.00 GB",
  "available_memory": "18.25 GB",
  "used_memory": "13.75 GB"
}
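A sketch of how such strings can be produced with psutil (assuming psutil is among the server's dependencies):

import psutil

def memory_info() -> dict:
    # Report RAM totals as human-readable GB strings, matching the example above.
    vm = psutil.virtual_memory()
    gb = 1024 ** 3
    return {
        "total_memory": f"{vm.total / gb:.2f} GB",
        "available_memory": f"{vm.available / gb:.2f} GB",
        "used_memory": f"{vm.used / gb:.2f} GB",
    }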
Windows-network-info
{
  "Ethernet": "192.168.1.50",
  "Wi-Fi": "10.0.0.15"
}
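The per-interface IPv4 mapping can be gathered with psutil; a minimal sketch (the exact error message in the fallback is illustrative):

import socket

import psutil

def network_info() -> dict:
    # Map each interface to its first IPv4 (AF_INET) address, skipping IPv6 entries.
    result = {}
    for interface, addresses in psutil.net_if_addrs().items():
        for address in addresses:
            if address.family == socket.AF_INET:
                result[interface] = address.address
                break
    return result or {"error": "no IPv4 addresses found"}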
Windows-top-processes-by-cpu (amount: 3)
[
  { "pid": 1234, "name": "chrome.exe", "cpu_percent": 24.7, "memoryMB": 512.3 },
  { "pid": 4321, "name": "code.exe", "cpu_percent": 12.1, "memoryMB": 650.8 },
  { "pid": 9876, "name": "System", "cpu_percent": 8.4, "memoryMB": 45.0 }
]
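"Sampled for accuracy" refers to the usual psutil pattern: prime cpu_percent, wait a short interval, then read it again. A sketch of that approach (the one-second window and error handling are illustrative):

import time

import psutil

def top_processes_by_cpu(amount: int = 5) -> list:
    # The first cpu_percent() call always returns 0.0, so prime every process,
    # wait for a sampling window, then read utilization over that window.
    procs = list(psutil.process_iter(["pid", "name"]))
    for proc in procs:
        try:
            proc.cpu_percent(None)
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            pass
    time.sleep(1.0)
    results = []
    for proc in procs:
        try:
            results.append({
                "pid": proc.pid,
                "name": proc.info["name"],
                "cpu_percent": proc.cpu_percent(None),
                "memoryMB": round(proc.memory_info().rss / (1024 ** 2), 2),
            })
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            continue
    return sorted(results, key=lambda p: p["cpu_percent"], reverse=True)[:amount]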
MCP Client Configuration Example
LM Studio (running Llama 3.2 3B) retrieving system information through the MCP server:
(screenshot)
Some more examples of tools you can run:
(screenshot)
To connect your MCP client (such as Claude Desktop, VS Code, LM Studio, etc.) to this server, add the following to your client configuration:
{
  "mcpServers": {
    "windows-mcp": {
      "command": "uv",
      "args": [
        "--directory",
        "C:\\Users\\Carlos Eduardo\\repos\\windows-mcp-server",
        "run",
        "mcp",
        "run",
        "main.py"
      ]
    }
  }
}
Adjust the path above to where the project is located on your machine.
Packaging and Distribution
To publish as a Python package:
- Edit pyproject.toml with your metadata.
- Build and upload to PyPI:
python -m build
python -m twine upload dist/*
License
MIT