Claude-LMStudio Bridge
An MCP server that bridges Claude with local LLMs running in LM Studio.
Overview
This tool allows Claude to interact with your local LLMs running in LM Studio, providing:
- Access to list all available models in LM Studio
- The ability to generate text using your local LLMs
- Support for chat completions through your local models
- A health check tool to verify connectivity with LM Studio
Prerequisites
- Claude Desktop with MCP support
- LM Studio installed and running locally with API server enabled
- Python 3.8+ installed
Quick Start (Recommended)
For macOS/Linux:
- Clone the repository
git clone https://github.com/infinitimeless/claude-lmstudio-bridge.git
cd claude-lmstudio-bridge
- Run the setup script
chmod +x setup.sh
./setup.sh
- Follow the setup script's instructions to configure Claude Desktop
For Windows:
- Clone the repository
git clone https://github.com/infinitimeless/claude-lmstudio-bridge.git
cd claude-lmstudio-bridge
- Run the setup script
setup.bat
- Follow the setup script's instructions to configure Claude Desktop
Manual Setup
If you prefer to set things up manually:
- Create a virtual environment (optional but recommended)
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
- Install the required packages
pip install -r requirements.txt
- Configure Claude Desktop:
- Open Claude Desktop preferences
- Navigate to the 'MCP Servers' section
- Add a new MCP server with the following configuration:
- Name: lmstudio-bridge
- Command: /bin/bash (on macOS/Linux) or cmd.exe (on Windows)
- Arguments:
- macOS/Linux: /path/to/claude-lmstudio-bridge/run_server.sh
- Windows: /c C:\path\to\claude-lmstudio-bridge\run_server.bat
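The same configuration can also be written directly into Claude Desktop's config file. A sketch of what the entry might look like, assuming the standard `claude_desktop_config.json` format with an `mcpServers` section (paths are placeholders; the file's location varies by OS and Claude Desktop version):

```json
{
  "mcpServers": {
    "lmstudio-bridge": {
      "command": "/bin/bash",
      "args": ["/path/to/claude-lmstudio-bridge/run_server.sh"]
    }
  }
}
```

On Windows, use `cmd.exe` as the command with `["/c", "C:\\path\\to\\claude-lmstudio-bridge\\run_server.bat"]` as the arguments.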
Usage with Claude
After setting up the bridge, you can use the following commands in Claude:
- Check the connection to LM Studio:
Can you check if my LM Studio server is running?
- List available models:
List the available models in my local LM Studio
- Generate text with a local model:
Generate a short poem about spring using my local LLM
- Send a chat completion:
Ask my local LLM: "What are the main features of transformers in machine learning?"
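Under the hood, requests like the one above are forwarded to LM Studio's OpenAI-compatible chat completions endpoint. A minimal sketch of the request body the bridge would send (the `chat_payload` helper and model name are illustrative, not part of the bridge's actual API):

```python
import json

def chat_payload(model: str, prompt: str, temperature: float = 0.7) -> dict:
    # Build an OpenAI-style chat completion body; LM Studio's local
    # server accepts this shape on its /v1/chat/completions endpoint.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

body = chat_payload("your-loaded-model",
                    "What are the main features of transformers in machine learning?")
print(json.dumps(body, indent=2))
```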
Troubleshooting
Diagnosing LM Studio Connection Issues
Use the included debugging tool to check your LM Studio connection:
python debug_lmstudio.py
For more detailed tests:
python debug_lmstudio.py --test-chat --verbose
Common Issues
"Cannot connect to LM Studio API"
- Make sure LM Studio is running
- Verify the API server is enabled in LM Studio (Settings > API Server)
- Check that the port (default: 1234) matches what's in your .env file
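You can also probe the API yourself without the debug script. A stdlib-only sketch of the kind of check the bridge's health-check tool performs (the `lmstudio_reachable` helper is hypothetical; LM Studio's API server lists loaded models at `/v1/models`):

```python
import urllib.request

def lmstudio_reachable(host: str = "127.0.0.1", port: int = 1234,
                       timeout: float = 2.0) -> bool:
    # Ask LM Studio's OpenAI-compatible API for its model list;
    # any connection failure means the server isn't reachable.
    url = f"http://{host}:{port}/v1/models"
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:  # covers URLError, ConnectionRefusedError, timeouts
        return False
```

If this returns False while LM Studio is running, double-check that the API server is enabled and that the port matches your .env file.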
"No models are loaded"
- Open LM Studio and load a model
- Verify the model is running successfully
"MCP package not found"
- Try reinstalling:
pip install "mcp[cli]" httpx python-dotenv
- Make sure you're using Python 3.8 or later
"Claude can't find the bridge"
- Check Claude Desktop configuration
- Make sure the path to run_server.sh or run_server.bat is correct and absolute
- Verify the server script is executable (on macOS/Linux):
chmod +x run_server.sh
Advanced Configuration
You can customize the bridge's behavior by creating a .env file with these settings:
LMSTUDIO_HOST=127.0.0.1
LMSTUDIO_PORT=1234
DEBUG=false
Set DEBUG=true to enable verbose logging for troubleshooting.
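A sketch of how these variables resolve to the API base URL, assuming the bridge applies the defaults shown above when a variable is unset (the `lmstudio_base_url` helper is illustrative; `/v1` is the prefix LM Studio's OpenAI-compatible server uses):

```python
import os

def lmstudio_base_url() -> str:
    # Read the same .env-style variables the bridge uses,
    # falling back to the documented defaults.
    host = os.getenv("LMSTUDIO_HOST", "127.0.0.1")
    port = os.getenv("LMSTUDIO_PORT", "1234")
    return f"http://{host}:{port}/v1"
```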
License
MIT