Codex LSP Bridge (MCP)
Give Codex CLI IDE-grade semantic navigation by exposing Language Server Protocol (LSP) features as MCP tools.
What you get
- Go to definition / type definition / find references
- Hover / signature help / call hierarchy
- Document symbols / workspace symbols
- Code actions (autofix/refactors) / format document
- Works across Python, Rust, C/C++, TypeScript/JavaScript/Node, React (JSX/TSX), HTML, CSS/SCSS/Less
- One endpoint: http://127.0.0.1:8000/mcp (Streamable HTTP) or stdio
Quickstart
1) Install this bridge
# from the repo root
pip install -e .
2) Install language servers
On Debian/Ubuntu you can run ./install.sh to install all supported language servers.
Python (pick one)
pip install basedpyright # provides basedpyright-langserver
# OR
npm i -g pyright # provides pyright-langserver
Rust
rustup component add rust-analyzer
C/C++
# install clangd via your platform (llvm package)
clangd --version
TypeScript/JavaScript/Node/React
npm i -g typescript typescript-language-server
HTML/CSS
npm i -g vscode-langservers-extracted
3) Run the server
codex-lsp-bridge serve --config ./config/default.toml --host 127.0.0.1 --port 8000
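To sanity-check the endpoint before wiring up Codex, you can connect with the official MCP Python SDK and list the tools the bridge exposes. This is a minimal sketch assuming the `mcp` package (pip install mcp) and the default host/port above:
# Sketch: verify the Streamable HTTP endpoint and list the bridge's tools.
# Assumes the official MCP Python SDK is installed: pip install mcp
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def main() -> None:
    async with streamablehttp_client("http://127.0.0.1:8000/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print(sorted(tool.name for tool in tools.tools))

asyncio.run(main())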
4) Connect Codex CLI
Edit ~/.codex/config.toml (HTTP):
experimental_use_rmcp_client = true
[mcp_servers.lsp_bridge]
url = "http://127.0.0.1:8000/mcp"
Or use the Codex CLI helper:
codex mcp add lsp_bridge --url http://127.0.0.1:8000/mcp
Using it effectively
Prompt pattern:
When navigating or refactoring code, always use LSP tools instead of text search.
Example:
codex "Find all references to UserRepository and rename it to AccountRepository. Use LSP rename_symbol first."
Note: line/column positions are 0-indexed.
Tip: if line/column might be off, pass fuzzy=true (and optionally fuzzy_radius) to position-based tools.
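As a quick illustration of the 0-indexing and the fuzzy flag (the file path below is just a placeholder):
# Sketch: convert 1-based editor coordinates to the 0-based positions the tools expect.
def editor_to_lsp(line_1based: int, column_1based: int) -> dict:
    return {"line": line_1based - 1, "column": column_1based - 1}

# Editor shows "line 10, column 6" -> the tool call uses line=9, column=5.
# fuzzy=True lets the bridge search near the position if it is slightly off.
args = {"file_path": "/abs/path/to/app.py", **editor_to_lsp(10, 6), "fuzzy": True}
print(args)  # {'file_path': '/abs/path/to/app.py', 'line': 9, 'column': 5, 'fuzzy': True}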
LSP tool coverage
Core tools:
- go_to_definition, type_definition
- find_references, rename_symbol
- hover, signature_help
- document_symbols, workspace_symbols
- code_action, format_document
- call_hierarchy, diagnostics, wait_for_diagnostics, lsp_bridge_status
- resolve_symbol (fuzzy name-based lookup)
Smoke test (no file changes)
You can run a quick LSP smoke test against any file:
python scripts/lsp_smoke_test.py --file /abs/path/to/file.py --line 10 --column 5
Add --call-hierarchy if you want to test call hierarchy responses.
Install language servers (Debian/Ubuntu)
If you are on Debian/Ubuntu, you can use the provided script:
./install.sh
This installs:
- basedpyright (Python)
- ruff + ruff-lsp (Python lint/autofix)
- rust-analyzer (Rust)
- clangd (C/C++)
- typescript-language-server + typescript (TS/JS/React)
- vscode-html-language-server + vscode-css-language-server (HTML/CSS)
- eslint (JS/TS lint)
Note: some commands may land in ~/.local/bin or ~/.cargo/bin. Ensure those are on PATH for the user that runs the bridge.
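For example, something like this in your shell profile (exact paths depend on how each server was installed):
export PATH="$HOME/.local/bin:$HOME/.cargo/bin:$PATH"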
Config
See config/default.toml. It maps file extensions to language server commands.
For Python, config/default.toml also enables stronger type checking settings and
excludes large folders like .venv to keep the LSP responsive.
You can point the bridge to a config file via:
export CODEX_LSP_BRIDGE_CONFIG=/path/to/config.toml
MCP setup (HTTP vs stdio)
HTTP (bridge runs separately):
experimental_use_rmcp_client = true
[mcp_servers.lsp_bridge]
url = "http://127.0.0.1:8000/mcp"
Stdio (Codex spawns the bridge, no HTTP):
[mcp_servers.lsp_bridge]
command = "/absolute/path/to/codex-lsp-bridge"
args = ["serve", "--transport", "stdio", "--config", "/absolute/path/to/config/default.toml"]
Multi-project behavior
- Each tool call includes a file_path.
- The bridge detects the workspace root by walking upward from that file and looking for markers like .git, package.json, Cargo.toml, etc. (sketched below).
- It starts an LSP per (language, workspace_root) so multiple repos do not contaminate each other.
- Absolute paths are recommended; relative paths resolve against the bridge's default_root.
You can also force a workspace root by passing workspace_root to any file-based tool call.
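As a rough sketch of that detection (the marker names are the ones listed above; the bridge's real marker list and precedence may differ):
# Sketch: walk upward from a file until a directory contains a workspace marker.
from pathlib import Path

WORKSPACE_MARKERS = (".git", "package.json", "Cargo.toml")

def detect_workspace_root(file_path: str, default_root: str) -> Path:
    for parent in Path(file_path).resolve().parents:
        if any((parent / marker).exists() for marker in WORKSPACE_MARKERS):
            return parent
    return Path(default_root)  # fall back when no marker is found

print(detect_workspace_root("/abs/path/to/repo/src/app.py", "/abs/path/to/default"))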
Autofix + lint + strong types
The code_action tool returns quick fixes and refactors when a language server supports them.
For best results per language:
- Python: basedpyright/pyright for types. For lint+autofix, consider switching to ruff-lsp or pylsp with ruff/black plugins.
- TypeScript/JavaScript: types come from typescript-language-server and your tsconfig.json (use strict: true for strong types). For lint/autofix, use ESLint and its LSP server.
- Rust: rust-analyzer provides strong types and code actions (e.g., import/quickfix).
- C/C++: clangd provides diagnostics and some fixes; accuracy depends on compile_commands.json.
System prompt snippet
See SYSTEM_PROMPT_SNIPPET.md for a short snippet you can append to your AI system prompt so it always prefers LSP tools.
Security notes
- Local is safest: the bridge reads your repo files and launches local binaries.
- If you deploy it remotely, do so only where the server has access to the same repository, and protect the endpoint with authentication.