OpenCode MCP Tool

Enables AI assistants to interact with multiple AI models through the OpenCode CLI with a unified interface. Supports plan mode for structured analysis, flexible model selection, and natural language queries with file references.

Documentation is available in docs/ – examples, FAQ, troubleshooting, and best practices

This is a Model Context Protocol (MCP) server that allows AI assistants to interact with the OpenCode CLI. It enables AI assistants to leverage multiple AI models through a unified interface, with features like plan mode for structured thinking and extensive model selection.

  • Ask questions through multiple AI models via Claude or other MCP clients
  • Use plan mode for structured analysis and safer operations

TLDR

Use OpenCode's multi-model capabilities directly in Claude Code with flexible model selection and plan mode features.

Prerequisites

Before using this tool, ensure you have:

  1. Node.js (v16.0.0 or higher)
  2. OpenCode CLI installed and configured

One-Line Setup (Claude Code)

claude mcp add opencode -- npx -y @gilby125/opencode-mcp-tool -- --model google/gemini-2.5-pro

Codex CLI Setup

If you're using the OpenAI Codex CLI as your MCP client, add this server with:

codex mcp add opencode -- npx -y @gilby125/opencode-mcp-tool -- --model google/gemini-2.5-pro --fallback-model google/gemini-2.5-flash

After that, start the Codex CLI as usual; you can then talk to OpenCode in natural language (for example, "use opencode to explain index.html") and let Codex call the MCP tools when helpful.

Verify Installation

Type /mcp inside Claude Code to verify that the opencode MCP server is active.


Alternative: Import from Claude Desktop

If you already have it configured in Claude Desktop:

  1. Add to your Claude Desktop config:
"opencode": {
  "command": "npx",
  "args": ["-y", "@gilby125/opencode-mcp-tool", "--", "--model", "google/gemini-2.5-pro"]
}
  2. Import to Claude Code:
claude mcp add-from-claude-desktop

Configuration

Register the MCP server with your MCP client. Note: the server requires a primary model to be specified via the --model argument.

For NPX Usage (Recommended)

Add this configuration to your Claude Desktop config file:

{
  "mcpServers": {
    "opencode": {
      "command": "npx",
      "args": ["-y", "@gilby125/opencode-mcp-tool", "--", "--model", "google/gemini-2.5-pro", "--fallback-model", "google/gemini-2.5-flash"]
    }
  }
}

For Global Installation

If you installed globally, use this configuration instead:

{
  "mcpServers": {
    "opencode": {
      "command": "opencode-mcp",
      "args": ["--model", "google/gemini-2.5-pro", "--fallback-model", "google/gemini-2.5-flash"]
    }
  }
}

Codex CLI Configuration

To use this server with the OpenAI Codex CLI, add the following to your ~/.codex/config.toml:

[mcp_servers.opencode]
command = "opencode-mcp"
args = ["--model", "google/gemini-2.5-pro", "--fallback-model", "google/gemini-2.5-flash"]
startup_timeout_sec = 20
tool_timeout_sec = 300

After updating the Codex config, restart Codex and test with a command such as:

/opencode:ping "Hello from OpenCode MCP!"

Configuration File Locations:

  • Claude Desktop:
    • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
    • Windows: %APPDATA%\Claude\claude_desktop_config.json
    • Linux: ~/.config/claude/claude_desktop_config.json

After updating the configuration, restart your terminal session.

Example Workflow

  • Natural language: "use opencode to explain index.html", "understand the massive project using opencode", "ask opencode to search for latest news"
  • Claude Code: Type /opencode and the available commands will populate in Claude Code's interface.

Usage Examples

With File References (using @ syntax)

  • ask opencode to analyze @src/main.js and explain what it does
  • use opencode to summarize @. (the current directory)
  • analyze @package.json and tell me about dependencies

General Questions (without files)

  • ask opencode to search for the latest tech news
  • use opencode to explain div centering
  • ask opencode about best practices for React development related to @file_im_confused_about

Using OpenCode's Plan Mode

Plan mode lets you safely test code changes, run scripts, or execute potentially risky operations with structured planning; a minimal sketch of the underlying tool arguments follows the examples below.

  • use opencode plan mode to create and run a Python script that processes data
  • ask opencode to safely test @script.py and explain what it does
  • use opencode plan mode to install numpy and create a data visualization
  • test this code safely: Create a script that makes HTTP requests to an API
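
Under the hood, both plan and build requests appear to map onto the same ask-opencode tool (described in the next section), with only the mode argument changing. Below is a minimal, illustrative sketch of the tool arguments, reusing a prompt from the examples above; the values are assumptions for illustration, not captured output:

{
  "prompt": "Create and run a Python script that processes data",
  "mode": "plan"
}

Switching "mode" to "build" asks OpenCode to execute immediately instead of producing a structured plan first.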

Tools (for the AI)

These tools are designed to be used by the AI assistant; a sketch of a raw tools/call request appears after the list.

  • ask-opencode: Execute OpenCode with model selection and mode control. Uses plan mode by default for structured analysis.
    • prompt (required): The analysis request. Use the @ syntax to include file or directory references (e.g., @src/main.js explain this code) or ask general questions (e.g., Please use a web search to find the latest news stories).
    • model (optional): The model to use. If not specified, uses the primary model configured at server startup.
    • mode (optional): Execution mode - 'plan' for structured analysis (default), 'build' for immediate execution, or custom mode string.
  • brainstorm: Generate novel ideas with dynamic context gathering using creative frameworks (SCAMPER, Design Thinking, etc.), domain context integration, idea clustering, feasibility analysis, and iterative refinement.
    • prompt (required): Primary brainstorming challenge or question to explore.
    • methodology (optional): Brainstorming framework - 'divergent', 'convergent', 'scamper', 'design-thinking', 'lateral', or 'auto' (default).
    • domain (optional): Domain context (e.g., 'software', 'business', 'creative', 'research').
    • ideaCount (optional): Target number of ideas to generate (default: 12).
    • includeAnalysis (optional): Include feasibility and impact analysis (default: true).
  • timeout-test: Test timeout prevention by running for a specified duration.
    • duration (required): Duration in milliseconds for the test.
  • ping: Echo test tool that returns a message.
    • prompt (optional): Message to echo back.
  • Help: Shows the OpenCode CLI help text.
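
For reference, MCP clients issue these calls as standard JSON-RPC tools/call requests; you normally never write one by hand, since Claude Code or Codex constructs it for you. The following is a hedged sketch of what such a request might look like for ask-opencode, reusing the prompt and model values from earlier in this README:

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ask-opencode",
    "arguments": {
      "prompt": "@src/main.js explain this code",
      "model": "google/gemini-2.5-pro",
      "mode": "plan"
    }
  }
}

If model is omitted, the server falls back to the primary model configured at startup; if mode is omitted, it defaults to plan.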

Slash Commands (for the User)

You can use these commands directly in Claude Code's interface (compatibility with other clients has not been tested).

  • /plan: Execute OpenCode in plan mode for structured analysis and safer operations.
    • prompt (required): Analysis request (e.g., /plan prompt:Create and run a Python script that processes CSV data or /plan prompt:@script.py Analyze this script safely).
  • /build: Execute OpenCode in immediate execution mode for direct implementation.
    • prompt (required): Implementation request for immediate code execution.
  • /help: Displays the OpenCode CLI help information.
  • /ping: Tests the connection to the server.
    • prompt (optional): A message to echo back.

Contributing

Contributions are welcome! Please see our Contributing Guidelines for details on how to submit pull requests, report issues, and contribute to the project.

License

This project is licensed under the MIT License. See the LICENSE file for details.

Disclaimer: This is an unofficial, third-party tool and is not affiliated with, endorsed by, or sponsored by Google.
