# OpenCode MCP Server

An MCP (Model Context Protocol) server that provides seamless integration with OpenCode, the open-source AI coding agent for the terminal.
## Features

- **Execute OpenCode Commands**: Run any OpenCode CLI command programmatically
- **Session Management**: Create, continue, and export coding sessions
- **Model Discovery**: List available AI models from all configured providers
- **Async Execution**: Non-blocking command execution with timeout handling
- **JSON Lines Parsing**: Robust parsing of OpenCode's streaming output format
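The JSON Lines parsing mentioned above can be sketched as follows. `parse_json_lines` is an illustrative helper, not the server's actual implementation; it assumes events arrive as newline-delimited JSON that may be interleaved with plain-text noise:

```python
import json

def parse_json_lines(raw: str) -> list[dict]:
    """Parse newline-delimited JSON, tolerating non-JSON lines.

    Streaming CLI output often mixes JSON events with plain-text
    progress messages; lines that fail to parse are skipped.
    """
    events = []
    for line in raw.splitlines():
        line = line.strip()
        if not line:
            continue
        try:
            events.append(json.loads(line))
        except json.JSONDecodeError:
            continue  # skip non-JSON noise in the stream
    return events
```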
## Tools Available

| Tool | Description |
|---|---|
| `execute_opencode_command` | Execute any OpenCode CLI command with full flexibility |
| `opencode_run` | Run OpenCode with a simple prompt message |
| `opencode_continue_session` | Continue an existing OpenCode session |
| `opencode_list_models` | List available models, optionally filtered by provider |
| `opencode_export_session` | Export session data as JSON |
| `opencode_get_status` | Check OpenCode CLI availability and status |
## Installation

### Prerequisites

- Python 3.10+
- OpenCode CLI installed and configured
- MCP-compatible client (Claude Desktop, etc.)

### Install Dependencies

```bash
pip install -r requirements.txt
```
### Configure MCP Client

Add to your MCP client configuration (e.g., `~/.claude.json` or Claude Desktop settings):

```json
{
  "mcpServers": {
    "opencode": {
      "command": "python",
      "args": ["-m", "src.services.fast_mcp.opencode_server"],
      "cwd": "/path/to/opencode-mcp"
    }
  }
}
```
## Usage

### Basic Usage

Once configured, the MCP tools are available through your MCP client:

```python
# Run a coding task
opencode_run(message="Create a Python function that calculates fibonacci numbers")

# List available models
opencode_list_models(provider="anthropic")

# Continue a previous session
opencode_continue_session(session_id="abc123", message="Now add unit tests")

# Check status
opencode_get_status()
```
### Tool Parameters

#### `execute_opencode_command`

```python
{
    "prompt": str,             # Required: The prompt/task for OpenCode
    "model": str,              # Optional: Model in provider/model format (e.g., "anthropic/claude-sonnet-4-20250514")
    "agent": str,              # Optional: Agent to use (e.g., "build", "plan")
    "session": str,            # Optional: Session ID to continue
    "continue_session": bool,  # Optional: Whether to continue the last session
    "timeout": int             # Optional: Timeout in seconds (default: 300, max: 600)
}
```

#### `opencode_run`

```python
{
    "message": str,   # Required: Message/prompt to send
    "model": str,     # Optional: Model to use
    "agent": str,     # Optional: Agent to use
    "files": [str],   # Optional: Files to attach
    "timeout": int    # Optional: Timeout in seconds
}
```

#### `opencode_continue_session`

```python
{
    "session_id": str,  # Required: Session ID to continue
    "message": str,     # Optional: Follow-up message
    "timeout": int      # Optional: Timeout in seconds
}
```
## Configuration

Environment variables (prefix: `OPENCODE_`):

| Variable | Default | Description |
|---|---|---|
| `OPENCODE_COMMAND` | `opencode` | Path to OpenCode CLI |
| `OPENCODE_DEFAULT_MODEL` | None | Default model to use |
| `OPENCODE_DEFAULT_AGENT` | None | Default agent to use |
| `OPENCODE_DEFAULT_TIMEOUT` | 300 | Default timeout (seconds) |
| `OPENCODE_MAX_TIMEOUT` | 600 | Maximum timeout (seconds) |
| `OPENCODE_SERVER_LOG_LEVEL` | INFO | Logging level |
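A minimal sketch of reading these prefixed variables with only the standard library (the project's own `settings.py` handles this; `load_settings` here is a hypothetical stand-in):

```python
import os

def load_settings(prefix: str = "OPENCODE_") -> dict:
    """Read server settings from prefixed environment variables,
    falling back to the documented defaults."""
    defaults = {
        "COMMAND": "opencode",
        "DEFAULT_MODEL": None,
        "DEFAULT_AGENT": None,
        "DEFAULT_TIMEOUT": 300,
        "MAX_TIMEOUT": 600,
        "SERVER_LOG_LEVEL": "INFO",
    }
    settings = {}
    for key, default in defaults.items():
        raw = os.environ.get(prefix + key)
        if raw is None:
            settings[key] = default
        elif isinstance(default, int):
            settings[key] = int(raw)  # coerce numeric settings
        else:
            settings[key] = raw
    return settings
```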
## Architecture

```
src/services/fast_mcp/opencode_server/
├── __init__.py
├── __main__.py            # Entry point
├── server.py              # MCP server & tool definitions
├── opencode_executor.py   # CLI execution wrapper
├── models.py              # Pydantic models
├── settings.py            # Configuration
└── handlers/
    ├── __init__.py
    ├── execution.py       # Run/continue operations
    ├── session.py         # Session management
    └── discovery.py       # Model/status discovery
```
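The non-blocking execution with timeout handling that `opencode_executor.py` provides can be sketched with asyncio's subprocess API. `run_cli` is an illustrative stand-in, not the project's actual wrapper; a real invocation would pass something like `["opencode", "run", "..."]`:

```python
import asyncio

async def run_cli(args: list[str], timeout: float = 300.0) -> tuple[int, str, str]:
    """Run a CLI command without blocking the event loop.

    The process is killed if it exceeds the timeout, and the
    TimeoutError is re-raised for the caller to report.
    """
    proc = await asyncio.create_subprocess_exec(
        *args,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE,
    )
    try:
        stdout, stderr = await asyncio.wait_for(proc.communicate(), timeout=timeout)
    except asyncio.TimeoutError:
        proc.kill()       # reap the runaway process
        await proc.wait()
        raise
    return proc.returncode, stdout.decode(), stderr.decode()
```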
## Development

### Running Tests

```bash
pytest tests/ -v
```

### Code Formatting

```bash
black src/
ruff check src/
```
## Roadmap

Planned features for v2.0:

- [ ] `opencode_import_session` - Import sessions from JSON/URL
- [ ] `opencode_list_sessions` - List all sessions with filtering
- [ ] `opencode_get_stats` - Usage statistics and cost tracking
- [ ] `opencode_list_agents` - List available agents
- [ ] `opencode_github_run` - GitHub Actions integration (async)
- [ ] `opencode_pr_checkout` - PR workflow support
## Contributing
Contributions are welcome! Please read the contributing guidelines before submitting PRs.
## License
MIT License - see LICENSE for details.
## Related Projects

- OpenCode - The AI coding agent this server integrates with
- Model Context Protocol - The protocol specification
- MCP SDK - Python SDK for MCP