# Claude Concilium
Multi-agent AI consultation framework for Claude Code via MCP.
Get a second (and third) opinion from other LLMs when Claude Code alone isn't enough.
```
Claude Code ──┬── OpenAI (Codex CLI) ──► Opinion A
              ├── Gemini (gemini-cli) ──► Opinion B
              │
              └── Synthesis ◄── Consensus or iterate
```
## The Problem
Claude Code is powerful, but one brain can miss bugs, overlook edge cases, or get stuck in a local optimum. Critical decisions benefit from diverse perspectives.
## The Solution
Concilium runs parallel consultations with multiple LLMs through the standard MCP protocol. Each MCP server wraps a CLI tool — no API keys needed for the primary providers (they use OAuth).
Key features:

- Parallel consultation with 2+ AI agents
- Production-grade fallback chains with error detection
- Each MCP server works standalone or as part of Concilium
- Plug & play: clone, `npm install`, add to `.mcp.json`
## Architecture
```
┌─────────────────────────────────────────────────────────┐
│                       Claude Code                       │
│                                                         │
│         "Review this code for race conditions"          │
│                                                         │
│   ┌──────────────┐        ┌──────────────┐              │
│   │ MCP Call #1  │        │ MCP Call #2  │  (parallel)  │
│   └──────┬───────┘        └──────┬───────┘              │
│          │                       │                      │
└──────────┼───────────────────────┼──────────────────────┘
           │                       │
           ▼                       ▼
    ┌──────────────┐        ┌──────────────┐
    │  mcp-openai  │        │  mcp-gemini  │   Primary agents
    │ (codex exec) │        │ (gemini -p)  │
    └──────┬───────┘        └──────┬───────┘
           │                       │
           ▼                       ▼
    ┌──────────────┐        ┌──────────────┐
    │    OpenAI    │        │    Google    │   LLM providers
    │   (OAuth)    │        │   (OAuth)    │
    └──────────────┘        └──────────────┘
```
Fallback chain (on quota/error):

```
OpenAI → Qwen → DeepSeek
Gemini → Qwen → DeepSeek
```
## Quickstart
### 1. Clone and install
```bash
git clone https://github.com/spyrae/claude-concilium.git
cd claude-concilium

# Install dependencies for each server
cd servers/mcp-openai && npm install && cd ../..
cd servers/mcp-gemini && npm install && cd ../..
cd servers/mcp-qwen && npm install && cd ../..

# Verify all servers work (no CLI tools required)
node test/smoke-test.mjs
```
Expected output:
```
PASS mcp-openai (Tools: openai_chat, openai_review)
PASS mcp-gemini (Tools: gemini_chat, gemini_analyze)
PASS mcp-qwen   (Tools: qwen_chat)
All tests passed.
```
### 2. Set up providers
Pick at least 2 providers:
| Provider | Auth | Free Tier | Setup |
|---|---|---|---|
| OpenAI | `codex login` (OAuth) | ChatGPT Plus weekly credits | Setup guide |
| Gemini | Google OAuth | 1000 req/day | Setup guide |
| Qwen | `qwen login` or API key | Varies | Setup guide |
| DeepSeek | API key | Pay-per-use (cheap) | Setup guide |
### 3. Add to Claude Code
Copy `config/mcp.json.example` and update paths:
```bash
cp config/mcp.json.example .mcp.json
# Replace "/path/to/claude-concilium" with your actual path
```
Or add servers individually to your existing `.mcp.json`:
```json
{
  "mcpServers": {
    "mcp-openai": {
      "type": "stdio",
      "command": "node",
      "args": ["/absolute/path/to/servers/mcp-openai/server.js"],
      "env": {
        "CODEX_HOME": "~/.codex-minimal"
      }
    },
    "mcp-gemini": {
      "type": "stdio",
      "command": "node",
      "args": ["/absolute/path/to/servers/mcp-gemini/server.js"]
    }
  }
}
```
### 4. Install the skill (optional)
Copy the Concilium skill to your Claude Code commands:
```bash
cp skill/ai-concilium.md ~/.claude/commands/ai-concilium.md
```
Now use `/ai-concilium` in Claude Code to trigger a multi-agent consultation.
## MCP Servers
Each server can be used independently — you don't need all of them.
| Server | CLI Tool | Auth | Tools |
|---|---|---|---|
| mcp-openai | `codex` | OAuth (ChatGPT Plus) | `openai_chat`, `openai_review` |
| mcp-gemini | `gemini` | Google OAuth | `gemini_chat`, `gemini_analyze` |
| mcp-qwen | `qwen` | API key / CLI login | `qwen_chat` |
DeepSeek uses the existing `deepseek-mcp-server` npm package — no custom server needed.
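Because each server is a plain MCP stdio process, it can be driven without Claude Code. The sketch below builds the JSON-RPC envelopes an MCP client writes to a server's stdin; the method names come from the MCP specification, while the `prompt` argument shape for `openai_chat` is a hypothetical example, not the server's documented schema.

```javascript
// Build the JSON-RPC messages an MCP client writes to a server's
// stdin. Method names ("tools/list", "tools/call") are from the MCP
// spec; the tool argument shape below is a hypothetical example.
function jsonRpcRequest(id, method, params = {}) {
  return JSON.stringify({ jsonrpc: "2.0", id, method, params });
}

// Ask a server which tools it exposes (e.g. openai_chat, openai_review).
const listTools = jsonRpcRequest(1, "tools/list");

// Invoke one tool by name with arguments.
const callTool = jsonRpcRequest(2, "tools/call", {
  name: "openai_chat",
  arguments: { prompt: "Review this diff for race conditions" },
});

console.log(listTools);
console.log(callTool);
```

In a real session the client sends an `initialize` request first; this only shows the envelope shape.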
## How It Works
### Consultation Flow
1. **Formulate** — describe the problem concisely (under 500 chars)
2. **Send in parallel** — OpenAI + Gemini get the same prompt
3. **Handle errors** — if a provider fails, the fallback chain kicks in (Qwen → DeepSeek)
4. **Synthesize** — compare responses, find consensus
5. **Iterate (optional)** — resolve disagreements with follow-up questions
6. **Decide** — apply the synthesized solution
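The fan-out step above can be sketched in a few lines of Node. Everything here is illustrative: `askOpenAI` and `askGemini` are hypothetical stand-ins for the real MCP tool calls, not functions exported by this repo.

```javascript
// Illustrative sketch of the fan-out: send one prompt to two
// providers at once and keep whatever succeeds. askOpenAI/askGemini
// are hypothetical stand-ins for the real MCP tool calls.
async function askOpenAI(prompt) {
  return `OpenAI opinion on: ${prompt}`;
}

async function askGemini(prompt) {
  return `Gemini opinion on: ${prompt}`;
}

async function consult(prompt) {
  // Promise.allSettled never rejects, so one failing provider
  // cannot sink the whole consultation.
  const results = await Promise.allSettled([
    askOpenAI(prompt),
    askGemini(prompt),
  ]);
  return {
    opinions: results.filter((r) => r.status === "fulfilled").map((r) => r.value),
    failures: results.filter((r) => r.status === "rejected").length,
  };
}

consult("Review this code for race conditions").then(({ opinions, failures }) => {
  console.log(`${opinions.length} opinions, ${failures} failures`);
});
```

Using `Promise.allSettled` (rather than `Promise.all`) is what lets a quota error on one provider degrade to a single opinion instead of aborting the consultation.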
### Error Detection
All servers detect provider-specific errors and return structured responses:
| Error Type | Meaning | Action |
|---|---|---|
| `QUOTA_EXCEEDED` | Rate/credit limit hit | Use fallback provider |
| `AUTH_EXPIRED` / `AUTH_REQUIRED` | Token needs refresh | Re-authenticate CLI |
| `MODEL_NOT_SUPPORTED` | Model unavailable on plan | Use default model |
| Timeout | Process hung | Auto-killed, use fallback |
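The classification idea can be sketched as a substring scan over CLI stderr. This is a minimal sketch under assumed patterns; the strings the real provider CLIs emit may differ, so treat the regexes as placeholders.

```javascript
// Hypothetical sketch: map raw CLI stderr to the structured error
// types above. The regex patterns are assumptions, not the strings
// the real provider CLIs are guaranteed to emit.
const ERROR_PATTERNS = [
  { type: "QUOTA_EXCEEDED", pattern: /quota|rate limit|429/i },
  { type: "AUTH_EXPIRED", pattern: /token expired|unauthorized|401/i },
  { type: "MODEL_NOT_SUPPORTED", pattern: /model .* not (found|supported)/i },
];

function classifyError(stderr) {
  for (const { type, pattern } of ERROR_PATTERNS) {
    if (pattern.test(stderr)) return type;
  }
  return "UNKNOWN"; // nothing matched; caller decides what to do
}

console.log(classifyError("HTTP 429: rate limit exceeded")); // QUOTA_EXCEEDED
```

Returning a stable error type (instead of raw stderr) is what makes the fallback decision mechanical for the caller.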
### Fallback Chain
```
Primary:    OpenAI ──────────────► Response
                (QUOTA_EXCEEDED?)
                        │
Fallback 1: Qwen ───────┴────────► Response
                (timeout?)
                        │
Fallback 2: DeepSeek ───┴────────► Response  (always available)
```
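The chain reduces to "try providers in order, return the first success". A minimal sketch, assuming hypothetical per-provider ask functions (the real servers wire this to CLI subprocesses):

```javascript
// Hypothetical sketch of the fallback chain: try each provider in
// order and return the first success. The provider functions are
// stand-ins for real CLI-backed calls.
async function withFallback(providers, prompt) {
  const errors = [];
  for (const [name, ask] of providers) {
    try {
      return { provider: name, answer: await ask(prompt) };
    } catch (err) {
      errors.push(`${name}: ${err.message}`); // remember why we fell through
    }
  }
  throw new Error(`All providers failed: ${errors.join("; ")}`);
}

// Example: OpenAI hits its quota, so Qwen answers and DeepSeek is never tried.
const chain = [
  ["openai", async () => { throw new Error("QUOTA_EXCEEDED"); }],
  ["qwen", async (p) => `Qwen says: ${p}`],
  ["deepseek", async (p) => `DeepSeek says: ${p}`],
];

withFallback(chain, "hello").then((r) => console.log(r.provider)); // qwen
```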
## When to Use Concilium
| Scenario | Recommended Agents |
|---|---|
| Code review | OpenAI + Gemini (parallel) |
| Architecture decision | OpenAI + Gemini → iterate if disagree |
| Stuck bug (3+ attempts) | All available agents |
| Performance optimization | Gemini (1M context) + OpenAI |
| Security review | OpenAI + Gemini + manual verification |
## Customization
See `docs/customization.md` for:
- Adding your own LLM provider
- Modifying the fallback chain
- MCP server template
- Custom prompt strategies
## Documentation
- Architecture — flow diagrams, error handling, design decisions
- OpenAI Setup — Codex CLI, ChatGPT Plus, minimal config
- Gemini Setup — gemini-cli, Google OAuth
- Qwen Setup — Qwen CLI, DashScope
- DeepSeek Setup — API key, npm package
- Customization — add your own LLM, modify chains
## License
MIT