# DebateTalk MCP
Official MCP server and CLI for DebateTalk — run structured multi-model AI debates from your AI assistant or terminal.
DebateTalk makes multiple AI models argue a question independently, challenge each other's reasoning, and converge on a structured synthesis: Strong Ground, Fault Lines, Blind Spots, and Your Call.
## Features
- MCP server — connect Claude Desktop, Cursor, or any MCP-compatible client to DebateTalk
- CLI — run debates and check model status from the terminal
- Streaming output — debates stream in real time via SSE
- 5 tools: `run_debate`, `get_model_status`, `recommend_models`, `estimate_cost`, `get_history`
## Quickstart
### Claude Code — plugin marketplace
1. Add the DebateTalk marketplace:

```
/plugin marketplace add DebateTalk-AI/mcp
```

2. Install the plugin:

```
/plugin install debatetalk@debatetalk-mcp
```

3. Set your API key. Get a key at console.debatetalk.ai/api-keys, then add it to `~/.claude/settings.json`:

```json
{
  "pluginConfigs": {
    "debatetalk@debatetalk-mcp": {
      "options": {
        "api_key": "dt_your_key_here"
      }
    }
  }
}
```
Then run `/reload-plugins` — the five DebateTalk tools are immediately available in your session.
### MCP (Claude Desktop, Cursor, Cline, Goose, and other MCP-compatible clients)
**1. Get an API key**

Create a key at console.debatetalk.ai/api-keys. API keys require a Pro or Enterprise plan; the Free tier (5 debates/day) does not include API keys.
**2. Add to your MCP client config**

```json
{
  "mcpServers": {
    "dt": {
      "command": "npx",
      "args": ["-y", "@debatetalk/mcp"],
      "env": {
        "DEBATETALK_API_KEY": "dt_your_key_here"
      }
    }
  }
}
```
Config file locations:

- Claude Desktop (Mac): `~/Library/Application Support/Claude/claude_desktop_config.json`
- Claude Desktop (Windows): `%APPDATA%\Claude\claude_desktop_config.json`
- Claude Code: `~/.claude/settings.json` (under `mcpServers`)
- Cursor: `.cursor/mcp.json` in your project root
- Windsurf: `~/.codeium/windsurf/mcp_config.json`
- Cline / Roo Code: MCP settings panel in the VS Code extension
- Goose: `~/.config/goose/config.yaml` (under `extensions`)
- Other clients: refer to your client's MCP documentation
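For Goose, the stdio entry below mirrors the JSON config above — a minimal sketch only; the exact field names (`cmd`, `args`, `envs`) follow Goose's extension schema and may differ between Goose versions, so check your installed version's documentation:

```yaml
extensions:
  debatetalk:
    enabled: true
    type: stdio
    cmd: npx
    args: ["-y", "@debatetalk/mcp"]
    envs:
      DEBATETALK_API_KEY: dt_your_key_here
```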
**3. Ask your AI assistant to run a debate**

MCP clients read the tool description to decide when to call it — no exact phrasing is required. Any of these work:

- "debate whether we should rewrite our backend in Go"
- "use DT — should we raise our Series A now?"
- "multi-model this: is Rust worth learning in 2026?"
- "stress-test this architecture decision"
- "get a second opinion on moving to microservices"
Claude will also invoke it proactively for high-stakes decisions where a single AI answer is insufficient.
## CLI
Install globally:

```
npm install -g @debatetalk/mcp
```

Set your API key:

```
export DEBATETALK_API_KEY=dt_your_key_here
```

Run a debate:

```
dt debate "Should we adopt microservices?"
```

Check which models are online:

```
dt models
```

Get a recommended model panel for your question:

```
dt recommend "Is Rust worth learning in 2026?"
```

Estimate cost before running:

```
dt cost "Should we raise our Series A now?"
```

View past debates:

```
dt history
dt history --limit 5
```
## MCP Tools Reference
| Tool | Auth required | Description |
|---|---|---|
| `run_debate` | Yes | Run a structured multi-model debate (streaming) |
| `get_model_status` | No | Real-time health and latency for all models |
| `recommend_models` | No | Get the best model panel for your question |
| `estimate_cost` | Yes | Estimate credit cost before running |
| `get_history` | Yes | List your past debates |
### run_debate

| Parameter | Type | Required | Description |
|---|---|---|---|
| `question` | string | required | The question or topic to debate |
| `models` | array | optional | Specific model IDs to use (omit for smart routing) |
| `rounds` | number | optional | Number of deliberation rounds (default: 2) |
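Under the hood, an MCP client invokes the tool with a standard MCP `tools/call` request; the `arguments` object follows the parameters above. A protocol sketch (the client normally constructs this for you):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "run_debate",
    "arguments": {
      "question": "Should we adopt microservices?",
      "rounds": 2
    }
  }
}
```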
### get_model_status

No parameters. Returns live health, latency, and uptime per model.
### recommend_models

| Parameter | Type | Required | Description |
|---|---|---|---|
| `question` | string | required | The question — routing picks the strongest panel |
### estimate_cost

Same parameters as `run_debate`: `question` (string, required), `models` (array, optional), `rounds` (number, optional).
### get_history

| Parameter | Type | Required | Description |
|---|---|---|---|
| `limit` | number | optional | Number of debates to return (default: 20, max: 100) |
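The same `tools/call` shape applies to every tool — for example, fetching the five most recent debates (a protocol sketch, again normally issued by your MCP client):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "get_history",
    "arguments": { "limit": 5 }
  }
}
```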
## Configuration
| Variable | Required | Description |
|---|---|---|
| `DEBATETALK_API_KEY` | For authenticated tools | Your API key from console.debatetalk.ai |

Public tools (`get_model_status`, `recommend_models`) work without an API key.
## Plans & Limits
| Plan | Debates/day | API keys | Debaters |
|---|---|---|---|
| Free | 5 | — | 3 |
| Pro | Unlimited | 2 | 5 |
| Enterprise | Unlimited | Unlimited | 10 |
## Development
```
git clone https://github.com/DebateTalk-AI/mcp
cd mcp
npm install
npm run build
npm test
```
Run the MCP server locally:

```
DEBATETALK_API_KEY=dt_your_key npm run dev:mcp
```

Run the CLI locally:

```
DEBATETALK_API_KEY=dt_your_key npm run dev:cli -- debate "your question"
```
## Contributing
See CONTRIBUTING.md. Issues and PRs welcome.
## License
MIT — see LICENSE.