# @councly/mcp
MCP (Model Context Protocol) server for Councly - Multi-LLM Council Hearings.
Enable Claude Code, Codex, and other MCP-compatible AI assistants to invoke council hearings where multiple LLMs (Claude, GPT, Gemini, Grok) debate topics and synthesize verdicts.
## Installation

```bash
npm install -g @councly/mcp
```

Or use directly with npx:

```bash
npx @councly/mcp
```
## Setup

### 1. Get an API Key
- Sign in to Councly
- Go to Settings > MCP Integration
- Create a new API key
- Copy the key (shown only once)
### 2. Configure Claude Code

Add to your Claude Code settings (`~/.claude/settings.json`):
```json
{
  "mcpServers": {
    "councly": {
      "command": "npx",
      "args": ["@councly/mcp"],
      "env": {
        "COUNCLY_API_KEY": "cnc_your_key_here"
      }
    }
  }
}
```
### 3. Configure Codex CLI

```bash
export COUNCLY_API_KEY=cnc_your_key_here
```

Or add the export to your shell profile.
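Exporting the key only makes it available to the server process; the Councly server itself still has to be registered with Codex. A minimal sketch, assuming Codex CLI's TOML-based MCP configuration in `~/.codex/config.toml` (verify the exact keys against your Codex CLI version's documentation):

```toml
# Hypothetical entry registering the Councly server with Codex CLI.
[mcp_servers.councly]
command = "npx"
args = ["@councly/mcp"]
env = { "COUNCLY_API_KEY" = "cnc_your_key_here" }
```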
## Tools

### councly_hearing
Create a council hearing where multiple LLMs debate a topic.
Parameters:
| Parameter | Type | Required | Default | Description |
|---|---|---|---|---|
| subject | string | Yes | - | The topic to discuss (10-10000 chars) |
| preset | string | No | balanced | Model preset: balanced, fast, coding, coding_plus |
| workflow | string | No | auto | Workflow: auto, discussion, review, brainstorming |
| wait | boolean | No | true | Wait for the hearing to complete before returning |
| timeout_seconds | number | No | 300 | Maximum wait time in seconds (30-600) |
Presets:
| Preset | Credits | Counsels | Best For |
|---|---|---|---|
| balanced | 9 | 3 | General purpose discussions |
| fast | 6 | 3 | Quick responses, simple topics |
| coding | 14 | 3 | Code review, technical decisions |
| coding_plus | 17 | 4 | Complex code problems |
Example:

```
Use councly_hearing with subject="Review this Python function for security issues:
def authenticate(username, password):
    query = f"SELECT * FROM users WHERE username='{username}' AND password='{password}'"
    return db.execute(query)
" and preset="coding"
```
### councly_status
Check the status of a hearing.
Parameters:
| Parameter | Type | Required | Description |
|---|---|---|---|
| hearing_id | string (uuid) | Yes | The hearing ID |
Example:

```
Use councly_status with hearing_id="550e8400-e29b-41d4-a716-446655440000"
```
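When a hearing is created with wait=false (or the timeout elapses first), the result can be fetched later with this tool. A small sketch reusing the connected SDK client from the councly_hearing example above; the hearing ID is the placeholder from these docs, not a real hearing:

```typescript
// `client` is the connected MCP SDK client from the councly_hearing example.
const status = await client.callTool({
  name: "councly_status",
  arguments: { hearing_id: "550e8400-e29b-41d4-a716-446655440000" },
});
console.log(status.content);
```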
## Response Format
Completed hearings return:
- Status: completed, failed, or early_stopped
- Verdict: Synthesized conclusion from the moderator
- Trust Score: 0-100 confidence rating
- Counsel Perspectives: Summary from each counsel
- Cost: Credits used
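The server returns this information as MCP text content; the field names below are illustrative only, sketching how the pieces fit together rather than the server's actual schema:

```typescript
// Illustrative shape only; the actual field names may differ.
interface HearingResult {
  status: "completed" | "failed" | "early_stopped";
  verdict: string;                // synthesized conclusion from the moderator
  trust_score: number;            // 0-100 confidence rating
  counsel_perspectives: Array<{   // summary from each counsel
    counsel: string;
    summary: string;
  }>;
  credits_used: number;           // cost of the hearing
}
```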
## Error Handling
Common errors:
| Code | Description |
|---|---|
| INSUFFICIENT_BALANCE | Not enough credits |
| ACTIVE_HEARING_EXISTS | Another hearing is already in progress |
| RATE_LIMIT_EXCEEDED | Too many requests |
| CONTENT_BLOCKED | Subject contains prohibited content |
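How these codes surface depends on your client. With the SDK client from the examples above, a failed call typically comes back with `isError` set; the code-in-text checks below are an assumption about how this server formats its error messages, not documented behaviour:

```typescript
const res = await client.callTool({
  name: "councly_hearing",
  arguments: { subject: "Compare Postgres and MySQL for a multi-tenant SaaS." },
});

if (res.isError) {
  // Assumption: the error code appears somewhere in the returned content.
  const text = JSON.stringify(res.content);
  if (text.includes("INSUFFICIENT_BALANCE")) {
    console.error("Not enough credits; top up at councly.ai/billing.");
  } else if (text.includes("RATE_LIMIT_EXCEEDED")) {
    console.error("Rate limited; wait before retrying.");
  } else {
    console.error("Hearing failed:", text);
  }
}
```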
## Environment Variables
| Variable | Required | Description |
|---|---|---|
| COUNCLY_API_KEY | Yes | Your MCP API key |
| COUNCLY_BASE_URL | No | API base URL (default: https://councly.ai) |
## Pricing

Councly uses a credit-based pricing model:
- 1 credit = $0.01 USD
- Credits are deducted when the hearing is created
- Failed hearings are refunded

For example, a balanced hearing costs 9 credits ($0.09) and a coding_plus hearing costs 17 credits ($0.17).

Purchase credits at councly.ai/billing.
## License

Apache 2.0. See LICENSE.