# LLM Responses MCP Server
A Model Context Protocol (MCP) server that enables collaborative debates between multiple AI agents, allowing them to discuss and reach consensus on user prompts.
## Overview
This project implements an MCP server that facilitates multi-turn conversations between LLMs with these key features:
- Session-based collaboration - LLMs can register as participants in a debate session
- Deliberative consensus - LLMs can engage in extended discussions to reach agreement
- Real-time response sharing - All participants can view and respond to each other's contributions
The server provides four main tool calls:

- `register-participant`: Allows an LLM to join a collaboration session with its initial response
- `submit-response`: Allows an LLM to submit follow-up responses during the debate
- `get-responses`: Allows an LLM to retrieve all responses from other LLMs in the session
- `get-session-status`: Allows an LLM to check if the registration waiting period has completed
This enables a scenario where multiple AI agents (like the "Council of Ephors") can engage in extended deliberation about a user's question, debating with each other until they reach a solid consensus.
## Installation

```bash
# Install dependencies
bun install
```

## Development

```bash
# Build the TypeScript code
bun run build

# Start the server in development mode
bun run dev
```
## Testing with MCP Inspector

The project includes support for the MCP Inspector, a tool for testing and debugging MCP servers.

```bash
# Run the server with MCP Inspector
bun run inspect
```

The `inspect` script uses `npx` to run the MCP Inspector, which launches a web interface in your browser for interacting with your MCP server.

This allows you to:

- Explore available tools and resources
- Test tool calls with different parameters
- View the server's responses
- Debug your MCP server implementation
## Usage

The server exposes two endpoints:

- `/sse` - Server-Sent Events endpoint for MCP clients to connect
- `/messages` - HTTP endpoint for MCP clients to send messages
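As an illustration, a client that is configured through a JSON file might point at the SSE endpoint roughly like this. The `mcpServers` key, the exact schema, and the host/port are assumptions that vary by client; they are not defined by this server:

```json
{
  "mcpServers": {
    "llm-responses": {
      "url": "http://localhost:62887/sse"
    }
  }
}
```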
## MCP Tools

### register-participant

Register as a participant in a collaboration session:

```typescript
// Example tool call
const result = await client.callTool({
  name: 'register-participant',
  arguments: {
    name: 'Socrates',
    prompt: 'What is the meaning of life?',
    initial_response: 'The meaning of life is to seek wisdom through questioning...',
    persona_metadata: {
      style: 'socratic',
      era: 'ancient greece'
    } // Optional
  }
});
```
After the last participant joins, the server waits out a 3-second registration period before responding. The response includes all participants' initial responses, so each LLM can react to the others' views as soon as the registration period ends.
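One way a quiet period like this can be implemented server-side is with a debounced timer that resets on every registration. The sketch below is illustrative only; the class and method names are assumptions, not this server's actual implementation:

```typescript
// Sketch of a debounced registration window (illustrative, not the server's
// real code). Each new registration resets the timer; once no participant
// has joined for `windowMs`, every pending registration promise resolves.
class RegistrationWindow {
  private timer: ReturnType<typeof setTimeout> | null = null;
  private waiters: Array<() => void> = [];

  constructor(private windowMs: number) {}

  // Called on each register-participant; resolves when the window closes.
  register(): Promise<void> {
    if (this.timer) clearTimeout(this.timer); // a new arrival resets the clock
    this.timer = setTimeout(() => this.close(), this.windowMs);
    return new Promise((resolve) => this.waiters.push(resolve));
  }

  private close(): void {
    const waiters = this.waiters;
    this.waiters = [];
    this.timer = null;
    waiters.forEach((resolve) => resolve());
  }
}
```

With this shape, every participant's tool call blocks on the same window, so all of them receive the compendium of initial responses at the same moment.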
### submit-response

Submit a follow-up response during the debate:

```typescript
// Example tool call
const result = await client.callTool({
  name: 'submit-response',
  arguments: {
    sessionId: 'EPH4721R-Socrates', // Session ID received after registration
    prompt: 'What is the meaning of life?',
    response: 'In response to Plato, I would argue that...'
  }
});
```
### get-responses

Retrieve all responses from the debate session:

```typescript
// Example tool call
const result = await client.callTool({
  name: 'get-responses',
  arguments: {
    sessionId: 'EPH4721R-Socrates', // Session ID received after registration
    prompt: 'What is the meaning of life?' // Optional
  }
});
```
The response includes all participants' contributions in chronological order.
### get-session-status

Check if the registration waiting period has elapsed:

```typescript
// Example tool call
const result = await client.callTool({
  name: 'get-session-status',
  arguments: {
    prompt: 'What is the meaning of life?'
  }
});
```
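Because the registration window is time-based, a client will typically poll this tool until the session is ready. A minimal polling helper might look like the following; the shape of the status check is an assumption, so adapt the `check` callback to whatever `get-session-status` actually returns:

```typescript
// Generic polling helper: repeatedly invoke `check` until it reports true
// or the attempt budget is exhausted. The interval and attempt counts are
// illustrative defaults, not values mandated by this server.
async function waitUntil(
  check: () => Promise<boolean>,
  intervalMs = 500,
  maxAttempts = 20,
): Promise<boolean> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    if (await check()) return true;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  return false; // gave up without the registration window closing
}
```

In practice, `check` would wrap a `get-session-status` tool call and inspect its result for a "ready" indication.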
## Collaborative Debate Flow

1. LLMs register as participants with their initial responses to the prompt
2. The server waits 3 seconds after the last registration before sending responses
3. When the registration period ends, all participants receive the compendium of initial responses from all participants
4. Participants can then submit follow-up responses, responding to each other's points
5. The debate continues until the participants reach a solid consensus or a maximum number of rounds is reached
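From a client's perspective, one round of this flow can be sketched as follows. The `client.callTool` shape follows the examples earlier in this README; the result shapes, the `ToolClient` interface, and the `debateRound` helper are assumptions for illustration:

```typescript
// Client-side sketch of one debate round, using the tool names documented
// above. The structure of each tool's result is an assumption; a real
// client would parse the MCP tool results it actually receives.
interface ToolClient {
  callTool(req: { name: string; arguments: Record<string, unknown> }): Promise<unknown>;
}

async function debateRound(client: ToolClient, prompt: string): Promise<unknown> {
  // 1. Register with an initial response (joins the 3-second window).
  await client.callTool({
    name: 'register-participant',
    arguments: { name: 'Socrates', prompt, initial_response: 'An initial position...' },
  });
  // 2-3. Once the window closes, fetch the current state of the debate.
  const responses = await client.callTool({
    name: 'get-responses',
    arguments: { sessionId: 'EPH4721R-Socrates', prompt },
  });
  // 4. Submit a follow-up that engages with the other participants.
  await client.callTool({
    name: 'submit-response',
    arguments: { sessionId: 'EPH4721R-Socrates', prompt, response: 'In reply...' },
  });
  return responses;
}
```

Steps 3 and 4 would repeat until the participants converge on a consensus.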
## License
MIT
## Deployment to EC2

This project includes Docker configuration for easy deployment to EC2 or any other server environment.

### Prerequisites

- An EC2 instance running Amazon Linux 2 or Ubuntu
- Security group configured to allow inbound traffic on port 62887
- SSH access to the instance
### Deployment Steps

1. Clone the repository to your EC2 instance:

   ```bash
   git clone <your-repository-url>
   cd <repository-directory>
   ```

2. Make the deployment script executable:

   ```bash
   chmod +x deploy.sh
   ```

3. Run the deployment script:

   ```bash
   ./deploy.sh
   ```
The script will:
- Install Docker and Docker Compose if they're not already installed
- Build the Docker image
- Start the container in detached mode
- Display the public URL where your MCP server is accessible
### Manual Deployment

If you prefer to deploy manually:

1. Build the Docker image:

   ```bash
   docker-compose build
   ```

2. Start the container:

   ```bash
   docker-compose up -d
   ```

3. Verify the container is running:

   ```bash
   docker-compose ps
   ```
### Accessing the Server

Once deployed, your MCP server will be accessible at:

- `http://<ec2-public-ip>:62887/sse` - SSE endpoint
- `http://<ec2-public-ip>:62887/messages` - Messages endpoint

Make sure port 62887 is open in your EC2 security group!