Recursive Companion MCP

An MCP (Model Context Protocol) server that implements iterative refinement through self-critique cycles. Inspired by Hank Besser's recursive-companion, this implementation adds incremental processing to avoid timeouts and enable progress visibility.

Features

  • Incremental Refinement: Avoids timeouts by breaking refinement into discrete steps
  • Mathematical Convergence: Uses cosine similarity to measure when refinement is complete
  • Domain-Specific Optimization: Auto-detects and optimizes for technical, marketing, strategy, legal, and financial domains
  • Progress Visibility: Each step returns immediately, allowing UI updates
  • Parallel Sessions: Support for multiple concurrent refinement sessions
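
The domain auto-detection mentioned above is not spelled out in this README; one plausible minimal approach is keyword scoring over the prompt. The sketch below is purely illustrative — the keyword lists and function name are assumptions, not the server's actual logic.

# Hypothetical sketch of domain auto-detection via keyword scoring.
# Keyword lists and names are assumptions, not the server's real implementation.
DOMAIN_KEYWORDS = {
    "technical": ["api", "code", "architecture", "deploy", "database"],
    "marketing": ["brand", "campaign", "audience", "conversion"],
    "strategy": ["roadmap", "market", "competitive", "vision"],
    "legal": ["contract", "liability", "compliance", "clause"],
    "financial": ["revenue", "forecast", "margin", "valuation"],
}

def detect_domain(prompt: str) -> str:
    """Return the domain whose keywords appear most often, or 'general'."""
    text = prompt.lower()
    scores = {d: sum(text.count(k) for k in kws) for d, kws in DOMAIN_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "general"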

How It Works

The refinement process follows a Draft → Critique → Revise → Converge pattern:

  1. Draft: Generate initial response
  2. Critique: Create multiple parallel critiques (using faster models)
  3. Revise: Synthesize critiques into improved version
  4. Converge: Measure similarity and repeat until threshold reached
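
A minimal Python sketch of that loop, assuming generate and embed are thin wrappers around the Bedrock text and embedding calls (both names and the prompt wording are placeholders, not the server's actual code):

import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def refine(prompt, generate, embed, critiques=3, threshold=0.98, max_iterations=10):
    # Draft -> Critique -> Revise, repeated until successive drafts converge.
    draft = generate(f"Answer the following:\n{prompt}")                                 # Draft
    for _ in range(max_iterations):
        notes = [generate(f"Critique this answer:\n{draft}") for _ in range(critiques)]  # Critique
        revised = generate(
            "Revise the answer below using these critiques.\n\n"
            "Critiques:\n" + "\n".join(notes) + f"\n\nAnswer:\n{draft}"
        )                                                                                # Revise
        if cosine_similarity(embed(draft), embed(revised)) >= threshold:                 # Converge
            return revised
        draft = revised
    return draft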

Installation

Prerequisites

  • Python 3.10+
  • uv package manager
  • AWS Account with Bedrock access
  • Claude Desktop app

Setup

  1. Clone the repository:
git clone https://github.com/yourusername/recursive-companion-mcp.git
cd recursive-companion-mcp
  2. Install dependencies:
uv sync
  3. Configure AWS credentials as environment variables or through the AWS CLI (a quick verification sketch follows the config example below).

  4. Add to Claude Desktop config (~/Library/Application Support/Claude/claude_desktop_config.json):

{
  "mcpServers": {
    "recursive-companion": {
      "command": "/path/to/recursive-companion-mcp/run_server.sh",
      "env": {
        "AWS_REGION": "us-east-1",
        "AWS_ACCESS_KEY_ID": "your-key",
        "AWS_SECRET_ACCESS_KEY": "your-secret",
        "BEDROCK_MODEL_ID": "anthropic.claude-3-sonnet-20240229-v1:0",
        "CRITIQUE_MODEL_ID": "anthropic.claude-3-haiku-20240307-v1:0",
        "CONVERGENCE_THRESHOLD": "0.95",
        "PARALLEL_CRITIQUES": "2",
        "MAX_ITERATIONS": "5",
        "REQUEST_TIMEOUT": "600"
      }
    }
  }
}
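
Before pointing Claude Desktop at the server, it can help to confirm that the configured credentials and region actually reach Bedrock. A quick boto3 check along these lines works (the region and the filter on Anthropic models are only for illustration; the model IDs you use must also be enabled in the Bedrock console):

import boto3

# Sanity check: can these credentials list Bedrock foundation models?
bedrock = boto3.client("bedrock", region_name="us-east-1")
models = bedrock.list_foundation_models()["modelSummaries"]
anthropic_ids = [m["modelId"] for m in models if m["providerName"] == "Anthropic"]
print("\n".join(anthropic_ids))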

Usage

The server exposes several MCP tools:

Start a refinement session

Use start_refinement to refine: "Explain the key principles of secure API design"

Continue refinement step by step

Use continue_refinement with session_id "abc123..."

Get final result

Use get_final_result with session_id "abc123..."

Other tools

  • get_refinement_status - Check progress without advancing
  • list_refinement_sessions - See all active sessions
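
Put together, a client drives a session by starting it, stepping it until convergence, then fetching the result. The sketch below assumes a generic call_tool helper and a converged field in the step response; only the tool names and session_id come from this README.

# Sketch of the tool-call sequence; call_tool stands in for whatever MCP client
# is in use, and field names other than session_id are assumptions.
def refine_to_convergence(call_tool, prompt: str, max_steps: int = 20) -> dict:
    session_id = call_tool("start_refinement", {"prompt": prompt})["session_id"]
    for _ in range(max_steps):
        step = call_tool("continue_refinement", {"session_id": session_id})
        if step.get("converged"):   # assumed field indicating convergence
            break
    return call_tool("get_final_result", {"session_id": session_id})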

Configuration

Environment Variable     Default                                    Description
BEDROCK_MODEL_ID         anthropic.claude-3-sonnet-20240229-v1:0    Main generation model
CRITIQUE_MODEL_ID        Same as BEDROCK_MODEL_ID                   Model for critiques (use Haiku for speed)
CONVERGENCE_THRESHOLD    0.98                                       Similarity threshold for convergence (0.90-0.99)
PARALLEL_CRITIQUES       3                                          Number of parallel critiques per iteration
MAX_ITERATIONS           10                                         Maximum refinement iterations
REQUEST_TIMEOUT          300                                        Timeout in seconds
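
On the server side these settings can be read straight from the environment. The snippet below only illustrates the defaults in the table; it is not the server's actual configuration code.

import os

# Illustrative: environment variables from the table above, with their defaults.
BEDROCK_MODEL_ID = os.getenv("BEDROCK_MODEL_ID", "anthropic.claude-3-sonnet-20240229-v1:0")
CRITIQUE_MODEL_ID = os.getenv("CRITIQUE_MODEL_ID", BEDROCK_MODEL_ID)
CONVERGENCE_THRESHOLD = float(os.getenv("CONVERGENCE_THRESHOLD", "0.98"))
PARALLEL_CRITIQUES = int(os.getenv("PARALLEL_CRITIQUES", "3"))
MAX_ITERATIONS = int(os.getenv("MAX_ITERATIONS", "10"))
REQUEST_TIMEOUT = int(os.getenv("REQUEST_TIMEOUT", "300"))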

Performance

With optimized settings:

  • Each iteration: 60-90 seconds
  • Typical convergence: 2-3 iterations
  • Total time: 2-4 minutes (distributed across multiple calls)

Using Haiku for critiques reduces iteration time by ~50%.

Architecture

┌─────────────┐     ┌──────────────┐     ┌─────────────┐
│   Claude    │────▶│  MCP Server  │────▶│   Bedrock   │
│  Desktop    │◀────│              │◀────│   Claude    │
└─────────────┘     └──────────────┘     └─────────────┘
                           │
                           ▼
                    ┌──────────────┐
                    │   Session    │
                    │   Manager    │
                    └──────────────┘
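
The Session Manager is what makes the incremental approach possible: each start_refinement call creates a state object that later calls look up by ID. A minimal sketch of that idea (class and field names are assumptions, not the server's actual types):

import uuid
from dataclasses import dataclass

@dataclass
class RefinementSession:
    # Illustrative state carried between incremental tool calls.
    prompt: str
    current_draft: str = ""
    iteration: int = 0
    converged: bool = False

class SessionManager:
    def __init__(self) -> None:
        self._sessions: dict[str, RefinementSession] = {}

    def create(self, prompt: str) -> str:
        session_id = uuid.uuid4().hex
        self._sessions[session_id] = RefinementSession(prompt=prompt)
        return session_id

    def get(self, session_id: str) -> RefinementSession:
        return self._sessions[session_id]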

Development

Running tests

uv run pytest tests/

Local testing

uv run python test_incremental.py

Attribution

This project is inspired by recursive-companion by Hank Besser. The original implementation provided the conceptual Draft → Critique → Revise → Converge pattern. This MCP version adds:

  • Session-based incremental processing to avoid timeouts
  • AWS Bedrock integration for Claude and Titan embeddings
  • Domain auto-detection and specialized prompts
  • Mathematical convergence measurement
  • Support for different models for critiques vs generation

Contributing

Contributions are welcome! Please read our Contributing Guide for details.

License

MIT License - see LICENSE file for details.
