
A2AMCP - Agent-to-Agent Model Context Protocol


Enabling Seamless Multi-Agent Collaboration for AI-Powered Development

A2AMCP brings Google's Agent-to-Agent (A2A) communication concepts to the Model Context Protocol (MCP) ecosystem, enabling AI agents to communicate, coordinate, and collaborate in real-time while working on parallel development tasks.

Originally created for SplitMind, A2AMCP solves the critical problem of isolated AI agents working on the same codebase without awareness of each other's changes.

✅ Server Status: WORKING! All 17 tools implemented and tested. Uses modern MCP SDK 1.9.3.

🚀 Quick Start

Using Docker (Recommended)

# Clone the repository
git clone https://github.com/webdevtodayjason/A2AMCP
cd A2AMCP

# Start the server
docker-compose up -d

# Verify it's running
docker ps | grep splitmind

# Test the connection
python verify_mcp.py

Configure Your Agents

Claude Code (CLI)

# Add the MCP server using Claude Code CLI
claude mcp add splitmind-a2amcp \
  -e REDIS_URL=redis://localhost:6379 \
  -- docker exec -i splitmind-mcp-server python /app/mcp-server-redis.py

Claude Desktop

Add to your configuration file (~/Library/Application Support/Claude/claude_desktop_config.json on macOS):

{
  "mcpServers": {
    "splitmind-a2amcp": {
      "command": "docker",
      "args": ["exec", "-i", "splitmind-mcp-server", "python", "/app/mcp-server-redis.py"],
      "env": {
        "REDIS_URL": "redis://redis:6379"
      }
    }
  }
}

🎯 What Problem Does A2AMCP Solve?

When multiple AI agents work on the same codebase:

  • Without A2AMCP: Agents create conflicting code, duplicate efforts, and cause merge conflicts
  • With A2AMCP: Agents coordinate, share interfaces, prevent conflicts, and work as a team

Generic Use Cases Beyond SplitMind

A2AMCP can coordinate any multi-agent scenario:

  • Microservices: Different agents building separate services
  • Full-Stack Apps: Frontend and backend agents collaborating
  • Documentation: Multiple agents creating interconnected docs
  • Testing: Test writers coordinating with feature developers
  • Refactoring: Agents working on different modules simultaneously

🏗️ Architecture

┌─────────────────┐
│   A2AMCP Server │ ← Persistent Redis-backed MCP server
│   (Port 5050)   │   handling all agent communication
└────────┬────────┘
         │ STDIO Protocol (MCP)
    ┌────┴────┬─────────┬─────────┐
    ▼         ▼         ▼         ▼
┌────────┐┌────────┐┌────────┐┌────────┐
│Agent 1 ││Agent 2 ││Agent 3 ││Agent N │
│Auth    ││Profile ││API     ││Frontend│
└────────┘└────────┘└────────┘└────────┘

🔧 Core Features

1. Real-time Agent Communication

  • Direct queries between agents
  • Broadcast messaging
  • Async message queues
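
To make the flow concrete, the sketch below reuses the tool-call style from the Direct MCP Tool Usage section later in this README. The query_agent call follows the signature shown there; the broadcast call is an assumed tool name included only to illustrate the shape of a broadcast, and the actual tool may be named or parameterized differently.

# Ask a specific agent a question and wait for its reply
# (project, own task ID, target task ID, query type, question)
query_agent("my-project", "task-001", "task-002", "interface", "What's the User schema?")

# Broadcast a status update to every agent in the project
# NOTE: assumed tool name and signature, shown for illustration only
broadcast_message("my-project", "task-001", "info", "Auth endpoints are ready to integrate")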

2. File Conflict Prevention

  • Automatic file locking
  • Conflict detection
  • Negotiation strategies
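
The Python SDK's file coordination helper (shown in full in the Usage Example section) is the entry point for this. A minimal sketch of the pattern, using only the documented agent.files.coordinate call:

# Request the lock on a shared file; if another agent already holds it,
# coordination (waiting or negotiating) happens before the block is entered
async with agent.files.coordinate("src/models/user.ts") as file:
    # The file is locked here, so edits cannot collide with other agents
    ...
# The lock is released automatically when the block exits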

3. Shared Context Management

  • Interface/type registry
  • API contract sharing
  • Dependency tracking
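
In tool-call form (both calls appear in the Direct MCP Tool Usage section below), one agent can publish a type definition and another can ask its owner about the contract before depending on it:

# Publish a shared type so other agents can build against it
register_interface("my-project", "task-001", "User", "interface User { id: string; email: string; }")

# A second agent queries the owner of task-001 about the contract
query_agent("my-project", "task-002", "task-001", "interface", "What fields does User expose?")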

4. Task Transparency

  • Todo list management
  • Progress visibility
  • Completion tracking
  • Task completion signaling
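
This README does not list the individual todo and completion tools, so the names below are assumed and shown purely to illustrate the intended flow, not the actual API.

# NOTE: assumed tool names, for illustration only
add_todo("my-project", "task-001", "Write unit tests for the auth middleware")
update_todo("my-project", "task-001", "Write unit tests for the auth middleware", "completed")
mark_task_complete("my-project", "task-001")  # signals other agents that this task is finished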

5. Multi-Project Support

  • Isolated project namespaces
  • Redis-backed persistence
  • Automatic cleanup
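
Isolation is keyed on the project name passed as the first argument to every tool, so one server can host unrelated projects side by side (register_agent signature taken from the Direct MCP Tool Usage section):

# Agents registered under different project names are namespaced apart
register_agent("shop-frontend", "task-001", "001", "feature/cart", "Building the cart UI")
register_agent("analytics-api", "task-001", "001", "feature/ingest", "Building the ingest endpoint")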

6. Modern MCP Integration

  • Uses MCP SDK 1.9.3 with proper decorators
  • @server.list_tools() and @server.call_tool() patterns
  • STDIO-based communication protocol
  • Full A2AMCP API compliance with 17 tools implemented
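
For orientation, the sketch below shows roughly what the decorator pattern from the MCP Python SDK looks like in a STDIO server such as this one. It is a trimmed illustration of the SDK pattern named above, not the actual server source; the example tool and schema are placeholders.

# Minimal sketch of the MCP SDK decorator pattern over STDIO (illustrative, not the real server)
import asyncio
import mcp.types as types
from mcp.server import Server
from mcp.server.stdio import stdio_server

server = Server("splitmind-a2amcp")

@server.list_tools()
async def list_tools() -> list[types.Tool]:
    # Advertise available tools to the connected agent
    return [
        types.Tool(
            name="register_agent",
            description="Register an agent for a project and task",
            inputSchema={"type": "object", "properties": {"project_id": {"type": "string"}}},
        )
    ]

@server.call_tool()
async def call_tool(name: str, arguments: dict) -> list[types.TextContent]:
    # Dispatch the tool call; the real server would read and write Redis here
    return [types.TextContent(type="text", text=f"{name} acknowledged")]

async def main():
    async with stdio_server() as (read_stream, write_stream):
        await server.run(read_stream, write_stream, server.create_initialization_options())

if __name__ == "__main__":
    asyncio.run(main())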

📦 Installation Options

Docker Compose (Production)

services:
  mcp-server:
    build: .
    container_name: splitmind-mcp-server
    ports:
      - "5050:5000"  # Changed from 5000 to avoid conflicts
    environment:
      - REDIS_URL=redis://redis:6379
      - LOG_LEVEL=INFO
    depends_on:
      redis:
        condition: service_healthy
    restart: unless-stopped
  
  redis:
    image: redis:7-alpine
    container_name: splitmind-redis
    ports:
      - "6379:6379"
    volumes:
      - redis-data:/data
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 10s
      timeout: 5s
      retries: 5

volumes:
  redis-data:
    driver: local

Python SDK

pip install a2amcp-sdk

JavaScript/TypeScript SDK (Coming Soon)

npm install @a2amcp/sdk

🚦 Usage Example

Python SDK

from a2amcp import A2AMCPClient, Project, Agent

async def run_agent():
    client = A2AMCPClient("localhost:5050")
    project = Project(client, "my-app")
    
    async with Agent(project, "001", "feature/auth", "Build authentication") as agent:
        # Agent automatically registers and maintains heartbeat
        
        # Coordinate file access
        async with agent.files.coordinate("src/models/user.ts") as file:
            # File is locked, safe to modify
            pass
        # File automatically released
        
        # Share interfaces
        await project.interfaces.register(
            agent.session_name,
            "User",
            "interface User { id: string; email: string; }"
        )

Direct MCP Tool Usage

# Register agent
register_agent("my-project", "task-001", "001", "feature/auth", "Building authentication")

# Query another agent
query_agent("my-project", "task-001", "task-002", "interface", "What's the User schema?")

# Share interface
register_interface("my-project", "task-001", "User", "interface User {...}")
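
When the server is wired into Claude Code or Claude Desktop with the configuration shown earlier, these tools surface under the server's namespace, for example mcp__splitmind-a2amcp__register_agent (this is the prefix referenced in the Troubleshooting section below).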

📚 Documentation

🛠️ SDKs and Tools

Available Now

  • Python SDK: Full-featured SDK with async support
  • Docker Deployment: Production-ready containers

In Development

  • JavaScript/TypeScript SDK: For Node.js and browser
  • CLI Tools: Command-line interface for monitoring
  • Go SDK: High-performance orchestration
  • Testing Framework: Mock servers and test utilities

See SDK Development Progress for details.

🤝 Integration with AI Frameworks

A2AMCP is designed to work with:

  • SplitMind - Original use case
  • Claude Code (via MCP)
  • Any MCP-compatible AI agent
  • Future: LangChain, CrewAI, AutoGen

🔍 How It Differs from A2A

While inspired by Google's A2A protocol, A2AMCP makes specific design choices for AI code development:

Feature      Google A2A          A2AMCP
Protocol     HTTP-based          MCP tools
State        Stateless           Redis persistence
Focus        Generic tasks       Code development
Deployment   Per-agent servers   Single shared server

🚀 Roadmap

  • [x] Core MCP server with Redis
  • [x] Modern MCP SDK 1.9.3 integration
  • [x] Fixed decorator patterns (@server.list_tools(), @server.call_tool())
  • [x] Python SDK
  • [x] Docker deployment
  • [x] All 17 A2AMCP API tools implemented and tested
  • [x] Health check endpoint for monitoring
  • [x] Verification script for testing connectivity
  • [ ] JavaScript/TypeScript SDK
  • [ ] CLI monitoring tools
  • [ ] SplitMind native integration
  • [ ] Framework adapters (LangChain, CrewAI)
  • [ ] Enterprise features

🛠️ Troubleshooting

Agents can't see mcp__splitmind-a2amcp__ tools

  1. Restart Claude Desktop - MCP connections are established at startup
  2. Verify server is running: docker ps | grep splitmind
  3. Check health endpoint: curl http://localhost:5050/health
  4. Run verification script: python verify_mcp.py
  5. Check configuration: Ensure ~/Library/Application Support/Claude/claude_desktop_config.json contains the A2AMCP server configuration

Common Issues

  • "Tool 'X' not yet implemented" - Fixed in latest version, pull latest changes
  • Connection failed - Ensure Docker is running and ports 5050/6379 are free
  • Redis connection errors - Wait for Redis to be ready (takes ~5-10 seconds on startup)

🤝 Contributing

We welcome contributions! See CONTRIBUTING.md for guidelines.

Development Setup

# Clone repository
git clone https://github.com/webdevtodayjason/A2AMCP
cd A2AMCP

# Install dependencies
pip install -r requirements.txt

# Run tests
pytest

# Start development server
docker-compose -f docker-compose.dev.yml up

📊 Performance

  • Handles 100+ concurrent agents
  • Sub-second message delivery
  • Automatic cleanup of dead agents
  • Horizontal scaling ready

🔒 Security

  • Project isolation
  • Optional authentication (coming soon)
  • Encrypted communication (roadmap)
  • Audit logging

📄 License

MIT License - see LICENSE file.

🙏 Acknowledgments

📞 Support


A2AMCP - Turning isolated AI agents into coordinated development teams
