# AI Collaboration MCP Server

A streamlined Model Context Protocol (MCP) server that provides enhanced AI collaboration tools for VS Code, with automatic project context injection and persistent conversation history.
## 🚀 Features
- **Multi-Provider Support**: Claude, GPT-4, Gemini, and Ollama
- **Workspace-Specific Conversation History**: Each project gets isolated conversation memory
- **Automatic Context Injection**: Project files, structure, and README automatically included
- **Dynamic Workspace Management**: Switch between projects seamlessly
- **API Call Management**: Rate limiting (3 calls per provider per hour)
- **Streamlined Tools**: Just 4 essential tools that work together
## 🛠️ Tools Available
### 1. `#set_workspace`
Set the current workspace directory for project-specific conversation history and context.
**Usage in VS Code:**

```
@workspace use #set_workspace with workspace_path="/path/to/your/project"
```
### 2. `#consult_ai`
Get expert advice from a specific AI provider with full project context.
**Usage in VS Code:**

```
@workspace use #consult_ai with claude about error handling best practices
```
### 3. `#multi_ai_research`
Get perspectives from multiple AI providers on complex questions.
**Usage in VS Code:**

```
@workspace use #multi_ai_research to analyze authentication approaches
```
### 4. `#mandatory_execute`
Force tool execution with explicit commands.
**Usage in VS Code:**

```
@workspace !consult_ai
@workspace use #multi_ai_research
```
## 📦 Installation
### Prerequisites
- Node.js 18+
- VS Code with MCP support
- API keys for desired AI providers
### 1. Clone and Setup

```bash
git clone https://github.com/yourusername/ai-collaboration-mcp-server.git
cd ai-collaboration-mcp-server
npm install
```
### 2. Configure Environment Variables

Create a `.env` file:

```bash
# AI Provider API Keys (add the ones you want to use)
ANTHROPIC_API_KEY=your_claude_key_here
OPENAI_API_KEY=your_openai_key_here
GEMINI_API_KEY=your_gemini_key_here

# Ollama Configuration (for local AI)
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL=llama3.2:latest
```
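Only the providers you configure need keys. As an illustration of how a server like this might decide which providers are enabled from the environment (the function and provider names below are assumptions, not the server's actual API):

```typescript
// Illustrative sketch: derive the set of usable providers from env vars.
// A missing key simply disables that provider rather than failing startup.
type Provider = "claude" | "gpt-4" | "gemini" | "ollama";

function enabledProviders(env: Record<string, string | undefined>): Provider[] {
  const providers: Provider[] = [];
  if (env.ANTHROPIC_API_KEY) providers.push("claude");
  if (env.OPENAI_API_KEY) providers.push("gpt-4");
  if (env.GEMINI_API_KEY) providers.push("gemini");
  if (env.OLLAMA_BASE_URL) providers.push("ollama");
  return providers;
}
```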
### 3. Build the Server

```bash
npm run build
```
### 4. Configure VS Code MCP

**Option A: Workspace-specific (recommended for testing)**

Create `.vscode/mcp.json` in your project:

```json
{
  "servers": {
    "ai-collaboration": {
      "type": "stdio",
      "command": "node",
      "args": ["/path/to/ai-collaboration-mcp-server/build/index.js"],
      "env": {
        "ANTHROPIC_API_KEY": "your_key_here",
        "OPENAI_API_KEY": "your_key_here",
        "GEMINI_API_KEY": "your_key_here",
        "OLLAMA_BASE_URL": "http://localhost:11434"
      }
    }
  }
}
```
**Option B: Global configuration (for all projects)**

Create `~/.vscode/mcp.json`:

```json
{
  "servers": {
    "ai-collaboration": {
      "type": "stdio",
      "command": "node",
      "args": ["/absolute/path/to/ai-collaboration-mcp-server/build/index.js"],
      "env": {
        "ANTHROPIC_API_KEY": "your_key_here",
        "OPENAI_API_KEY": "your_key_here",
        "GEMINI_API_KEY": "your_key_here",
        "OLLAMA_BASE_URL": "http://localhost:11434"
      }
    }
  }
}
```
### 5. Enable MCP Auto-start (Optional)

Add to your VS Code `settings.json`:

```json
{
  "chat.mcp.autostart": "newAndOutdated"
}
```
## 🎯 Usage
### First-Time Setup Per Project

1. Restart VS Code after configuration
2. Open the VS Code chat (sidebar or `Cmd+Shift+I`)
3. Set the workspace for your project:

```
@workspace use #set_workspace with workspace_path="/full/path/to/your/project"
```
### Daily Usage

Use the AI tools:

```
@workspace use #consult_ai with claude about my code
@workspace use #multi_ai_research to compare approaches
@workspace !consult_ai          (force execution)
```
### When Switching Projects

Set the new workspace:

```
@workspace use #set_workspace with workspace_path="/path/to/other/project"
```
💡 **Tip:** Each project gets its own `.mcp-conversation-history.json` file for isolated conversation memory.
## ⚠️ Important Syntax Note

When using `@workspace` in VS Code, MCP tool names must be prefixed with `#`:

- ✅ Correct: `@workspace use #consult_ai with claude about my code`
- ❌ Wrong: `@workspace use consult_ai with claude about my code`

Without `@workspace`, no `#` is needed:

- ✅ Also correct: `use consult_ai with claude about my code`
## 🔧 Development

### Run in Development Mode

```bash
npm run dev
```

### Test the Server

```bash
npm test
```

### Debug with MCP Inspector

```bash
npx @modelcontextprotocol/inspector node build/index.js
```
## 🧠 How It Works

### Enhanced Context Injection
Every tool call automatically includes:
- Project structure and files
- README and package.json content
- Relevant conversation history
- Current workspace context
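Conceptually, the injection step above amounts to assembling those pieces into one prompt preamble before each provider call. A minimal sketch, assuming a `buildContext` helper (the name and exact behavior are illustrative, not the server's actual implementation):

```typescript
import * as fs from "fs";
import * as path from "path";

// Illustrative only: gather top-level file names plus README and
// package.json contents into a single context string for the AI prompt.
function buildContext(workspacePath: string): string {
  const parts: string[] = [];
  const files = fs.readdirSync(workspacePath);
  parts.push(`Project files: ${files.join(", ")}`);
  for (const name of ["README.md", "package.json"]) {
    const p = path.join(workspacePath, name);
    if (fs.existsSync(p)) {
      parts.push(`--- ${name} ---\n${fs.readFileSync(p, "utf8")}`);
    }
  }
  return parts.join("\n\n");
}
```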
### Conversation History

- Persistent file-based history (`.mcp-conversation-history.json`)
- Smart relevance filtering
- Cross-session context continuity
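The persistence layer can be sketched as a small JSON-backed store, one file per workspace. The `HistoryStore` class and entry fields below are assumptions for illustration; the server's actual schema may differ:

```typescript
import * as fs from "fs";
import * as path from "path";

// Illustrative entry shape; the real file format is internal to the server.
interface HistoryEntry {
  timestamp: string;
  provider: string;
  question: string;
  answer: string;
}

// Reads and appends entries in a per-workspace JSON file, giving each
// project its own isolated conversation memory across sessions.
class HistoryStore {
  private file: string;

  constructor(workspacePath: string) {
    this.file = path.join(workspacePath, ".mcp-conversation-history.json");
  }

  load(): HistoryEntry[] {
    if (!fs.existsSync(this.file)) return [];
    return JSON.parse(fs.readFileSync(this.file, "utf8"));
  }

  append(entry: HistoryEntry): void {
    const entries = this.load();
    entries.push(entry);
    fs.writeFileSync(this.file, JSON.stringify(entries, null, 2));
  }
}
```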
### API Management
- Rate limiting per provider (3 calls/hour)
- Automatic retry with exponential backoff
- Clear error handling and user feedback
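A rolling-window limiter and capped exponential backoff like those described above might look as follows. This is a minimal sketch under stated assumptions (the `RateLimiter` class, method names, and backoff constants are illustrative, not the server's implementation):

```typescript
const MAX_CALLS = 3;                  // 3 calls per provider...
const WINDOW_MS = 60 * 60 * 1000;     // ...per rolling hour

// Tracks call timestamps per provider and rejects calls over the limit.
class RateLimiter {
  private calls = new Map<string, number[]>();

  allow(provider: string, now: number = Date.now()): boolean {
    // Drop timestamps that have aged out of the window.
    const recent = (this.calls.get(provider) ?? []).filter(
      (t) => now - t < WINDOW_MS
    );
    if (recent.length >= MAX_CALLS) {
      this.calls.set(provider, recent);
      return false;
    }
    recent.push(now);
    this.calls.set(provider, recent);
    return true;
  }
}

// Illustrative exponential backoff delay: 1s, 2s, 4s, ... capped at 30s.
function backoffDelayMs(attempt: number): number {
  return Math.min(1000 * 2 ** attempt, 30_000);
}
```

Each provider has its own budget, so exhausting Claude's three calls does not block a Gemini call.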
## 🔒 Security Notes
- API keys are stored in MCP configuration (keep them secure)
- Conversation history is stored locally
- No data sent to external services except AI provider APIs
## 🤝 Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Test thoroughly
5. Submit a pull request
## 📄 License
MIT License - see LICENSE file for details
## 🆘 Troubleshooting

### MCP Server Won't Start

- Check `Cmd+Shift+P` → "MCP: List Servers"
- Verify file paths in the configuration
- Check the VS Code Output panel for errors
- Ensure Node.js and dependencies are installed
### API Keys Not Working
- Verify keys are correctly set in MCP configuration
- Check for typos or extra spaces
- Ensure keys have proper permissions
### Tools Not Appearing

- Restart VS Code completely
- Try `@workspace` in chat to trigger MCP loading
- Check the MCP server logs for errors
## 🌟 Why This Approach?

This streamlined server demonstrates that smart consolidation beats feature proliferation:

- **4 core tools** instead of 7+ specialized ones
- **Enhanced context** shared across all tools
- **Easier maintenance** and debugging
- **Better user experience** with consistent functionality
- **Reduced cognitive load**: focus on what you want, not which tool to use
Perfect for teams wanting powerful AI collaboration without complexity!