AI Peer Review MCP Server

Enhance your local LLM responses with real-time peer review from Google Gemini

A Model Context Protocol (MCP) server that enables local language models to request peer review feedback from Google Gemini, dramatically improving response quality through AI collaboration.

🌟 Features

  • Real-time peer review from Google Gemini for any local LLM response
  • Manual trigger system - user controls when to request peer review
  • Detailed feedback analysis - accuracy, completeness, clarity, and improvement suggestions
  • Comprehensive logging - see exactly what feedback Gemini provides
  • Privacy-conscious - only shares content when explicitly requested
  • Free to use - leverages Google Gemini's free tier
  • Easy integration - works with any MCP-compatible local LLM setup

🎯 Use Cases

  • Fact-checking complex or technical responses
  • Quality improvement for educational content
  • Writing enhancement for creative tasks
  • Technical validation for coding explanations
  • Research assistance with multiple AI perspectives

📋 Prerequisites

  • Python 3.8+ installed on your system
  • LMStudio (or another MCP-compatible LLM client)
  • Google AI Studio account (free) for Gemini API access
  • Local LLM with tool calling support (e.g., Llama 3.1, Mistral, Qwen)

🚀 Quick Start

1. Get Google Gemini API Key

  1. Visit Google AI Studio
  2. Sign in with your Google account
  3. Click "Get API key" → "Create API key in new project"
  4. Copy your API key (starts with AIza...)

2. Install the MCP Server

# Clone or create project directory
git clone https://github.com/your-repo/ai-peer-review-mcp # Replace with the actual repo URL
cd ai-peer-review-mcp

# Create a virtual environment (recommended)
python3 -m venv venv
source venv/bin/activate  # On Windows use `venv\Scripts\activate`

# Install dependencies
pip install -r requirements.txt

# Create environment file
cp .env.example .env
# Now, edit the .env file and add your API key:
# GEMINI_API_KEY=your_actual_api_key_here

3. Review Server Files

requirements.txt:

requests
python-dotenv

server.py: (See full code in the repository)
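Since the full server.py lives in the repository, here is only a minimal sketch of its key-loading logic (assuming the python-dotenv package from requirements.txt; the real server.py may differ):

```python
# Sketch: load GEMINI_API_KEY from .env, falling back to plain
# environment variables if python-dotenv is not installed.
import os

try:
    from dotenv import load_dotenv
    load_dotenv()  # reads .env from the current working directory
except ImportError:
    pass  # rely on the environment directly

def get_api_key() -> str:
    """Return the Gemini API key, or fail loudly with a hint."""
    key = os.environ.get("GEMINI_API_KEY", "")
    if not key:
        raise RuntimeError(
            "GEMINI_API_KEY not found - check your .env file or MCP config"
        )
    return key
```

The try/except fallback means the same code works whether the key comes from .env or from the `env` block in the LMStudio MCP configuration.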

4. Configure LMStudio (or another supported MCP host, e.g., Claude Desktop)

Add this configuration to your LMStudio MCP settings:

{
  "mcpServers": {
    "ai-peer-review": {
      "command": "python",
      "args": ["server.py"],
      "cwd": "/path/to/your/ai-peer-review-mcp",
      "env": {
        "GEMINI_API_KEY": "your_actual_api_key_here"
      }
    }
  }
}

Finding MCP Settings in LMStudio:

  • Look for: Settings โ†’ MCP Servers
  • Or: Tools & Integrations โ†’ MCP Configuration
  • Or: Program button โ†’ Edit MCP JSON

5. Test the Setup

  1. Restart LMStudio after adding the MCP configuration
  2. Start a new chat in LMStudio
  3. Ask any question: "What is quantum computing?"
  4. Request peer review: "Use the ai_peer_review tool to check and improve your answer"
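Under the hood, the request LMStudio sends to the server over stdio is a JSON-RPC `tools/call` message, roughly like the following (the argument names here are illustrative; the actual schema is defined by server.py's tool declaration):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ai_peer_review",
    "arguments": {
      "question": "What is quantum computing?",
      "response": "the model's initial answer"
    }
  }
}
```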

📚 Usage Examples

Basic Usage

User: What causes climate change?

LLM: [Provides initial response about greenhouse gases...]

User: Use AI Peer Review to verify and improve that answer

LLM: [Calls ai_peer_review tool, receives feedback, provides enhanced response]

Technical Questions

User: Explain how neural networks work

LLM: [Initial technical explanation...]

User: Can you use ai_peer_review to make sure the explanation is accurate?

LLM: [Enhanced response with better technical details and examples]

Creative Tasks

User: Write a short story about AI

LLM: [Initial creative writing...]

User: Use peer review to improve the story structure and clarity

LLM: [Improved story with better narrative flow and character development]

🔧 Configuration Options

Environment Variables

  • GEMINI_API_KEY - Your Google Gemini API key (required)

Customization

You can modify the peer review prompt in server.py to focus on specific aspects:

review_prompt = f"""PEER REVIEW REQUEST:
# Customize this section for your specific needs
# Examples:
# - Focus on technical accuracy for coding questions
# - Emphasize creativity for writing tasks
# - Prioritize safety for medical/legal topics
...
"""

📊 Monitoring and Logs

The server creates detailed logs in mcp-server.log:

# Watch logs in real-time
tail -f mcp-server.log

# View recent activity
tail -n 50 mcp-server.log

Log Information Includes:

  • Tool calls from LMStudio
  • Requests sent to Gemini
  • Raw Gemini responses
  • Parsed feedback
  • Error details
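A minimal sketch of a logging setup that would produce these entries (the format string is an assumption, not necessarily the one server.py uses):

```python
# Sketch: write timestamped entries to mcp-server.log.
import logging

logging.basicConfig(
    filename="mcp-server.log",
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
    force=True,  # Python 3.8+: replace any existing handlers
)

# Example entries mirroring the categories listed above.
logging.info("tool call received: %s", "ai_peer_review")
logging.info("sending request to Gemini (%d chars)", 1234)
```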

๐Ÿ› Troubleshooting

Common Issues

"Tool not available"

  • Verify MCP server configuration in LMStudio
  • Ensure your local model supports tool calling
  • Restart LMStudio after configuration changes

"GEMINI_API_KEY not found"

  • Check your .env file exists and has the correct key
  • Verify API key is valid in Google AI Studio
  • Ensure environment variable is properly set in LMStudio config

"Rate limit exceeded"

  • Google Gemini free tier has generous limits
  • Wait a moment and try again
  • Check Google AI Studio quota usage

"Model not found"

  • API model names change over time
  • Update GEMINI_API_URL in server.js if needed
  • Check Google's latest API documentation

Debug Mode

Run the server manually to see detailed output. Make sure your virtual environment is active.

export GEMINI_API_KEY=your_api_key_here
python server.py
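To isolate API problems from MCP problems, you can also check the key and model name directly from Python. This is a sketch using only the standard library; it assumes the public generateContent REST endpoint and the gemini-1.5-flash model name, both of which may change:

```python
# Standalone Gemini API sanity check, independent of LMStudio.
import json
import os
import urllib.error
import urllib.request

def gemini_endpoint(model: str = "gemini-1.5-flash") -> str:
    """Build the generateContent URL for a given model."""
    return ("https://generativelanguage.googleapis.com/v1beta/"
            f"models/{model}:generateContent")

def check_key(api_key: str, timeout: float = 30.0) -> bool:
    """Send a one-word prompt; True means the key and model were accepted."""
    body = json.dumps(
        {"contents": [{"parts": [{"text": "ping"}]}]}
    ).encode("utf-8")
    req = urllib.request.Request(
        f"{gemini_endpoint()}?key={api_key}",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status == 200
    except urllib.error.HTTPError:
        return False  # bad key, bad model name, or quota exhausted
```

Run `check_key(os.environ["GEMINI_API_KEY"])` in a REPL: if it returns False, the problem is on the API side, not in your MCP setup.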

🔒 Privacy and Security

  • Data sharing only on request - content is only sent to Gemini when explicitly triggered
  • Local logging only - conversation content is kept only in your local mcp-server.log, never in a remote database
  • API key security - keep your Gemini API key private and secure
  • Local processing - MCP runs entirely on your machine

🚧 Limitations

  • Requires tool-calling models - basic instruction-following models won't work
  • Internet connection required - needs access to Google Gemini API
  • Rate limits - subject to Google Gemini API quotas (free tier is generous)
  • Language support - optimized for English, other languages may work but aren't tested

๐Ÿ›ฃ๏ธ Roadmap

  • [ ] Multi-provider support - Add Groq, DeepSeek, and other AI APIs
  • [ ] Smart routing - Automatic provider selection based on question type
  • [ ] Confidence thresholds - Auto-trigger peer review for uncertain responses
  • [ ] Custom review templates - Domain-specific review criteria
  • [ ] Usage analytics - Track improvement metrics and API usage
  • [ ] Batch processing - Review multiple responses at once

๐Ÿค Contributing

We welcome contributions! Here's how to help:

Development Setup

git clone https://github.com/your-repo/ai-peer-review-mcp # Replace with your repo URL
cd ai-peer-review-mcp

# Create and activate virtual environment
python3 -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt

# Set up your environment
cp .env.example .env
# --> Add your GEMINI_API_KEY to the .env file

echo "Development environment ready. Run with 'python server.py'"

Ways to Contribute

  • ๐Ÿ› Bug reports - Open issues for any problems you encounter
  • ๐Ÿ’ก Feature requests - Suggest new capabilities or improvements
  • ๐Ÿ“– Documentation - Improve setup guides, add examples
  • ๐Ÿ”ง Code contributions - Submit pull requests for fixes or features
  • ๐Ÿงช Testing - Try with different models and report compatibility
  • ๐ŸŒ Localization - Help support more languages

Contribution Guidelines

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Make your changes with clear, descriptive commits
  4. Add tests if applicable
  5. Update documentation for any new features
  6. Submit a pull request with a clear description

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

๐Ÿ™ Acknowledgments

  • Anthropic - For creating the Model Context Protocol standard
  • Google - For providing the Gemini API
  • LMStudio - For excellent MCP integration
  • Community contributors - Everyone who helps improve this project

🌟 Star History

If this project helps you, please consider giving it a star on GitHub! ⭐


Made with ❤️ for the AI community
