
# 🚀 Prompt Optimizer MCP

A Model Context Protocol (MCP) server that provides intelligent tools for optimizing and scoring LLM prompts using deterministic heuristics.
## 🎯 Overview

The Prompt Optimizer MCP server offers two powerful tools:

- `optimize_prompt` - Generate 3 optimized variants of a raw LLM prompt in different styles
- `score_prompt` - Evaluate the effectiveness of an improved prompt relative to the original

Perfect for developers, content creators, and AI practitioners who want to improve their prompt engineering workflow.
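Inferred from the usage examples later in this README, the two tool signatures look roughly like this (a sketch, not the exact definitions in the source):

```python
# Rough signatures inferred from the usage examples in this README;
# the exact definitions live in server.py / tools/optimize.py.
def optimize_prompt(raw_prompt: str, style: str) -> list[str]:
    """Return 3 optimized variants of raw_prompt in the given style."""
    ...

def score_prompt(raw_prompt: str, improved_prompt: str) -> float:
    """Return an effectiveness score (0.0 to 1.0) for the improved prompt."""
    ...
```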
## ✨ Features

### 🎨 Prompt Optimization Styles

- **Creative**: Enhanced with descriptive adjectives and engaging language
- **Precise**: Concise and focused, removing redundant words
- **Fast**: Optimized for quick processing with shorter synonyms
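As a rough illustration of the precise style's spirit, a filler-word filter might look like the sketch below (the word list and function name are illustrative; the real logic lives in `tools/optimize.py`):

```python
# Illustrative sketch of a "precise"-style pass; not the actual implementation.
FILLER_WORDS = {"please", "very", "really", "just"}  # assumed filler list

def precise_variant(prompt: str) -> str:
    """Drop redundant filler words to produce a tighter prompt."""
    kept = [w for w in prompt.split() if w.lower() not in FILLER_WORDS]
    result = " ".join(kept)
    return result[:1].upper() + result[1:]  # re-capitalize the first word

print(precise_variant("Please write a very detailed explanation"))
# -> "Write a detailed explanation"
```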
### 📊 Intelligent Scoring Algorithm

The scoring system evaluates prompts based on:

- **Length optimization (40%)**: Prefers shorter, more concise prompts
- **Keyword preservation (30%)**: Maintains important terms from the original
- **Clarity improvement (30%)**: Reduces redundancy and improves structure
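A minimal sketch of how such a weighted heuristic could be combined, assuming the 40/30/30 split above (the metrics in `tools/optimize.py` may be computed differently):

```python
# Minimal sketch of a weighted heuristic score; assumes the 40/30/30
# weights above, with illustrative per-metric formulas.
def heuristic_score(raw: str, improved: str) -> float:
    raw_words = raw.lower().split()
    imp_words = improved.lower().split()

    # Length optimization: reward improved prompts that are shorter.
    length = min(len(raw_words) / max(len(imp_words), 1), 1.0)

    # Keyword preservation: fraction of original terms that survive.
    keywords = set(raw_words)
    preserved = len(keywords & set(imp_words)) / max(len(keywords), 1)

    # Clarity: penalize repeated words as a crude redundancy proxy.
    clarity = len(set(imp_words)) / max(len(imp_words), 1)

    return round(0.4 * length + 0.3 * preserved + 0.3 * clarity, 2)
```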
### 🔧 Technical Features

- ✅ **Stateless**: No external dependencies or state management
- ✅ **Deterministic**: Same inputs always produce the same outputs
- ✅ **Robust**: Comprehensive input validation and error handling
- ✅ **Fast**: Simple heuristics for quick processing
- ✅ **Extensible**: Easy to add new styles and scoring metrics
- ✅ **Dual Transport**: Supports both STDIO (MCP) and HTTP (deployment)
## 📁 Project Structure

```
prompt-optimizer-mcp/
├── 📄 README.md            # This file
├── 📄 server.py            # Main MCP server (STDIO transport)
├── 📄 http_server.py       # HTTP server for deployment
├── 📄 start.py             # Startup script (auto-detects mode)
├── 📄 requirements.txt     # Python dependencies
├── 📄 test_server.py       # Test script
├── 📄 deploy.py            # Deployment script
├── 📄 Dockerfile           # Container configuration
├── 📄 .gitignore           # Git ignore rules
├── 📁 tools/
│   ├── 📄 __init__.py      # Package initialization
│   └── 📄 optimize.py      # Core optimization logic
├── 📁 tests/
│   ├── 📄 __init__.py      # Test package initialization
│   └── 📄 test_optimize.py # Unit tests
└── 📁 .github/
    └── 📁 workflows/
        └── 📄 ci.yml       # CI/CD pipeline
```
## 🚀 Quick Start

### 1. Clone the Repository

```bash
git clone https://github.com/Mahad-007/Prompt-Optimizer-MCP-for-LLMs.git
cd Prompt-Optimizer-MCP-for-LLMs
```

### 2. Install Dependencies

```bash
pip install -r requirements.txt
```

### 3. Run Tests

```bash
python test_server.py
```

### 4. Start the Server

```bash
# For local development (STDIO mode)
python server.py

# For deployment (HTTP mode)
python start.py
```
## 🛠️ Installation

### Prerequisites

- Python 3.11 or higher
- pip package manager

### Install Dependencies

```bash
# Install from requirements.txt
pip install -r requirements.txt
```
## ⚙️ Configuration

### For Cursor IDE

Create `.cursor/mcp.json`:

```json
{
  "mcpServers": {
    "prompt-optimizer": {
      "command": "python",
      "args": ["server.py"],
      "env": {}
    }
  }
}
```

### For Other MCP Clients

Configure your MCP client to use:

- **Command**: `python server.py`
- **Transport**: STDIO (default)
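As a concrete example, Claude Desktop reads the same `mcpServers` shape from its `claude_desktop_config.json` (you will likely need an absolute path to `server.py`; the path below is a placeholder):

```json
{
  "mcpServers": {
    "prompt-optimizer": {
      "command": "python",
      "args": ["/absolute/path/to/server.py"]
    }
  }
}
```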
## 📖 Usage Examples

### Using the MCP Server

Once configured, you can use the tools through any MCP client:

#### Optimize a Prompt

```python
# Generate creative variants
variants = optimize_prompt(
    raw_prompt="Write a story about a cat",
    style="creative"
)
# Returns: [
#   "Craft a compelling story about a cat",
#   "Imagine you're an expert in this field. Write a story about a cat",
#   "Write a story about a cat. in a way that captivates and inspires"
# ]

# Generate precise variants
variants = optimize_prompt(
    raw_prompt="Please write a very detailed explanation about machine learning",
    style="precise"
)
# Returns: [
#   "Write a detailed explanation about machine learning",
#   "• Write a detailed explanation about machine learning",
#   "Write a detailed explanation about machine learning Be specific and concise."
# ]
```

#### Score a Prompt

```python
score = score_prompt(
    raw_prompt="Please write a very detailed explanation about machine learning",
    improved_prompt="Write an explanation about machine learning"
)
# Returns: 0.85 (high score due to length reduction and clarity improvement)
```
### HTTP API Usage

When deployed, the server also provides HTTP endpoints:

```bash
# Health check
curl http://localhost:8000/health

# Optimize prompt
curl -X POST http://localhost:8000/optimize \
  -H "Content-Type: application/json" \
  -d '{"raw_prompt": "Write about AI", "style": "creative"}'

# Score prompt
curl -X POST http://localhost:8000/score \
  -H "Content-Type: application/json" \
  -d '{"raw_prompt": "Write about AI", "improved_prompt": "Write about artificial intelligence"}'
```
### Direct Python Usage

```python
from tools.optimize import optimize_prompt, score_prompt

# Optimize a prompt
variants = optimize_prompt("Write about AI", "creative")
print(f"Optimized variants: {variants}")

# Score a prompt
score = score_prompt("Write about AI", "Write about artificial intelligence")
print(f"Score: {score}")
```
## 🧪 Testing

Run the comprehensive test suite:

```bash
# Run all tests
python test_server.py

# Run unit tests
python -m unittest tests.test_optimize -v

# Run specific test classes
python -m unittest tests.test_optimize.TestOptimizePrompt
python -m unittest tests.test_optimize.TestScorePrompt
python -m unittest tests.test_optimize.TestIntegration
```
## 🚀 Deployment

### Automated Deployment

Use the deployment script:

```bash
python deploy.py
```

This will:

- Run all tests
- Install dependencies
- Run linting checks
- Build the Docker image (if available)
- Create a deployment package

### Manual Deployment

#### Deploy to Smithery

1. Install the Smithery CLI:

   ```bash
   npm install -g @smithery/cli
   ```

2. Authenticate:

   ```bash
   smithery auth login
   ```

3. Deploy:

   ```bash
   # Windows
   .\deploy.bat

   # Linux/macOS
   chmod +x deploy.sh
   ./deploy.sh
   ```

#### Deploy with Docker

```bash
# Build the image
docker build -t prompt-optimizer-mcp:latest .

# Run the container
docker run -p 8000:8000 prompt-optimizer-mcp:latest
```

#### Deploy to Other Platforms

The server supports both STDIO (for MCP clients) and HTTP (for web deployment) transports:

- **STDIO mode**: `python server.py` (for MCP clients)
- **HTTP mode**: `python start.py` (for web deployment)
Once deployed to Smithery, your MCP server will be available at: https://prompt-optimizer-mcp.smithery.ai

For detailed deployment instructions, see DEPLOYMENT.md.
## 🔧 Development

### Adding New Optimization Styles

1. Add the new style to the `Literal` type in `server.py`
2. Implement the style function in `tools/optimize.py`
3. Add corresponding tests in `tests/test_optimize.py`

These three steps are sketched below.
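A hedged sketch, using a hypothetical `formal` style (all names here are illustrative; match them to the actual code):

```python
# 1. server.py -- extend the allowed styles (assuming a Literal type is used):
#    style: Literal["creative", "precise", "fast", "formal"]

# 2. tools/optimize.py -- implement the new style function:
def formal_variant(prompt: str) -> str:
    """Recast the prompt in a formal, professional register."""
    return f"In formal, professional language: {prompt}"

# 3. tests/test_optimize.py -- cover the new style:
import unittest

class TestFormalStyle(unittest.TestCase):
    def test_formal_variant_prefixes_prompt(self):
        self.assertTrue(
            formal_variant("Write about AI").startswith("In formal")
        )
```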
### Extending the Scoring Algorithm

Modify the `score_prompt` function in `tools/optimize.py` to include additional metrics or adjust the weights; one possible extension is sketched below.
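For example, a hypothetical specificity metric could be blended in and the weights rebalanced (illustrative only; adapt it to `score_prompt`'s actual internals):

```python
# Hypothetical extra metric: reward prompts that contain a concrete
# instruction verb. The verb list and weights below are illustrative.
INSTRUCTION_VERBS = {"write", "list", "explain", "summarize", "compare"}

def specificity(prompt: str) -> float:
    """Return 1.0 if the prompt contains an instruction verb, else 0.0."""
    return 1.0 if set(prompt.lower().split()) & INSTRUCTION_VERBS else 0.0

# Inside score_prompt, the weights could then be rebalanced, e.g.:
#   score = (0.35 * length_score + 0.25 * keyword_score
#            + 0.25 * clarity_score + 0.15 * specificity(improved_prompt))
```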
### Running Locally

```bash
# Start the MCP server (STDIO mode)
python server.py

# Start the HTTP server (deployment mode)
python http_server.py

# Auto-detect mode based on environment
python start.py
```
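A common way such auto-detection works is to check for a platform-provided `PORT` environment variable; the sketch below is a guess at the idea, and `start.py`'s actual check may differ:

```python
# Sketch of transport auto-detection; start.py's real logic may differ.
import os
import subprocess
import sys

if os.environ.get("PORT"):
    # A PORT variable usually means a hosting platform: serve HTTP.
    subprocess.run([sys.executable, "http_server.py"], check=True)
else:
    # No PORT: assume an MCP client launched us over STDIO.
    subprocess.run([sys.executable, "server.py"], check=True)
```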
## 📊 Performance

- **Response time**: < 100 ms for most operations
- **Memory usage**: ~50 MB typical
- **CPU usage**: Minimal (stateless operations)
- **Scalability**: Auto-scales from 1 to 5 replicas on Smithery
## 🤝 Contributing

We welcome contributions! Please follow these steps:

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request

### Development Setup

```bash
# Clone your fork
git clone https://github.com/yourusername/Prompt-Optimizer-MCP-for-LLMs.git
cd Prompt-Optimizer-MCP-for-LLMs

# Install dependencies
pip install -r requirements.txt

# Run tests
python test_server.py

# Make your changes and test
python demo.py
```
## 📝 License

This project is licensed under the MIT License - see the LICENSE file for details.

## 🙏 Acknowledgments

- Model Context Protocol for the MCP specification
- MCP Python SDK for the server framework
- Smithery for the deployment platform

## 📞 Support

- **Issues**: GitHub Issues
- **Discussions**: GitHub Discussions
- **Documentation**: DEPLOYMENT.md
Made with ❤️ for the AI community