MCP Optimizer
🚀 Mathematical Optimization MCP Server with PuLP and OR-Tools support
🚀 Quick Start
Integration with LLM Clients
Claude Desktop Integration
Option 1: Using uvx (Recommended)
- Install Claude Desktop from claude.ai
- Open Claude Desktop → Settings → Developer → Edit Config
- Add to your claude_desktop_config.json:
{
  "mcpServers": {
    "mcp-optimizer": {
      "command": "uvx",
      "args": ["mcp-optimizer"]
    }
  }
}
- Restart Claude Desktop and look for the 🔨 tools icon
Option 2: Using pip
pip install mcp-optimizer
Then add to your Claude Desktop config:
{
  "mcpServers": {
    "mcp-optimizer": {
      "command": "mcp-optimizer"
    }
  }
}
Option 3: Using Docker
Method A: Docker with STDIO transport (Recommended for MCP clients)
docker pull ghcr.io/dmitryanchikov/mcp-optimizer:latest
Then add to your Claude Desktop config:
{
  "mcpServers": {
    "mcp-optimizer": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "ghcr.io/dmitryanchikov/mcp-optimizer:latest",
        "python", "main.py"
      ]
    }
  }
}
Method B: Docker with SSE transport (for HTTP/web clients)
# Run SSE server on port 8000
docker run -d -p 8000:8000 ghcr.io/dmitryanchikov/mcp-optimizer:latest \
python -m mcp_optimizer.main --transport sse --host 0.0.0.0
# Or with custom port
docker run -d -p 9000:9000 ghcr.io/dmitryanchikov/mcp-optimizer:latest \
python -m mcp_optimizer.main --transport sse --host 0.0.0.0 --port 9000
Then connect with an HTTP client at http://localhost:8000 (requires additional MCP HTTP client setup).
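A quick way to verify the SSE server is reachable is to open a streaming request from Python. The snippet below is a minimal sketch: it assumes the default FastMCP SSE endpoint path /sse and that the httpx package is installed; adjust the URL to match your host and port.
# Minimal SSE connectivity check (the /sse endpoint path is an assumption)
import httpx

with httpx.stream("GET", "http://localhost:8000/sse", timeout=10.0) as response:
    print("Status:", response.status_code)  # expect 200 when the server is up
    for line in response.iter_lines():
        print(line)  # the first event typically advertises the session endpoint
        break        # one event is enough for a connectivity check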
Cursor Integration
- Install the MCP extension in Cursor
- Add mcp-optimizer to your workspace settings:
{
  "mcp.servers": {
    "mcp-optimizer": {
      "command": "uvx",
      "args": ["mcp-optimizer"]
    }
  }
}
Other LLM Clients
For other MCP-compatible clients (Continue, Cody, etc.), use similar configuration patterns with the appropriate command for your installation method.
Advanced Installation Options
Local Development
# Clone the repository
git clone https://github.com/dmitryanchikov/mcp-optimizer.git
cd mcp-optimizer
# Install dependencies with uv
uv sync --extra dev
# Run the server
uv run python main.py
Local Package Build and Run
For testing and development, you can build the package locally and run it with uvx:
# Build the package locally
uv build
# Run with uvx from local wheel file
uvx --from ./dist/mcp_optimizer-0.3.9-py3-none-any.whl mcp-optimizer
# Or run with help to see available options
uvx --from ./dist/mcp_optimizer-0.3.9-py3-none-any.whl mcp-optimizer --help
# Test the local package with a simple MCP message
echo '{"jsonrpc": "2.0", "method": "initialize", "params": {"protocolVersion": "2024-11-05", "capabilities": {}, "clientInfo": {"name": "test", "version": "1.0"}}, "id": 1}' | uvx --from ./dist/mcp_optimizer-0.3.9-py3-none-any.whl mcp-optimizer
Note: The local build creates both wheel (.whl) and source distribution (.tar.gz) files in the dist/ directory. The wheel file is recommended for uvx installation as it's faster and doesn't require compilation.
Troubleshooting: If you encounter event loop issues when using uvx, the package includes automatic detection and handling of existing event loops using nest-asyncio.
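For reference, the snippet below sketches the general nest-asyncio technique the troubleshooting note refers to; it illustrates the pattern only and is not the package's actual internal code.
# Illustrative nest-asyncio pattern (not mcp-optimizer's exact internals)
import asyncio
import nest_asyncio

try:
    asyncio.get_running_loop()  # raises RuntimeError if no loop is currently running
    nest_asyncio.apply()        # patch the running loop so nested asyncio.run() calls work
except RuntimeError:
    pass                        # no active event loop, nothing to patch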
Docker with Custom Configuration
# Build locally with optimization
git clone https://github.com/dmitryanchikov/mcp-optimizer.git
cd mcp-optimizer
docker build -t mcp-optimizer:optimized .
docker run -p 8000:8000 mcp-optimizer:optimized
# Check optimized image size (398MB vs 1.03GB original - 61% reduction!)
docker images mcp-optimizer:optimized
# Test the optimized image
./scripts/test_docker_optimization.sh
Standalone Server Commands
# Run directly with uvx (no installation needed)
uvx mcp-optimizer
# Or run specific commands
uvx mcp-optimizer --help
# With pip installation
mcp-optimizer
# Or run with Python module (use main.py for stdio mode)
python main.py
Transport Modes
MCP Optimizer supports two transport protocols:
- STDIO: Standard input/output for direct MCP client integration (Claude Desktop, Cursor, etc.)
- SSE: Server-Sent Events over HTTP for web-based clients and custom integrations
STDIO Transport (Default - for MCP clients like Claude Desktop)
# Default STDIO mode for MCP protocol
uvx mcp-optimizer
# or
uvx mcp-optimizer --transport stdio
# or
uv run python -m mcp_optimizer.main --transport stdio
# or
python main.py
SSE Transport (for HTTP/web clients)
# SSE mode for HTTP clients (default port 8000)
uvx mcp-optimizer --transport sse
# or
uv run python -m mcp_optimizer.main --transport sse
# Custom host and port
uvx mcp-optimizer --transport sse --host 0.0.0.0 --port 9000
# or
uv run python -m mcp_optimizer.main --transport sse --host 0.0.0.0 --port 9000
# With debug mode
uvx mcp-optimizer --transport sse --debug --log-level DEBUG
Available CLI Options
# Show all available options
uvx mcp-optimizer --help
# Options:
# --transport {stdio,sse} MCP transport protocol (default: stdio)
# --port PORT Port for SSE transport (default: 8000)
# --host HOST Host for SSE transport (default: 127.0.0.1)
# --debug Enable debug mode
# --reload Enable auto-reload for development
# --log-level {DEBUG,INFO,WARNING,ERROR} Logging level (default: INFO)
🎯 Features
Supported Optimization Problem Types:
- Linear Programming - Maximize/minimize linear objective functions
- Assignment Problems - Optimal resource allocation using Hungarian algorithm
- Transportation Problems - Logistics and supply chain optimization
- Knapsack Problems - Optimal item selection (0-1, bounded, unbounded)
- Routing Problems - TSP and VRP with time windows
- Scheduling Problems - Job and shift scheduling
- Integer Programming - Discrete optimization problems
- Financial Optimization - Portfolio optimization and risk management
- Production Planning - Multi-period production planning
Testing
# Run comprehensive integration tests
uv run python tests/test_integration/comprehensive_test.py
# Run all unit tests
uv run pytest tests/ -v
# Run with coverage
uv run pytest tests/ --cov=src/mcp_optimizer --cov-report=html
📊 Usage Examples
Linear Programming
from mcp_optimizer.tools.linear_programming import solve_linear_program
# Maximize 3x + 2y subject to:
# x + y <= 4
# 2x + y <= 6
# x, y >= 0
objective = {"sense": "maximize", "coefficients": {"x": 3, "y": 2}}
variables = {
    "x": {"type": "continuous", "lower": 0},
    "y": {"type": "continuous", "lower": 0}
}
constraints = [
    {"expression": {"x": 1, "y": 1}, "operator": "<=", "rhs": 4},
    {"expression": {"x": 2, "y": 1}, "operator": "<=", "rhs": 6}
]
result = solve_linear_program(objective, variables, constraints)
# Result: x=2.0, y=2.0, objective=10.0
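Since integer programming is also listed among the supported problem types, a discrete variant of the same model might look like the sketch below. It assumes the variable schema accepts "integer" as a type by analogy with "continuous" above; that field value is an assumption, not a documented API.
# Hypothetical integer variant of the model above ("integer" type is assumed)
int_variables = {
    "x": {"type": "integer", "lower": 0},
    "y": {"type": "integer", "lower": 0}
}
result = solve_linear_program(objective, int_variables, constraints)
# For this model the integer optimum coincides with the LP optimum: x=2, y=2, objective=10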
Assignment Problem
from mcp_optimizer.tools.assignment import solve_assignment_problem
workers = ["Alice", "Bob", "Charlie"]
tasks = ["Task1", "Task2", "Task3"]
costs = [
    [4, 1, 3],  # Alice's costs for each task
    [2, 0, 5],  # Bob's costs for each task
    [3, 2, 2]   # Charlie's costs for each task
]
result = solve_assignment_problem(workers, tasks, costs)
# Result: Total cost = 5.0 with optimal assignments
Knapsack Problem
from mcp_optimizer.tools.knapsack import solve_knapsack_problem
items = [
    {"name": "Item1", "weight": 10, "value": 60},
    {"name": "Item2", "weight": 20, "value": 100},
    {"name": "Item3", "weight": 30, "value": 120}
]
result = solve_knapsack_problem(items, capacity=50)
# Result: Total value = 220.0 with optimal item selection
Portfolio Optimization
from mcp_optimizer.tools.financial import optimize_portfolio
assets = [
    {"name": "Stock A", "expected_return": 0.12, "risk": 0.18},
    {"name": "Stock B", "expected_return": 0.10, "risk": 0.15},
    {"name": "Bond C", "expected_return": 0.06, "risk": 0.08}
]
result = optimize_portfolio(
    assets=assets,
    objective="minimize_risk",
    budget=10000,
    risk_tolerance=0.15
)
# Result: Optimal portfolio allocation with minimized risk
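The tool modules also export optimize_production() (see Recent Updates below). Its signature is not shown in this README, so the call below is only a hypothetical sketch with assumed parameter names and data shapes.
# Hypothetical production planning call; parameter names and structures are assumptions
from mcp_optimizer.tools.production import optimize_production

products = [
    {"name": "Widget", "profit": 25, "max_demand": 100},
    {"name": "Gadget", "profit": 40, "max_demand": 60}
]
resources = [
    {"name": "machine_hours", "available": 500},
    {"name": "labor_hours", "available": 300}
]
result = optimize_production(products=products, resources=resources)
# Result: a production plan maximizing profit within resource limits (illustrative only)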
🏗️ Architecture
mcp-optimizer/
├── src/mcp_optimizer/
│ ├── tools/ # 9 categories of optimization tools
│ │ ├── linear_programming.py
│ │ ├── assignment.py
│ │ ├── knapsack.py
│ │ ├── routing.py
│ │ ├── scheduling.py
│ │ ├── financial.py
│ │ └── production.py
│ ├── solvers/ # PuLP and OR-Tools integration
│ │ ├── pulp_solver.py
│ │ └── ortools_solver.py
│ ├── schemas/ # Pydantic validation schemas
│ ├── utils/ # Utility functions
│ ├── config.py # Configuration
│ └── mcp_server.py # Main MCP server
├── tests/ # Comprehensive test suite
├── docs/ # Documentation
├── k8s/ # Kubernetes deployment
├── monitoring/ # Grafana/Prometheus setup
└── main.py # Entry point
🧪 Test Results
✅ Comprehensive Test Suite
🧪 Starting Comprehensive MCP Optimizer Tests
==================================================
✅ Server Health PASSED
✅ Linear Programming PASSED
✅ Assignment Problems PASSED
✅ Knapsack Problems PASSED
✅ Routing Problems PASSED
✅ Scheduling Problems PASSED
✅ Financial Optimization PASSED
✅ Production Planning PASSED
✅ Performance Test PASSED
📊 Test Results: 9 passed, 0 failed
🎉 All tests passed! MCP Optimizer is ready for production!
✅ Unit Tests
- 66 tests passed, 9 skipped
- Execution time: 0.45 seconds
- All core components functional
📈 Performance Metrics
- Linear Programming: ~0.01s
- Assignment Problems: ~0.01s
- Knapsack Problems: ~0.01s
- Complex test suite: 0.02s for 3 optimization problems
- Overall performance: 🚀 Excellent!
🔧 Technical Details
Core Solvers
- OR-Tools: For assignment, transportation, knapsack problems
- PuLP: For linear/integer programming
- FastMCP: For MCP server integration
Supported Solvers
- CBC, GLPK, GUROBI, CPLEX (via PuLP)
- SCIP, CP-SAT (via OR-Tools)
Key Features
- ✅ Full MCP protocol integration
- ✅ Comprehensive input validation
- ✅ Robust error handling
- ✅ High-performance optimization
- ✅ Production-ready architecture
- ✅ Extensive test coverage
- ✅ Docker and Kubernetes support
📋 Requirements
- Python 3.11+
- uv (for dependency management)
- OR-Tools (automatically installed)
- PuLP (automatically installed)
🚀 Production Deployment
Docker
# Build image
docker build -t mcp-optimizer .
# Run container
docker run -p 8000:8000 mcp-optimizer
Kubernetes
# Deploy to Kubernetes
kubectl apply -f k8s/
Monitoring
# Start monitoring stack
docker-compose up -d
🎯 Project Status
✅ PRODUCTION READY 🚀
- All core optimization tools implemented and tested
- MCP server fully functional
- Comprehensive test coverage (66 unit tests + 9 integration tests)
- OR-Tools integration confirmed working
- Performance optimized (< 30s for complex test suites)
- Ready for production deployment
📖 Usage Examples
The examples/ directory contains practical examples and prompts for using MCP Optimizer with Large Language Models (LLMs):
Available Examples
- 📊 Linear Programming (RU | EN) - Production optimization, diet planning, transportation, blending problems
- 👥 Assignment Problems (RU | EN) - Employee-project assignment, machine-order allocation, task distribution
- 💰 Portfolio Optimization (RU | EN) - Investment portfolios, retirement planning, risk management
How to Use Examples
- For LLM Integration: Copy the prompt text and provide it to your LLM with MCP Optimizer access
- For Direct API Usage: Use the provided API structures directly with MCP Optimizer functions
- For Learning: Understand different optimization problem types and formulations
Each example includes:
- Problem descriptions and real-world scenarios
- Ready-to-use prompts for LLMs
- Technical API structures
- Common activation phrases
- Practical applications
🔄 Recent Updates
Latest Release Features:
- Function Exports - Added exportable functions to all tool modules:
  - solve_linear_program() in linear_programming.py
  - solve_assignment_problem() in assignment.py
  - solve_knapsack_problem() in knapsack.py
  - optimize_portfolio() in financial.py
  - optimize_production() in production.py
- Enhanced Testing - Updated comprehensive test suite with correct function signatures
- OR-Tools Integration - Confirmed full functionality of all OR-Tools components
🚀 Fully Automated Release Process
New Simplified Git Flow (3 steps!)
The project uses a fully automated release process:
1. Create Release Branch
# For minor release (auto-increment)
uv run python scripts/release.py --type minor
# For specific version
uv run python scripts/release.py 0.2.0
# For hotfix
uv run python scripts/release.py --hotfix --type patch
# Preview changes
uv run python scripts/release.py --type minor --dry-run
2. Create PR to main
# Create PR: release/v0.3.0 → main
gh pr create --base main --head release/v0.3.0 --title "Release v0.3.0"
3. Merge PR - DONE! 🎉
After PR merge, automatically happens:
- ✅ Create tag v0.3.0
- ✅ Publish to PyPI
- ✅ Publish Docker images
- ✅ Create GitHub Release
- ✅ Merge main back to develop
- ✅ Cleanup release branch
NO NEED to run finalize_release.py manually anymore!
🔒 Secure Detection: Uses a hybrid approach combining GitHub branch protection with automated release detection. See Release Process for details.
Automated Release Pipeline
The CI/CD pipeline automatically handles:
- ✅ Release Candidates: Built from release/* branches
- ✅ Production Releases: Triggered by version tags on main
- ✅ PyPI Publishing: Automatic on tag creation
- ✅ Docker Images: Multi-architecture builds
- ✅ GitHub Releases: With artifacts and release notes
CI/CD Pipeline
The GitHub Actions workflow automatically:
- ✅ Runs tests on Python 3.11 and 3.12
- ✅ Performs security scanning
- ✅ Builds and pushes Docker images
- ✅ Publishes to PyPI on tag creation
- ✅ Creates GitHub releases
Requirements for PyPI Publication
- Set PYPI_API_TOKEN secret in GitHub repository
- Ensure all tests pass
- Follow semantic versioning
🛠️ Development Tools
Debug Tools
Use the debug script to inspect MCP server structure:
# Run debug tools to check server structure
uv run python scripts/debug_tools.py
# This will show:
# - Available MCP tools
# - Tool types and attributes
# - Server configuration
Comprehensive Testing
Run the full integration test suite:
# Run comprehensive tests
uv run python tests/test_integration/comprehensive_test.py
# This tests:
# - All optimization tools (9 categories)
# - Server health and functionality
# - Performance benchmarks
# - End-to-end workflows
Docker Build Instructions
Image Details
- Base: Python 3.12 Slim (Debian-based)
- Size: ~649MB (optimized with multi-stage builds)
- Architecture: Multi-platform support (x86_64, ARM64)
- Security: Non-root user, minimal dependencies
- Performance: Optimized Python bytecode, cleaned build artifacts
Local Build Commands
# Standard build
docker build -t mcp-optimizer:latest .
# Build with development dependencies
docker build --build-arg ENV=development -t mcp-optimizer:dev .
# Build with cache mount for faster rebuilds
docker build --mount=type=cache,target=/build/.uv -t mcp-optimizer .
# Check image size
docker images mcp-optimizer
# Run container
docker run -p 8000:8000 mcp-optimizer:latest
# For development with volume mounting
docker run -p 8000:8000 -v $(pwd):/app mcp-optimizer:latest
# Test container functionality
docker run --rm mcp-optimizer:latest python -c "from mcp_optimizer.mcp_server import create_mcp_server; print('✅ MCP Optimizer works!')"
🤝 Contributing
We welcome contributions! Please see CONTRIBUTING.md for guidelines.
Git Flow Policy
This project follows a standard Git Flow workflow:
- Feature branches → develop branch
- Release branches → main branch
- Hotfix branches → main and develop branches
📚 Documentation:
- Contributing Guide - Complete development workflow and Git Flow policy
- Release Process - How releases are created and automated
- Repository Setup - Complete setup guide including branch protection and security configuration
Development Setup
# Clone and setup
git clone https://github.com/dmitryanchikov/mcp-optimizer.git
cd mcp-optimizer
# Create feature branch from develop
git checkout develop
git checkout -b feature/your-feature-name
# Install dependencies
uv sync --extra dev
# Run tests
uv run pytest tests/ -v
# Run linting
uv run ruff check src/
uv run mypy src/
# Create PR to develop branch (not main!)
📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
🙏 Acknowledgments
- OR-Tools - Google's optimization tools
- PuLP - Linear programming in Python
- FastMCP - Fast MCP server implementation
📞 Support
- 📧 Email: support@mcp-optimizer.com
- 🐛 Issues: GitHub Issues
- 📖 Documentation: docs/
Made with ❤️ for the optimization community