
MCP Server with OpenAI, Git, Filesystem, and Prometheus Integration
This repository contains a Model Control Plane (MCP) server implementation that supports OpenAI services, Git repository analysis, local filesystem operations, and Prometheus integration.
Project Structure
MCP/
├── mcp/ # Core MCP library modules
├── scripts/ # Utility scripts and test tools
├── prometheus/ # Prometheus configuration
├── docker-compose.yml # Docker configuration
├── mcp_server.py # Main server implementation
├── mcp_run # Main runner script (shortcut)
└── README.md # This file
Requirements
- Python 3.8+
- FastAPI
- Uvicorn
- OpenAI SDK
- GitPython
- Requests
- Docker and Docker Compose (for Prometheus features)
Installation
- Clone this repository
- Install the dependencies:
pip install -r requirements.txt
Environment Variables
Set the following environment variables:
For Azure OpenAI:
export AZURE_OPENAI_ENDPOINT="your-azure-endpoint"
export AZURE_OPENAI_API_KEY="your-azure-api-key"
export AZURE_OPENAI_API_VERSION="2023-05-15"
export AZURE_DEPLOYMENT_NAME="your-deployment-name"
For Standard OpenAI:
export OPENAI_API_KEY="your-openai-api-key"
# Optional: Specify which models to use
export OPENAI_CHAT_MODEL="gpt-4o-mini" # Default if not specified
export OPENAI_COMPLETION_MODEL="gpt-3.5-turbo-instruct" # Default if not specified
For Prometheus:
export PROMETHEUS_URL="http://localhost:9090" # Default if not specified
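As a minimal sketch (not necessarily how mcp_server.py actually reads its configuration), these variables can be resolved in Python with the defaults noted above:
import os
# Optional settings fall back to the documented defaults
chat_model = os.environ.get("OPENAI_CHAT_MODEL", "gpt-4o-mini")
completion_model = os.environ.get("OPENAI_COMPLETION_MODEL", "gpt-3.5-turbo-instruct")
prometheus_url = os.environ.get("PROMETHEUS_URL", "http://localhost:9090")
# Azure settings have no defaults; this raises KeyError if they are unset
azure_endpoint = os.environ["AZURE_OPENAI_ENDPOINT"]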
Running the Server
Start the MCP server:
python scripts/start_mcp_server.py
Or for more options:
python scripts/start_mcp_server.py --host 0.0.0.0 --port 8000 --debug
The server will be available at http://localhost:8000.
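Once the server is up, you can sanity-check it from Python by hitting the /v1/models endpoint documented below:
import requests
# A 200 response with a model list confirms the server is reachable
resp = requests.get("http://localhost:8000/v1/models")
resp.raise_for_status()
print(resp.json())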
Unified Testing Tool
We provide a unified testing script that gives you a user-friendly interface to all testing functionality:
./mcp_run
This interactive script provides:
- Filesystem tests
- Git integration tests
- Memory analysis tools
- Prometheus tests & memory stress
- MCP server management
- Environment setup
Individual Tests
You can also run individual tests directly:
Test the OpenAI integration:
python scripts/test_mcp_client.py
Test the Git integration (provide a Git repository URL):
python scripts/test_git_integration.py https://github.com/username/repository
Test the Git diff functionality (analyze requirements compatibility):
python scripts/test_git_diff.py https://github.com/username/repository [commit-sha]
Test the filesystem functionality:
python scripts/test_filesystem.py
Test the Langflow integration with MCP:
python scripts/test_langflow_integration.py [OPTIONAL_REPO_URL]
Test the Prometheus integration:
python scripts/test_prometheus.py [prometheus_url]
Advanced Git Analysis
For more advanced Git repository analysis with AI recommendations:
python scripts/langflow_git_analyzer.py https://github.com/username/repository
You can also search for specific patterns in the repository:
python scripts/langflow_git_analyzer.py https://github.com/username/repository --search "def main"
Or analyze the last commit diff with AI insights:
python scripts/langflow_git_analyzer.py https://github.com/username/repository --diff
Memory Analysis Tools
MCP includes several tools for memory monitoring and analysis:
# Basic memory diagnostics with AI analysis
python scripts/ai_memory_diagnostics.py
# Interactive memory dashboard
python scripts/mcp_memory_dashboard.py
# Memory alerting system
python scripts/mcp_memory_alerting.py
You can also simulate memory pressure for testing:
python scripts/simulate_memory_pressure.py --target 85 --duration 300
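Assuming --target is a memory-utilization percentage, you can watch the effect while the simulation runs with a few lines of Python (psutil is assumed here; it is not in the requirements list above):
import time
import psutil
# Print system memory utilization once per second, e.g. while
# simulate_memory_pressure.py drives it toward the 85% target
for _ in range(10):
    print(f"memory used: {psutil.virtual_memory().percent:.1f}%")
    time.sleep(1)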
Prometheus Integration
Setup
- Start the Prometheus stack using Docker Compose:
docker compose up -d
This will start:
- Prometheus server (accessible at http://localhost:9090)
- Node Exporter (for host metrics)
- cAdvisor (for container metrics)
- For stress testing, you can start the memory stress container:
docker compose up -d --build memory-stress
Or use the container test script:
./scripts/container-memory-test.sh start
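Before pointing MCP at the stack, you can confirm Prometheus is reachable; the /-/healthy endpoint is built into Prometheus itself:
import requests
# Prometheus answers 200 OK on /-/healthy when the server is up
resp = requests.get("http://localhost:9090/-/healthy")
print(resp.status_code, resp.text.strip())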
Docker Configuration and Reset Scripts
This project includes multiple Docker configurations and reset scripts for reliable operation across different environments:
Docker Configurations
- Standard Configuration (docker-compose.yml): Uses custom Dockerfiles for Prometheus and Langflow to ensure consistent permissions across systems.
- Bridge Network Configuration (docker-compose.bridge.yml): An alternative configuration that uses bridge networking for environments where host networking is problematic.
Custom Dockerfiles for Solving Permission Issues
The project uses custom Dockerfiles for both Prometheus and Langflow to solve common permission issues:
- Dockerfile.prometheus: Sets up the Prometheus configuration with the proper permissions for the nobody user.
- Dockerfile.langflow: Copies the components directory into the container without changing file ownership, allowing Langflow to access the components without permission errors.
This approach eliminates the need for volume mounts that can lead to permission conflicts across different machines and user configurations.
Reset Scripts
- All Services Reset (reset-all.sh): Resets all containers with a single command.
# Basic reset (rebuilds containers with existing volumes)
./reset-all.sh
# Full reset (removes volumes and rebuilds containers)
./reset-all.sh --clean
- Individual Service Reset:
# Reset only Prometheus
./reset-prometheus.sh
# Reset only Langflow
./reset-langflow.sh
These scripts ensure that the containers are properly configured with correct permissions and the latest code changes.
Troubleshooting
If you encounter permission issues:
- Use the reset scripts to rebuild the containers
- Check the logs with docker compose logs <service_name>
- Make sure any components added to Langflow are included in the Dockerfile.langflow
Cross-Machine Deployment
When deploying to a new machine:
- Clone the repository
- Make reset scripts executable:
chmod +x *.sh
- Run the reset script:
./reset-all.sh
The custom Dockerfiles automatically handle the permission issues that commonly occur across different systems.
Using Prometheus Client
The MCPAIComponent class includes Prometheus capabilities:
from langflow import MCPAIComponent
# Initialize the client
mcp = MCPAIComponent(mcp_server_url="http://localhost:8000")
# Instant query (current metric values)
result = mcp.prometheus_query("up")
# Range query (metrics over time)
result = mcp.prometheus_query_range(
    query="rate(node_cpu_seconds_total{mode='system'}[1m])",
    start="2023-03-01T00:00:00Z",
    end="2023-03-01T01:00:00Z",
    step="15s"
)
# Get all labels
labels = mcp.prometheus_get_labels()
# Get label values
values = mcp.prometheus_get_label_values("job")
# Get targets
targets = mcp.prometheus_get_targets()
# Get alerts
alerts = mcp.prometheus_get_alerts()
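The shape of these return values is not spelled out here. Assuming the client passes through the raw Prometheus HTTP API JSON ({"status": ..., "data": {"result": [...]}}), an instant-query result can be unpacked like this:
# Each entry pairs a metric's labels with its latest sampled value
result = mcp.prometheus_query("up")
for series in result.get("data", {}).get("result", []):
    print(series["metric"], series["value"])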
Useful PromQL Queries
- CPU Usage: rate(node_cpu_seconds_total{mode!="idle"}[1m])
- Memory Usage: node_memory_MemTotal_bytes - node_memory_MemAvailable_bytes
- Disk Space Available (as a fraction of total): node_filesystem_avail_bytes{mountpoint="/"} / node_filesystem_size_bytes{mountpoint="/"}
- Container CPU Usage: rate(container_cpu_usage_seconds_total[1m])
- Container Memory Usage: container_memory_usage_bytes
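These expressions can be composed and sent through the same client. For example, memory usage as a percentage, derived from the two node_memory_* metrics above (the query string is an illustration, not something the server defines):
# Percentage of total memory currently in use
query = "100 * (1 - node_memory_MemAvailable_bytes / node_memory_MemTotal_bytes)"
result = mcp.prometheus_query(query)
print(result)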
API Endpoints
OpenAI Endpoints
- GET /v1/models - List all available models
- GET /v1/models/{model_id} - Get information about a specific model
- POST /v1/models/azure-gpt-4/completion - Generate a text completion using Azure OpenAI
- POST /v1/models/azure-gpt-4/chat - Generate a chat response using Azure OpenAI
- POST /v1/models/openai-gpt-chat/chat - Generate a chat response using the OpenAI chat model
- POST /v1/models/openai-gpt-completion/completion - Generate a text completion using the OpenAI completion model
Git Integration Endpoints
- POST /v1/models/git-analyzer/analyze - Analyze a Git repository
- POST /v1/models/git-analyzer/search - Search a Git repository for files matching a pattern
- POST /v1/models/git-analyzer/diff - Get the diff of the last commit in a repository
Filesystem Endpoints
- POST /v1/models/filesystem/list - List the contents of a directory
- POST /v1/models/filesystem/read - Read a file's contents
- POST /v1/models/filesystem/read-multiple - Read multiple files at once
- POST /v1/models/filesystem/write - Write content to a file
- POST /v1/models/filesystem/edit - Edit a file with multiple replacements
- POST /v1/models/filesystem/mkdir - Create a directory
- POST /v1/models/filesystem/move - Move a file or directory
- POST /v1/models/filesystem/search - Search for files matching a pattern
- POST /v1/models/filesystem/info - Get information about a file or directory
Prometheus Endpoints
- POST /v1/models/prometheus/query - Execute an instant query
- POST /v1/models/prometheus/query_range - Execute a range query
- POST /v1/models/prometheus/series - Get series data
- GET /v1/models/prometheus/labels - Get all available labels
- POST /v1/models/prometheus/label_values - Get values for a specific label
- GET /v1/models/prometheus/targets - Get all targets
- GET /v1/models/prometheus/rules - Get all rules
- GET /v1/models/prometheus/alerts - Get all alerts
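All of these endpoints are plain REST, so they can also be exercised without the client library. The exact request schema is defined in mcp_server.py; the JSON body below (a single "path" field) is an assumption for illustration only:
import requests
# Hypothetical request body; check mcp_server.py for the real schema
resp = requests.post(
    "http://localhost:8000/v1/models/filesystem/list",
    json={"path": "."},
)
resp.raise_for_status()
print(resp.json())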
Client Usage
You can use the MCPAIComponent in your LangFlow pipelines by providing the MCP server URL:
from langflow import MCPAIComponent
mcp = MCPAIComponent(mcp_server_url="http://localhost:8000")
# List available models
models = mcp.list_models()
print(models)
# Generate chat completion with OpenAI model
chat_response = mcp.chat(
    model_id="openai-gpt-chat",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Tell me a joke about programming."}
    ],
    max_tokens=100,
    temperature=0.7
)
print(chat_response)
# Generate text completion with OpenAI model
completion_response = mcp.completion(
    model_id="openai-gpt-completion",
    prompt="Write a function in Python to calculate the factorial of a number:",
    max_tokens=150,
    temperature=0.7
)
print(completion_response)
# Analyze a Git repository
repo_analysis = mcp.analyze_git_repo("https://github.com/username/repository")
print(repo_analysis)
# Search a Git repository
search_results = mcp.search_git_repo("https://github.com/username/repository", "def main")
print(search_results)
# Get the diff of the last commit
diff_info = mcp.get_git_diff("https://github.com/username/repository")
print(diff_info)
# List files in the current directory
dir_contents = mcp.list_directory()
print(dir_contents)
# Read a file
file_content = mcp.read_file("path/to/file.txt")
print(file_content)
# Write to a file
write_result = mcp.write_file("path/to/new_file.txt", "Hello, world!")
print(write_result)
# Search for files
search_result = mcp.search_files("*.py")
print(search_result)
Using the GitCodeAnalyzer Class
For more structured Git analysis, you can use the GitCodeAnalyzer class:
from langflow_git_analyzer import GitCodeAnalyzer
# Initialize the analyzer
analyzer = GitCodeAnalyzer(mcp_server_url="http://localhost:8000")
# Analyze a repository
analyzer.analyze_repository("https://github.com/username/repository")
# Get a summary
summary = analyzer.get_repository_summary()
print(summary)
# Get AI recommendations
recommendations = analyzer.get_repository_recommendations()
print(recommendations)
# Analyze code patterns
pattern_analysis = analyzer.analyze_code_pattern("def process")
print(pattern_analysis)
# Get the last commit diff
diff_info = analyzer.get_last_commit_diff()
print(diff_info)
# Get a formatted summary of the diff
diff_summary = analyzer.get_formatted_diff_summary()
print(diff_summary)
# Get AI analysis of the commit changes
diff_analysis = analyzer.analyze_commit_diff()
print(diff_analysis)
Troubleshooting
Prometheus Issues
- Verify Prometheus is running:
docker ps | grep prometheus
- Check you can access the Prometheus UI: http://localhost:9090
- Verify the MCP server is running and accessible
- Check the MCP server logs for errors
- Try simple queries first to verify connectivity (e.g., the up query)
OpenAI Issues
- Verify your API keys are set correctly
- Check for rate limiting or quota issues
- Verify you're using supported models for your API key
Git Issues
- Ensure the Git repository URL is accessible
- Check for authentication issues if using private repositories
- Ensure GitPython is installed correctly
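A quick way to confirm the GitPython installation is working:
import git
# GitPython shells out to the git binary; this runs "git version"
# and fails loudly if either the library or git itself is missing
print(git.Git().version())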