HistGradientBoostingClassifier MCP Server
A Model Context Protocol (MCP) server that provides tools for training, predicting, and managing sklearn's HistGradientBoostingClassifier models.
Features
This MCP server exposes the following tools:
- create_classifier: Create a new HistGradientBoostingClassifier with custom parameters
- train_model: Train a classifier on provided data
- predict: Make class predictions on new data
- predict_proba: Get class probabilities for predictions
- score_model: Evaluate model accuracy on test data
- get_model_info: Get detailed information about a model
- list_models: List all available models
- delete_model: Remove a model from memory
- save_model: Serialize a model to base64 string
- load_model: Load a model from serialized string
Installation
pip install -r requirements.txt
Local Development
Run the server locally:
uv run --with mcp server.py
The server will start on http://localhost:8000 by default.
Railway Deployment
Prerequisites
- A Railway account (sign up at https://railway.app)
- Railway CLI installed (optional, can use web interface)
- Git repository with your code (GitHub, GitLab, or Bitbucket)
Deploy via Railway Web Interface
- Go to https://railway.app and create a new project
- Click "New Project" → "Deploy from GitHub repo"
- Select your repository containing this MCP server
- Railway will automatically detect the Python project and use the Procfile
- The server will be deployed and you'll get a public URL (e.g., https://your-app.railway.app)
Deploy via Railway CLI
# Install Railway CLI
npm i -g @railway/cli
# Login to Railway
railway login
# Initialize project (in your project directory)
railway init
# Link to existing project or create new one
railway link
# Deploy
railway up
Environment Variables
No environment variables are required for basic operation. Railway automatically provides:
- PORT: The port your application should listen on
- The server automatically binds to 0.0.0.0 to accept external connections
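As a minimal sketch, this is how a server can honor Railway's PORT variable while falling back to a local default (function name and default port are illustrative; the actual server.py may differ):

```python
import os

def get_bind_config(default_port: int = 8000) -> tuple:
    """Return (host, port): bind all interfaces, prefer Railway's PORT if set."""
    port = int(os.environ.get("PORT", default_port))
    return "0.0.0.0", port

host, port = get_bind_config()
print(f"Listening on {host}:{port}")
```

Binding to 0.0.0.0 rather than 127.0.0.1 is what makes the server reachable from outside Railway's container.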
Verifying Deployment
Once deployed, your MCP server will be available at your Railway URL. You can test it by:
- Visiting https://your-app.railway.app in a browser (should show MCP server info or a 404, which is normal)
- Using the MCP Inspector: npx -y @modelcontextprotocol/inspector and connecting to your Railway URL
- Connecting from an MCP client using the streamable-http transport
Current Deployment URL: https://web-production-a620a.up.railway.app
Usage
Once deployed, the MCP server will be accessible at your Railway URL. You can connect to it using any MCP-compatible client.
Example: Using with Claude Desktop
Add to your Claude Desktop MCP configuration (~/Library/Application Support/Claude/claude_desktop_config.json on Mac):
{
"mcpServers": {
"histgradientboosting": {
"url": "https://web-production-a620a.up.railway.app",
"transport": "streamable-http"
}
}
}
Example API Calls
The server exposes tools that can be called via MCP protocol. Here's what each tool does:
Create a classifier:
create_classifier(
model_id="my_model",
learning_rate=0.1,
max_iter=100,
max_leaf_nodes=31
)
Train the model:
train_model(
model_id="my_model",
X=[[1, 2], [3, 4], [5, 6]],
y=[0, 1, 0]
)
Make predictions:
predict(
model_id="my_model",
X=[[2, 3], [4, 5]]
)
Get probabilities:
predict_proba(
model_id="my_model",
X=[[2, 3], [4, 5]]
)
Model Storage
Currently, models are stored in-memory. This means:
- Models persist only during the server's lifetime
- Restarting the server will lose all models
- For production use, consider implementing persistent storage (database, file system, or cloud storage)
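The save_model/load_model round trip can be implemented with pickle plus base64; a sketch of the idea (the server's exact implementation may differ):

```python
import base64
import pickle

def save_model(model) -> str:
    """Serialize any picklable model to a base64 string."""
    return base64.b64encode(pickle.dumps(model)).decode("ascii")

def load_model(data: str):
    """Restore a model from its base64 representation."""
    return pickle.loads(base64.b64decode(data.encode("ascii")))

# Round trip with a stand-in object (a fitted sklearn model works the same way)
blob = save_model({"params": {"max_iter": 100}})
restored = load_model(blob)
print(restored["params"]["max_iter"])  # 100
```

Keep in mind that unpickling can execute arbitrary code, so only call load_model on strings from trusted sources.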
API Reference
HistGradientBoostingClassifier Parameters
All standard sklearn HistGradientBoostingClassifier parameters are supported:
- loss: Loss function (default: 'log_loss')
- learning_rate: Learning rate/shrinkage (default: 0.1)
- max_iter: Maximum boosting iterations (default: 100)
- max_leaf_nodes: Maximum leaves per tree (default: 31)
- max_depth: Maximum tree depth (default: None)
- min_samples_leaf: Minimum samples per leaf (default: 20)
- l2_regularization: L2 regularization (default: 0.0)
- max_features: Feature subsampling proportion (default: 1.0)
- max_bins: Maximum histogram bins (default: 255)
- early_stopping: Enable early stopping (default: 'auto')
- validation_fraction: Validation set fraction (default: 0.1)
- n_iter_no_change: Early stopping patience (default: 10)
- random_state: Random seed (default: None)
- verbose: Verbosity level (default: 0)
See the sklearn documentation for detailed parameter descriptions.
License
MIT