Oumi MCP Server

An MCP (Model Context Protocol) server that gives AI coding assistants access to Oumi's library of ~500 ready-to-use YAML configs for training, fine-tuning, and evaluating LLMs.

When connected to Cursor, Claude Desktop, or any MCP-compatible client, the server lets the AI search for training recipes, retrieve full YAML configs, validate them, and follow guided ML engineering workflows -- all without you having to browse docs manually.

What it does

The server exposes 5 tools and 6 resources over MCP:

Tools:

  • get_started() -- Overview of capabilities and quickstart guide
  • list_categories() -- Discover available model families and config types
  • search_configs(query, task, model, keyword) -- Find training configs by filters
  • get_config(path, include_content) -- Get config details and full YAML content
  • validate_config(config, task_type) -- Validate a config file before running

Resources:

  • guidance://mle-workflow -- End-to-end ML engineering workflow guide
  • guidance://mle-train -- Training command usage and sizing heuristics
  • guidance://mle-synth -- Synthetic data generation guidance
  • guidance://mle-analyze -- Dataset analysis and quality checks
  • guidance://mle-eval -- Evaluation strategies and benchmarks
  • guidance://mle-infer -- Inference best practices

Supported models

Llama 3.1/3.2/4, Qwen 3, Phi 4, Gemma 3, DeepSeek R1, SmolLM, and more.

Supported training techniques

SFT, DPO, GRPO, KTO, LoRA, QLoRA, full fine-tuning, pretraining, evaluation, inference.

Installation

As part of Oumi (recommended)

pip install oumi[mcp]

Standalone

pip install oumi-mcp

From source (development)

git clone https://github.com/oumi-ai/oumi.git
cd oumi/projects/oumi-mcp
pip install -e .

Running the server

oumi-mcp

Or run as a Python module:

python -m oumi_mcp_server

Connecting to an MCP client

Cursor

Add to your Cursor MCP settings (.cursor/mcp.json):

{
  "mcpServers": {
    "oumi": {
      "command": "oumi-mcp"
    }
  }
}

Claude Desktop

Add to your Claude Desktop config (~/Library/Application Support/Claude/claude_desktop_config.json on macOS):

{
  "mcpServers": {
    "oumi": {
      "command": "oumi-mcp"
    }
  }
}

Any MCP client (stdio transport)

The server uses stdio transport by default. Point your MCP client to the oumi-mcp command.
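
If you are wiring a client up yourself, a minimal sketch using the official MCP Python SDK (the mcp package) might look like the following. Only the oumi-mcp command is specific to this server; the rest is generic MCP client code, so adapt it if your SDK version's imports differ:

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch oumi-mcp as a subprocess and talk to it over stdio.
    params = StdioServerParameters(command="oumi-mcp")
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # List the five tools described above.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())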

How configs work

The server ships with a bundled snapshot of Oumi's ~500 YAML config files. On startup, it looks for a cached copy and re-syncs from GitHub if the cache is stale (older than 24 hours). The resolution order is:

  1. OUMI_MCP_CONFIGS_DIR environment variable (explicit override)
  2. ~/.cache/oumi-mcp/configs (synced from GitHub, refreshed every 24h)
  3. Bundled configs shipped with the package (always-available fallback)

This means:

  • The server works immediately after install, even offline
  • Configs stay up-to-date automatically via lazy background sync
  • You can pin a specific config directory with the env var if needed
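
In code, the resolution order reduces to roughly the following. This is an illustrative sketch rather than the actual implementation (which lives in server.py); the function name and constant here are hypothetical:

import os
import time
from pathlib import Path

CACHE_TTL_SECONDS = 24 * 60 * 60  # the 24-hour refresh window described above

def resolve_configs_dir(bundled_configs: Path) -> Path:
    # 1. An explicit override always wins.
    override = os.environ.get("OUMI_MCP_CONFIGS_DIR")
    if override:
        return Path(override)
    # 2. The cached GitHub snapshot, if one exists.
    cache = Path.home() / ".cache" / "oumi-mcp" / "configs"
    if cache.is_dir():
        age = time.time() - cache.stat().st_mtime
        if age > CACHE_TTL_SECONDS:
            ...  # the real server kicks off a GitHub re-sync here
        return cache
    # 3. The bundled configs shipped with the package, as a fallback.
    return bundled_configs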

Force a sync

To manually refresh configs, delete the cache and restart:

rm -rf ~/.cache/oumi-mcp
oumi-mcp

Example workflow

Once connected, ask your AI assistant something like:

"Find me a LoRA config for fine-tuning Llama 3.1 8B on my custom dataset"

The assistant will use the MCP tools to:

  1. search_configs(model="llama3_1", query="8b_lora", task="sft") -- find matching recipes
  2. get_config("llama3_1/sft/8b_lora", include_content=True) -- retrieve the full YAML
  3. Help you customize model_name, datasets, output_dir, etc.
  4. validate_config("/path/to/your/config.yaml", "training") -- validate before running
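
Under the hood, that conversation is just a sequence of MCP tool calls. Here is a hedged sketch of the same steps, reusing a ClientSession like the one in the stdio example above; argument names mirror the tool signatures listed earlier, and the result shapes are illustrative:

from mcp import ClientSession

async def lora_workflow(session: ClientSession) -> None:
    # 1. Find matching recipes.
    hits = await session.call_tool(
        "search_configs",
        arguments={"model": "llama3_1", "query": "8b_lora", "task": "sft"},
    )
    # 2. Retrieve the full YAML for the chosen recipe.
    config = await session.call_tool(
        "get_config",
        arguments={"path": "llama3_1/sft/8b_lora", "include_content": True},
    )
    # 3. Customize model_name, datasets, output_dir, etc. in the YAML,
    #    then save the edited file locally.
    # 4. Validate the edited config before launching a run.
    report = await session.call_tool(
        "validate_config",
        arguments={"config": "/path/to/your/config.yaml", "task_type": "training"},
    )
    print(hits, config, report)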

Configuration

  • OUMI_MCP_CONFIGS_DIR -- Override the configs directory path (default: unset)

Project structure

oumi-mcp/
  src/oumi_mcp_server/
    __init__.py          # Package metadata
    __main__.py          # python -m entry point
    server.py            # MCP server, tools, resources, config sync
    config_service.py    # Config parsing, search, metadata extraction
    constants.py         # Type definitions and constants
    models.py            # TypedDict data models
    prompts/
      mle_prompt.py      # ML engineering workflow guidance resources
    configs/             # Bundled YAML configs (~500 files)
      recipes/           # Model-specific training recipes
      apis/              # API provider configs
      examples/          # Example configs
  pyproject.toml

Development

# Install in development mode
pip install -e ".[dev]"

# Run the server
oumi-mcp

# Run tests
pytest

Versioning

This package follows semantic versioning. Its version is independent of the main oumi package's but tracks compatibility:

  • oumi-mcp 0.x.y is compatible with oumi >= 0.6.0
  • Configs are synced from the oumi main branch and stay current regardless of package version
  • Bump the oumi-mcp version when the server code, tools, or resources change

License

Apache-2.0 -- see the main Oumi repository for details.
