Oxide - Intelligent LLM Orchestrator

Oxide is an intelligent orchestration system that allows Claude Code to automatically route tasks to the most appropriate LLM based on task characteristics, enabling distributed AI resource utilization across local and network services.

Features

  • Automatic Task Routing: Intelligently classifies tasks and routes them to the optimal LLM
  • Parallel Execution: Distributes large codebase analysis across multiple LLMs simultaneously
  • MCP Integration: Native integration with Claude Code via Model Context Protocol
  • Multi-Service Support: Works with Gemini CLI, Qwen CLI, Ollama (local & remote), and LM Studio
  • Web Dashboard: Real-time monitoring and configuration UI

Architecture

Claude Code (MCP) → Oxide Orchestrator → [Gemini | Qwen | Ollama | LM Studio]
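
The flow below is a minimal sketch of that architecture, assuming a simple adapter interface; the class and method names are illustrative, not Oxide's actual API. Each backend is wrapped in an adapter, and the orchestrator dispatches a prompt to whichever adapter is selected for it.

# Illustrative sketch only: names are assumptions, not Oxide's real classes.
from typing import Protocol

class Adapter(Protocol):
    """One adapter per backend (Gemini CLI, Qwen CLI, Ollama, LM Studio)."""
    name: str
    def run(self, prompt: str) -> str: ...

class GeminiCLIAdapter:
    name = "gemini"
    def run(self, prompt: str) -> str:
        # A real CLI adapter would shell out to the `gemini` executable
        return f"[gemini] {prompt}"

class OllamaAdapter:
    name = "ollama_local"
    def run(self, prompt: str) -> str:
        # A real HTTP adapter would call the Ollama API at localhost:11434
        return f"[ollama_local] {prompt}"

class Orchestrator:
    def __init__(self, adapters):
        self._adapters = {adapter.name: adapter for adapter in adapters}

    def route_task(self, prompt: str, backend: str) -> str:
        # Backend selection heuristics are described under "Task Classification"
        return self._adapters[backend].run(prompt)

orchestrator = Orchestrator([GeminiCLIAdapter(), OllamaAdapter()])
print(orchestrator.route_task("Explain this traceback", backend="ollama_local"))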

Supported Services

Local Services:

  • Gemini CLI: Large context window (2M tokens) - ideal for codebase analysis
  • Qwen CLI: Code-specialized - best for code review and generation
  • Ollama: Local inference - fast, low-latency queries

Network Services (LAN):

  • LM Studio: OpenAI-compatible API on a laptop
  • Ollama Remote: Distributed processing on a server

Installation

# Enter the repository directory
cd /Users/yayoboy/Documents/GitHub/oxide

# Install dependencies
uv sync

# Verify installation
uv run oxide-mcp --help

Configuration

Configure services in config/default.yaml:

services:
  gemini:
    type: cli
    executable: gemini
    enabled: true

  qwen:
    type: cli
    executable: qwen
    enabled: true

  ollama_local:
    type: http
    base_url: "http://localhost:11434"
    enabled: true
    default_model: "qwen2.5-coder:7b"
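
Network services follow the same pattern as http entries. The exact keys and service names for remote backends aren't shown here, so the snippet below is an assumption modeled on the ollama_local entry; check config/default.yaml for the authoritative schema.

services:
  ollama_remote:              # assumed name, mirrors ollama_local above
    type: http
    base_url: "http://192.168.1.100:11434"   # remote Ollama host on the LAN
    enabled: false

  lm_studio:                  # assumed name; LM Studio serves an OpenAI-compatible API
    type: http
    base_url: "http://192.168.1.50:1234"     # 1234 is LM Studio's default server port
    enabled: false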

Integration with Claude Code

Add to ~/.claude/settings.json:

{
  "mcpServers": {
    "oxide": {
      "command": "uv",
      "args": ["--directory", "/Users/yayoboy/Documents/GitHub/oxide", "run", "oxide-mcp"],
      "env": {
        "OXIDE_AUTO_START_WEB": "true"
      }
    }
  }
}

Note: Setting OXIDE_AUTO_START_WEB=true automatically starts the Web UI when the MCP server launches!

Quick Start

Launch All Services

Multiple ways to start Oxide:

# Option 1: Unified launcher (MCP + Web UI)
uv run oxide-all

# Option 2: Auto-start Web UI with MCP (set OXIDE_AUTO_START_WEB=true in settings.json)
uv run oxide-mcp

# Option 3: Shell script
./scripts/start_all.sh

# Option 4: Separate services
uv run oxide-mcp    # MCP server only
uv run oxide-web    # Web UI only

See AUTO_START_GUIDE.md for detailed auto-start configuration.

Usage

Once integrated with Claude, use the MCP tools:

# Intelligent task routing
Use oxide route_task to analyze this code for bugs

# Parallel codebase analysis
Use oxide analyze_parallel to analyze the ./src directory

# Check service status
Use oxide list_services to show available LLMs

Task Classification

Oxide automatically classifies tasks (see the sketch after this list):

  • CODEBASE_ANALYSIS (>20 files or >500KB) → Gemini
  • CODE_REVIEW ("review" keyword) → Qwen
  • CODE_GENERATION ("generate"/"write" keywords) → Qwen/Ollama
  • QUICK_QUERY (simple, no files) → Ollama Local
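
As a rough sketch, these heuristics boil down to a few checks on the prompt and the attached files. The function below is illustrative: the thresholds mirror the list above, but the name and return values are assumptions rather than Oxide's actual implementation.

# Illustrative only: thresholds mirror the list above, not Oxide's internals.
def classify(prompt: str, file_count: int = 0, total_bytes: int = 0) -> str:
    text = prompt.lower()
    if file_count > 20 or total_bytes > 500_000:
        return "CODEBASE_ANALYSIS"   # routed to Gemini (large context window)
    if "review" in text:
        return "CODE_REVIEW"         # routed to Qwen
    if "generate" in text or "write" in text:
        return "CODE_GENERATION"     # routed to Qwen or Ollama
    return "QUICK_QUERY"             # routed to local Ollama

print(classify("review this pull request"))           # CODE_REVIEW
print(classify("summarize the repo", file_count=42))  # CODEBASE_ANALYSIS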

Web Dashboard

Oxide includes a real-time web dashboard for monitoring and control:

# Start backend server
uv run oxide-web

# Start frontend (in another terminal)
cd oxide/web/frontend && npm install && npm run dev

Access at http://localhost:3000

Features:

  • Real-time service status monitoring
  • Task execution history
  • System metrics (CPU, memory)
  • WebSocket live updates
  • Service health checks

See WEB_UI_GUIDE.md for the complete setup guide.

Network Services Setup

Configure remote LLM services on your LAN:

# Setup Ollama on another machine
./scripts/setup_ollama_remote.sh --ip 192.168.1.100

# Setup LM Studio on laptop
./scripts/setup_lmstudio.sh --ip 192.168.1.50

# Test network services
uv run python scripts/test_network.py --all

# Scan network for services
uv run python scripts/test_network.py --scan 192.168.1.0/24

Development Status

Production Ready - MVP Complete!

  • [x] Project structure and dependencies
  • [x] Configuration system
  • [x] Adapter implementations (Gemini, Qwen, Ollama, LM Studio)
  • [x] Task classification and routing
  • [x] MCP server
  • [x] Web UI dashboard
  • [x] Network services support
  • [x] Real-time monitoring
  • [ ] Test suite
  • [ ] Production documentation

Requirements

  • Python 3.11+
  • uv package manager
  • Gemini CLI (optional)
  • Qwen CLI (optional)
  • Ollama (optional)

License

MIT

Author

yayoboy <esoglobine@gmail.com>
