Ollama MCP Server

A bridge that enables seamless integration of Ollama's local LLM capabilities into MCP-powered applications, allowing users to manage and run AI models locally with full API coverage.

By NightTrek

Tools

  • push: Push a model to a registry
  • list: List models
  • cp: Copy a model
  • rm: Remove a model
  • serve: Start Ollama server
  • create: Create a model from a Modelfile
  • show: Show information for a model
  • run: Run a model
  • pull: Pull a model from a registry
  • chat_completion: OpenAI-compatible chat completion API


🌟 Features

Complete Ollama Integration

  • Full API Coverage: Access all essential Ollama functionality through a clean MCP interface
  • OpenAI-Compatible Chat: Drop-in replacement for OpenAI's chat completion API
  • Local LLM Power: Run AI models locally with full control and privacy

Core Capabilities

  • 🔄 Model Management

    • Pull models from registries
    • Push models to registries
    • List available models
    • Create custom models from Modelfiles
    • Copy and remove models
  • 🤖 Model Execution

    • Run models with customizable prompts
    • Chat completion API with system/user/assistant roles
    • Configurable parameters (temperature, timeout)
    • Raw mode support for direct responses
  • 🛠 Server Control

    • Start and manage the Ollama server (see the sketch after this list)
    • View detailed model information
    • Error handling and timeout management
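
As a quick illustration, serve is invoked like any other MCP tool. A minimal sketch in the style of the usage examples below, assuming the tool takes no arguments (check the input schema the server actually exposes):

// Start the Ollama server through the MCP tool interface
await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "serve",
  arguments: {}
});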

🚀 Getting Started

Prerequisites

  • Ollama installed on your system (see the quick check below)
  • Node.js and npm/pnpm
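
To confirm Ollama is installed and reachable before wiring up the server, a quick command-line check (assuming the default endpoint):

ollama --version
curl http://127.0.0.1:11434/api/tags   # lists locally installed models when the server is running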

Installation

  1. Install dependencies:

pnpm install

  2. Build the server:

pnpm run build

Configuration

Add the server to your MCP configuration:

For Claude Desktop:

macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%/Claude/claude_desktop_config.json

{
  "mcpServers": {
    "ollama": {
      "command": "node",
      "args": ["/path/to/ollama-server/build/index.js"],
      "env": {
        "OLLAMA_HOST": "http://127.0.0.1:11434"
      }
    }
  }
}

The OLLAMA_HOST entry is optional; set it only if you need a non-default Ollama API endpoint. (JSON does not allow inline comments, so the note lives here rather than in the file.)

🛠 Usage Examples

Pull and Run a Model

// Pull a model
await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "pull",
  arguments: {
    name: "llama2"
  }
});

// Run the model
await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "run",
  arguments: {
    name: "llama2",
    prompt: "Explain quantum computing in simple terms"
  }
});
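
The feature list above also mentions raw mode for direct responses. A hedged sketch, assuming run exposes this as a raw flag (verify against the tool's input schema):

// Run the model in raw mode, bypassing prompt templating (flag name assumed)
await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "run",
  arguments: {
    name: "llama2",
    prompt: "Explain quantum computing in simple terms",
    raw: true
  }
});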

Chat Completion (OpenAI-compatible)

await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "chat_completion",
  arguments: {
    model: "llama2",
    messages: [
      {
        role: "system",
        content: "You are a helpful assistant."
      },
      {
        role: "user",
        content: "What is the meaning of life?"
      }
    ],
    temperature: 0.7
  }
});

Create Custom Model

await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "create",
  arguments: {
    name: "custom-model",
    modelfile: "./path/to/Modelfile"
  }
});
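
The modelfile argument points at a Modelfile on disk. A minimal example in Ollama's standard Modelfile format (illustrative, not part of this repository):

FROM llama2
PARAMETER temperature 0.7
SYSTEM You are a concise technical assistant.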

🔧 Advanced Configuration

  • OLLAMA_HOST: Configure a custom Ollama API endpoint (default: http://127.0.0.1:11434; see the example below)
  • Timeout settings for model execution (default: 60 seconds)
  • Temperature control for response randomness (range 0-2)
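
For example, to point the server at an Ollama instance on another machine, set OLLAMA_HOST when launching it (the host below is a placeholder):

OLLAMA_HOST=http://192.168.1.50:11434 node build/index.js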

🤝 Contributing

Contributions are welcome! Feel free to:

  • Report bugs
  • Suggest new features
  • Submit pull requests

📝 License

MIT License - feel free to use in your own projects!


Built with ❤️ for the MCP ecosystem
