MCP Ollama Consult Server
An MCP (Model Context Protocol) server that allows consulting with Ollama models for reasoning from alternative viewpoints.
Features
- consult_ollama: Send prompts to Ollama models and get responses
- list_ollama_models: List available models on the local Ollama instance
- remember_consult: Store consult results via a configured memory MCP server, or a local file store as a fallback
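The tools ultimately talk to Ollama's standard HTTP API (directly or via a client library). A rough TypeScript sketch of the equivalent raw calls, for orientation only (not the server's actual source; model name and prompt are placeholders):

// list_ollama_models corresponds to GET /api/tags on the Ollama instance
const base = process.env.OLLAMA_BASE_URL ?? "http://localhost:11434";
const tags = await (await fetch(`${base}/api/tags`)).json();
console.log(tags.models.map((m: { name: string }) => m.name));

// consult_ollama corresponds to POST /api/generate with a prompt
const res = await fetch(`${base}/api/generate`, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ model: "llama3", prompt: "Argue the opposite position.", stream: false }),
});
const { response } = await res.json();
console.log(response);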
Installation
- Ensure you have Node.js installed
- Install dependencies and build:
npm i
npm run build
Usage
Make sure Ollama is running locally (default: http://localhost:11434).
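You can verify the endpoint is reachable by listing the installed models, for example:
curl http://localhost:11434/api/tags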
Start the MCP server:
npm start
Or for development:
npm run dev
Configuration
Set the OLLAMA_BASE_URL environment variable to change the Ollama endpoint:
OLLAMA_BASE_URL=http://your-ollama-server:11434 npm start
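When registering the server with an MCP client, the same variable can be set in the server entry. The exact file and schema depend on the client; a typical JSON entry (paths are placeholders) looks roughly like this:

{
  "mcpServers": {
    "ollama-consult": {
      "command": "node",
      "args": ["/path/to/mcp-consult/dist/index.js"],
      "env": { "OLLAMA_BASE_URL": "http://localhost:11434" }
    }
  }
}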
Memory (remember_consult) configuration
The server implements a remember_consult tool that will try, in order:
- REMEMBER_MCP_CONFIG environment variable (JSON config or simple command string)
- VS Code mcp.json entries (looks for a server key containing remember, then memory)
- MEMORY_MCP_CMD / MEMORY_MCP_ARGS environment variables
- Falls back to a local file store at MEMORY_DIR or /tmp/mcp-consult-memory
Examples:
Use a simple stdio command via the environment variable (the command and args are passed as JSON, so they need no extra shell quoting):
REMEMBER_MCP_CONFIG='{"type":"stdio","command":"/usr/bin/node","args":["/path/to/memory-server.js"]}' npm start
Or point to a memory server described in your VS Code mcp.json (the server keys remember/memory are automatically detected).
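For instance, an mcp.json entry like the following would be picked up because its key is memory (schema and paths are illustrative):

{
  "servers": {
    "memory": {
      "type": "stdio",
      "command": "node",
      "args": ["/path/to/memory-server.js"]
    }
  }
}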
If no memory MCP is available, the tool will write JSON observations to MEMORY_DIR (default /tmp/mcp-consult-memory).
Demo client
There is a simple demo client that spawns the server over stdio and exercises the tools (list, consult, compare, remember):
npm run build
npm run demo
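If you want to script the server yourself, a minimal stdio client can be sketched with the official MCP TypeScript SDK (assumed dependency; the consult_ollama argument names below are guesses, check the schema returned by listTools):

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the built server (run `npm run build` first) over stdio.
const transport = new StdioClientTransport({ command: "node", args: ["dist/index.js"] });
const client = new Client({ name: "consult-demo", version: "0.1.0" }, { capabilities: {} });
await client.connect(transport);

// Discover the tools the server exposes.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

// Call one of them; argument names here are illustrative.
const result = await client.callTool({
  name: "consult_ollama",
  arguments: { model: "llama3", prompt: "Steelman the opposing view." },
});
console.log(JSON.stringify(result, null, 2));

await client.close();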
Tests & CI
Unit tests (Vitest) and a GitHub Actions CI workflow are included. Run the tests locally with:
npm install
npm test
Docker
To run with Docker, build an image using a Dockerfile like the following:
FROM node:18-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --only=production
COPY dist/ ./dist/
CMD ["node", "dist/index.js"]
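The Dockerfile copies dist/, so build the project first. Then something along these lines should work (the image name is arbitrary; host.docker.internal assumes Docker Desktop, use your host's address otherwise; -i keeps stdin open for the stdio transport):
npm run build
docker build -t mcp-ollama-consult .
docker run -i --rm -e OLLAMA_BASE_URL=http://host.docker.internal:11434 mcp-ollama-consult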
Requirements
- Node.js 18+
- Ollama running locally or accessible via HTTP