๐Ÿ” ๐Ÿค– ๐ŸŒ Ollama Chat with MCP

A powerful demonstration of integrating local LLMs with real-time web search capabilities using the Model Context Protocol (MCP).

Overview

Ollama Chat with MCP showcases how to extend a local language model's capabilities through tool use. This application combines the power of locally running LLMs via Ollama with up-to-date web search functionality provided by an MCP server.

The project consists of three main components:

  • MCP Web Search Server: Provides web search functionality using the Serper.dev API
  • Terminal Client: A CLI interface for chat and search interactions
  • Web Frontend: A user-friendly Gradio-based web interface

By using this architecture, the application demonstrates how MCP enables local models to access external tools and data sources, significantly enhancing their capabilities.

Features

  • 🔎 Web-enhanced chat: Access real-time web search results during conversation
  • 🧠 Local model execution: Uses Ollama to run models entirely on your own hardware
  • 🔌 MCP integration: Demonstrates practical implementation of the Model Context Protocol
  • 🌐 Dual interfaces: Choose between terminal CLI or web-based GUI
  • 📊 Structured search results: Clean formatting of web search data for optimal context
  • 🔄 Conversation memory: Maintains context throughout the chat session

Requirements

  • Python 3.11+
  • Ollama installed and running locally
  • A Serper.dev API key (free tier available)
  • Internet connection for web searches

Installation

  1. Clone the repository:

    git clone https://github.com/redbuilding/ollama-chat-with-mcp.git
    cd ollama-chat-with-mcp
    
  2. Install dependencies:

    pip install -r requirements.txt
    
  3. Create a .env file in the project root with your Serper.dev API key:

    SERPER_API_KEY=your_serper_api_key_here
    
  4. Ensure Ollama is installed and that the model hardcoded in the chat clients is available (default: qwen2.5:14b):

    ollama pull qwen2.5:14b
    

Usage

Starting the Web Interface

To use the web-based interface:

python chat_frontend.py

This will start the Gradio web interface, typically accessible at http://localhost:7860

Using the Terminal Client

To use the command-line interface:

python chat_client.py

Search Commands

In both interfaces, you can use special commands to trigger web searches:

  • Search and summarize: #search for "financial market outlook April 2025"
  • Search and answer a question: #search for "reality TV this week" and what happened recently?
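
How the clients detect these commands is not spelled out above, so the following is only a rough sketch of the kind of parsing involved; the pattern and function name are illustrative, not taken from the repo's code:

    import re

    # Matches: #search for "some query" [optional follow-up question]
    SEARCH_PATTERN = re.compile(r'^#search for\s+"([^"]+)"\s*(.*)$', re.IGNORECASE)

    def parse_search_command(user_input: str):
        """Return (query, follow_up) if the input is a #search command, else None."""
        match = SEARCH_PATTERN.match(user_input.strip())
        if match is None:
            return None
        query, follow_up = match.group(1), match.group(2).strip()
        return query, (follow_up or None)

    # parse_search_command('#search for "financial market outlook April 2025"')
    #   -> ("financial market outlook April 2025", None)
    # parse_search_command('#search for "reality TV this week" and what happened recently?')
    #   -> ("reality TV this week", "and what happened recently?")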

Other Commands

  • Clear conversation history: #clear
  • Exit the application: exit or quit

How It Works

  1. The MCP server exposes a web search capability as a tool (a sketch of such a server follows these steps)
  2. When a user requests search information, the client sends a query to the MCP server
  3. The server processes the request through Serper.dev and returns formatted results
  4. The client constructs an enhanced prompt including the search results
  5. The local Ollama model receives this prompt and generates an informed response
  6. The response is displayed to the user with search attribution
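
As a concrete illustration of steps 1–3, here is a minimal sketch of such a server. It assumes the MCP Python SDK's FastMCP helper and Serper.dev's /search endpoint; the tool name, parameters, and result formatting are illustrative and not necessarily what server.py does:

    import os

    import httpx
    from dotenv import load_dotenv
    from mcp.server.fastmcp import FastMCP

    load_dotenv()  # picks up SERPER_API_KEY from the .env file

    mcp = FastMCP("web-search")

    @mcp.tool()
    async def web_search(query: str, max_results: int = 5) -> str:
        """Search the web via Serper.dev and return formatted results."""
        async with httpx.AsyncClient() as client:
            resp = await client.post(
                "https://google.serper.dev/search",
                headers={"X-API-KEY": os.environ["SERPER_API_KEY"]},
                json={"q": query},
            )
        resp.raise_for_status()
        results = resp.json().get("organic", [])[:max_results]
        # Format each hit as title / link / snippet so the model gets clean context
        return "\n\n".join(
            f"{r.get('title', '')}\n{r.get('link', '')}\n{r.get('snippet', '')}"
            for r in results
        )

    if __name__ == "__main__":
        mcp.run()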

File Structure

  • server.py - MCP server with web search tool
  • chat_client.py - Terminal-based chat client
  • chat_frontend.py - Gradio web interface client
  • requirements.txt - Project dependencies
  • .env - Configuration for API keys (create this file and add your Serper.dev API key)

Customization

  • Change the Ollama model by modifying the model name in the chat client files (see the sketch after this list)
  • Adjust the number of search results by changing the max_results parameter
  • Modify the prompt templates to better suit your specific use case
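
The sketch below shows where the model name and prompt template might live on the client side (steps 4–5 of the flow above). The constant names, template, and helper function are assumptions for illustration; only the default model name comes from this README, and the ollama Python package is an assumed dependency:

    import ollama  # assumes the official ollama Python package

    MODEL_NAME = "qwen2.5:14b"  # swap this to use a different Ollama model

    # Illustrative prompt template combining search results with the user's question
    SEARCH_PROMPT_TEMPLATE = (
        "Use the following web search results to answer the question.\n\n"
        "Search results:\n{results}\n\n"
        "Question: {question}"
    )

    def ask_with_search(question: str, results: str, history: list[dict]) -> str:
        """Send an enhanced prompt (search results + question) to the local Ollama model."""
        messages = history + [{
            "role": "user",
            "content": SEARCH_PROMPT_TEMPLATE.format(results=results, question=question),
        }]
        response = ollama.chat(model=MODEL_NAME, messages=messages)
        return response["message"]["content"]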

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

This project is licensed under the MIT License - see the LICENSE file for details.
