🤖 mcp-ollama-beeai
A minimal agentic app to interact with local Ollama models, leveraging multiple MCP server tools via the BeeAI framework.
Below is a sample visual of this client app's chat interface, showing a Postgres database operation along with the thinking steps the AI took to pick the right MCP agent and to transform the request and response with the LLM:
<video controls loop muted poster="https://raw.githubusercontent.com/tamdilip/mcp-ollama-beeai/docs/demo-pic.png" src="https://github.com/user-attachments/assets/618b76b5-111c-493f-a0fe-d974b915d619" title="Demo Video"></video>
Usage
📋 Prerequisites
1. Local Ollama server
Install and serve Ollama on your local machine with the following commands.
- Make sure you have enough memory available, at least 16 GB of RAM, for models to perform well.
- Skip this local installation if you're going to use a remote server for the model.
$ curl -fsSL https://ollama.com/install.sh | sh
$ ollama serve
$ ollama pull llama3.1
2. MCP servers list configuration
Add your MCP agents in the mcp-servers.json
file in the root folder for the app to pick up and use along with the LLM.
- The default servers included are postgres and fetch.
Make sure to update your Postgres connection URL.
- A list of other MCP agent tools available for configuration: https://modelcontextprotocol.io/examples
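For illustration, an mcp-servers.json following the common MCP client configuration schema might look like the sketch below. The connection URL is a placeholder, and the exact schema this app expects may differ slightly, so check the default file shipped in the repo root:

```json
{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgresql://user:password@localhost:5432/mydb"
      ]
    },
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  }
}
```

Each entry names an MCP server and the command used to launch it; the app spawns these processes and exposes their tools to the LLM.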
3. .env
If you want to use a different LLM model or LLM server, override the properties below before npm start:
OLLAMA_CHAT_MODEL=llama3.1
OLLAMA_BASE_URL=http://localhost:11434/api
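For example, to point the app at a remote Ollama host with a different model for a single run (the hostname and model name below are placeholders):

```shell
$ OLLAMA_CHAT_MODEL=mistral OLLAMA_BASE_URL=http://remote-ollama-host:11434/api npm start
```

Setting the variables inline like this overrides the defaults only for that invocation; put them in the .env file to make the change persistent.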
🎮 Boot up your app
$ git clone https://github.com/tamdilip/mcp-ollama-beeai.git
$ cd mcp-ollama-beeai
$ npm i
$ npm start
Once the app is up and running, open http://localhost:3000 in your browser.
Additional Context:
- By default, on landing, no MCP agent is referenced for questions.
- The MCP agent to use for a question can be selected from the Server & tools dropdown in the UI.
- The BeeAI framework is used for easy setup of a ReAct (Reason and Act) agent with MCP tools.
- The Markdown JS library is used to render the responses in a properly readable visual format.
Happy coding :) !!