
Author: jsmiff
# MCP Base - A Generic Model Context Protocol Framework
This folder contains a general-purpose base implementation of the Model Context Protocol (MCP) for building AI-powered applications. It provides a standardized way to create MCP servers and clients that can be used to integrate LLMs into your applications.
## 📋 Features
- **Standardized MCP Server**: A base server implementation with support for HTTP and stdio transports
- **Generic MCP Client**: A client for connecting to any MCP server
- **Ollama Integration**: Ready-to-use services for generating embeddings and text with Ollama
- **Supabase Integration**: Built-in support for the Supabase vector database
- **Modular Design**: Clearly organized structure for resources, tools, and prompts
- **Sample Templates**: Example implementations to help you get started quickly
## 🛠️ Directory Structure

```
_mcp-base/
├── server.ts                     # Main MCP server implementation
├── client.ts                     # Generic MCP client
├── utils/                        # Utility services
│   ├── ollama_embedding.ts       # Embedding generation with Ollama
│   └── ollama_text_generation.ts # Text generation with Ollama
├── tools/                        # Tool implementations
│   └── sample-tool.ts            # Example tool template
├── resources/                    # Resource implementations
│   └── sample-resource.ts        # Example resource template
├── prompts/                      # Prompt implementations
│   └── sample-prompt.ts          # Example prompt template
└── README.md                     # This documentation
```
## 🚀 Getting Started

### Prerequisites
- Node.js and npm/pnpm
- Ollama for local embedding and text generation
- Supabase account for vector storage
### Environment Setup

Create a `.env` file with the following variables:
```env
PORT=3000
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_SERVICE_KEY=your-service-key
OLLAMA_URL=http://localhost:11434
OLLAMA_EMBED_MODEL=nomic-embed-text
OLLAMA_LLM_MODEL=llama3
SERVER_MODE=http # 'http' or 'stdio'
```
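The variables above can be loaded and validated before the server starts. The sketch below is illustrative (a `loadConfig` helper is not part of the template): it applies the same defaults shown above and fails fast when a required Supabase variable is missing.

```typescript
// Illustrative config loader: applies the defaults from the .env example
// above and throws when a required Supabase variable is missing.
type ServerConfig = {
  port: number;
  supabaseUrl: string;
  supabaseServiceKey: string;
  ollamaUrl: string;
  ollamaEmbedModel: string;
  ollamaLlmModel: string;
  serverMode: "http" | "stdio";
};

function loadConfig(env: Record<string, string | undefined>): ServerConfig {
  const required = (key: string): string => {
    const value = env[key];
    if (!value) throw new Error(`Missing required environment variable: ${key}`);
    return value;
  };
  const mode = env.SERVER_MODE ?? "http";
  if (mode !== "http" && mode !== "stdio") {
    throw new Error(`SERVER_MODE must be 'http' or 'stdio', got '${mode}'`);
  }
  return {
    port: Number(env.PORT ?? "3000"),
    supabaseUrl: required("SUPABASE_URL"),
    supabaseServiceKey: required("SUPABASE_SERVICE_KEY"),
    ollamaUrl: env.OLLAMA_URL ?? "http://localhost:11434",
    ollamaEmbedModel: env.OLLAMA_EMBED_MODEL ?? "nomic-embed-text",
    ollamaLlmModel: env.OLLAMA_LLM_MODEL ?? "llama3",
    serverMode: mode as "http" | "stdio",
  };
}
```

In a real setup you would call `loadConfig(process.env)` once at startup.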
### Server Initialization
- Import the required modules
- Register your resources, tools, and prompts
- Start the server
```typescript
// Import base server and utilities
import server from "./server";
import { registerSampleResources } from "./resources/sample-resource";
import { registerSampleTool } from "./tools/sample-tool";
import { registerSamplePrompts } from "./prompts/sample-prompt";

// Initialize database if needed
async function initializeDatabase() {
  // Your database initialization logic
}

// Register your components (supabase, textGenerator, and embeddings are
// constructed in your own setup code, e.g. with @supabase/supabase-js and
// the helpers in utils/)
registerSampleResources(server, supabase);
registerSampleTool(server, textGenerator, embeddings, supabase);
registerSamplePrompts(server, supabase);

// Start the server
startServer();
```
### Client Usage

```typescript
import MCPClient from "./client";

// Create a client instance
const client = new MCPClient({
  serverUrl: "http://localhost:3000",
});

// Example: Call a tool
async function callSampleTool() {
  const result = await client.callTool("sample-tool", {
    query: "example query",
    maxResults: 5,
  });
  console.log(result);
}

// Example: Read a resource
async function readResource() {
  const items = await client.readResource("items://all");
  console.log(items);
}

// Example: Get a prompt
async function getPrompt() {
  const prompt = await client.getPrompt("simple-prompt", {
    task: "Explain quantum computing",
  });
  console.log(prompt);
}

// Don't forget to disconnect when done
await client.disconnect();
```
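Tool calls go over the network, so transient failures are possible. The helper below is an illustrative sketch (not part of the template): `callToolWithRetry` is a hypothetical name, and `MCPClient` is reduced to the one method the sketch needs.

```typescript
// Illustrative retry helper for transient tool-call failures.
interface ToolCaller {
  callTool(name: string, args: Record<string, unknown>): Promise<unknown>;
}

async function callToolWithRetry(
  client: ToolCaller,
  name: string,
  args: Record<string, unknown>,
  maxAttempts = 3,
): Promise<unknown> {
  let lastError: unknown;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await client.callTool(name, args);
    } catch (err) {
      // Remember the failure and try again (a real helper might back off here).
      lastError = err;
    }
  }
  throw lastError;
}
```

Usage: `await callToolWithRetry(client, "sample-tool", { query: "example query" })`.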
## 📚 Extending the Framework
### Creating a New Tool

- Create a new file in the `tools/` directory
- Define your tool function and schema using Zod
- Implement your tool logic
- Register the tool in your server
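The steps above can be sketched as follows. This is illustrative, not the template's actual code: the tool name and handler are hypothetical, the real framework validates arguments with Zod schemas, and the server is reduced here to a structural interface so the handler can be shown on its own.

```typescript
// Illustrative tool module; the server is a structural stand-in.
interface ToolServer {
  tool(
    name: string,
    handler: (args: Record<string, unknown>) => Promise<{ content: { type: string; text: string }[] }>,
  ): void;
}

// Hypothetical search tool: checks its inputs by hand and caps results.
async function searchHandler(args: Record<string, unknown>) {
  const query = String(args.query ?? "");
  const maxResults = Math.min(Number(args.maxResults ?? 5), 20);
  // A real implementation would embed `query` and search Supabase here.
  const results = Array.from({ length: maxResults }, (_, i) => `${query}: result ${i + 1}`);
  return { content: [{ type: "text", text: results.join("\n") }] };
}

function registerSearchTool(server: ToolServer) {
  server.tool("search", searchHandler);
}
```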
### Creating a New Resource

- Create a new file in the `resources/` directory
- Define your resource endpoints and schemas
- Implement your resource logic
- Register the resource in your server
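As an illustrative sketch (names and data hypothetical, server reduced to a stand-in interface): a resource is addressed by URI, matching the `items://all` URI the client example reads. A real implementation would query Supabase rather than an in-memory list.

```typescript
// Illustrative resource module serving the items://all URI.
interface ResourceServer {
  resource(uri: string, handler: () => Promise<unknown>): void;
}

// Hypothetical in-memory data; a real resource would query Supabase.
const items = [
  { id: 1, name: "first item" },
  { id: 2, name: "second item" },
];

async function readAllItems() {
  return { uri: "items://all", items };
}

function registerItemsResource(server: ResourceServer) {
  server.resource("items://all", readAllItems);
}
```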
### Creating a New Prompt

- Create a new file in the `prompts/` directory
- Define your prompt schema and parameters
- Implement your prompt template
- Register the prompt in your server
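An illustrative sketch of a prompt module, matching the `simple-prompt` name used in the client example; the template wording and the reduced server interface are hypothetical.

```typescript
// Illustrative prompt module: a parameterized template for "simple-prompt".
interface PromptServer {
  prompt(name: string, build: (args: Record<string, string>) => string): void;
}

function buildSimplePrompt(args: Record<string, string>): string {
  const task = args.task ?? "the task at hand";
  return [
    "You are a helpful assistant.",
    `Task: ${task}`,
    "Answer clearly and concisely.",
  ].join("\n");
}

function registerSimplePrompt(server: PromptServer) {
  server.prompt("simple-prompt", buildSimplePrompt);
}
```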
## 📄 License
MIT