
# RAGmonsters Custom PostgreSQL MCP Server

A domain-specific MCP server that provides optimized API access to the RAGmonsters fictional monster dataset, enabling more efficient and secure interactions than generic SQL queries.
## Overview
This repository demonstrates a more advanced approach to integrating Large Language Models (LLMs) with databases using the Model Context Protocol (MCP). While generic MCP PostgreSQL servers allow LLMs to explore databases through raw SQL queries, this project takes a different approach by creating a custom MCP server that provides a domain-specific API tailored to the application's needs.
This implementation uses FastMCP, a high-performance implementation of the Model Context Protocol, which provides improved efficiency and reliability for tool-based interactions with LLMs.
This project uses the RAGmonsters dataset as its foundation. RAGmonsters is an open-source project that provides a rich, fictional dataset of monsters with various attributes, abilities, and relationships - specifically designed for demonstrating and testing Retrieval-Augmented Generation (RAG) systems.
## The Problem with Generic MCP Database Access
Generic MCP PostgreSQL servers provide LLMs with a `query` tool that allows them to:
- Explore database schemas
- Formulate SQL queries based on natural language questions
- Execute those queries against the database
While this approach works, it has several limitations for real-world applications:
- Cognitive Load: The LLM must understand the entire database schema
- Inefficiency: Multiple SQL queries are often needed to answer a single question
- Security Concerns: Raw SQL access requires careful prompt engineering to prevent injection attacks
- Performance: Complex queries may be inefficient if the LLM doesn't understand the database's indexing strategy
- Domain Knowledge Gap: The LLM lacks understanding of business rules and domain-specific constraints
## About the RAGmonsters Dataset
RAGmonsters is an open dataset specifically designed for testing and demonstrating Retrieval-Augmented Generation (RAG) systems. It contains information about fictional monsters with rich attributes, abilities, and relationships - making it perfect for natural language querying demonstrations.
The PostgreSQL version of RAGmonsters provides a well-structured relational database with multiple tables and relationships, including:
- Monsters with various attributes (attack power, defense, health, etc.)
- Abilities that monsters can possess
- Elements (fire, water, earth, etc.) with complex relationships
- Habitats where monsters can be found
- Evolution chains and relationships between monsters
This rich, interconnected dataset is ideal for demonstrating the power of domain-specific APIs versus generic SQL access.
## Our Solution: Domain-Specific MCP API
This project demonstrates how to build a custom MCP server that provides a higher-level, domain-specific API for the RAGmonsters dataset. Instead of exposing raw SQL capabilities, our MCP server offers purpose-built functions that:
- Abstract Database Complexity: Hide the underlying schema and SQL details
- Provide Domain-Specific Operations: Offer functions that align with business concepts
- Optimize for Common Queries: Implement efficient query patterns for frequently asked questions
- Enforce Business Rules: Embed domain-specific logic and constraints
- Improve Security: Limit the attack surface by removing direct SQL access
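To make the idea concrete, here is a minimal sketch of what one such domain-specific tool could look like, shaped like the descriptor objects FastMCP's `addTool()` accepts. The tool name `getMonsters` comes from this project; the SQL fragment, column names, and the stubbed query function are illustrative assumptions so the sketch runs without a database (the real server would pass a zod schema and a live PostgreSQL client).

```javascript
// Stubbed database call so this sketch is self-contained; the real server
// would run a parameterized query against PostgreSQL instead.
const fakeQuery = async (sql, params) => [
  { name: "Emberclaw", attack_power: 95 },
  { name: "Stonehide", attack_power: 88 },
];

// Hypothetical tool descriptor, similar in shape to FastMCP's addTool() input.
const getMonstersTool = {
  name: "getMonsters",
  description: "List monsters with optional filtering, sorting, and pagination",
  // Plain metadata here; the real implementation would use a zod schema.
  parameters: { vulnerableTo: "string?", sortBy: "string?", limit: "number?" },
  execute: async (args, query = fakeQuery) => {
    // One parameterized query replaces the multi-step schema exploration
    // an LLM would otherwise perform with raw SQL access.
    const rows = await query(
      "SELECT m.name, m.attack_power FROM monsters m /* joins elided */ LIMIT $2",
      [args.vulnerableTo, args.limit ?? 10]
    );
    return JSON.stringify({ count: rows.length, monsters: rows });
  },
};
```

Because the tool returns a single JSON string, the LLM never sees the schema, the joins, or the SQL at all.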
### Example: Domain-Specific API vs. Generic SQL
**Generic MCP PostgreSQL Approach:**

```
User: "What are the top 3 monsters with the highest attack power that are vulnerable to fire?"

LLM: (must understand the schema, joins, and SQL syntax)
1. First query to explore the schema
2. Second query to find monsters by attack power
3. Third query to find vulnerabilities
4. Final query to join and filter the results
```
**Our Custom MCP Server Approach:**

```
User: "What are the top 3 monsters with the highest attack power that are vulnerable to fire?"

LLM: (uses our domain-specific API)
1. Single call: getMonsters({ vulnerableTo: "fire", sortBy: "attackPower", limit: 3 })
```
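The single-call pattern works because filter and sort options translate mechanically into one parameterized SQL statement. A minimal sketch of that translation (table and column names are assumptions, not the project's actual schema):

```javascript
// Build one parameterized SQL statement from getMonsters-style arguments.
// Placeholders ($1, $2, ...) keep user input out of the SQL text entirely.
function buildMonsterQuery({ vulnerableTo, sortBy = "name", limit = 10 } = {}) {
  // Whitelist sortable columns; never interpolate user input into ORDER BY.
  const allowedSorts = { name: "m.name", attackPower: "m.attack_power" };
  const orderBy = allowedSorts[sortBy] ?? "m.name";
  const params = [];
  let sql = "SELECT m.* FROM monsters m";
  if (vulnerableTo) {
    params.push(vulnerableTo);
    sql += " JOIN vulnerabilities v ON v.monster_id = m.id WHERE v.element = $1";
  }
  params.push(limit);
  sql += ` ORDER BY ${orderBy} DESC LIMIT $${params.length}`;
  return { sql, params };
}
```

The whitelist on `sortBy` is the key security detail: user-controlled values only ever travel through bound parameters, never through string interpolation.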
## Project Structure

```
├── .env.example              # Example environment variables
├── package.json              # Node.js project configuration
├── README.md                 # This documentation
├── img/                      # Images for documentation
├── scripts/
│   ├── testMcpServer.js      # Test script for the MCP server
│   └── testLogger.js         # Logger for test script
└── src/
    ├── index.js              # Main application server
    ├── mcp-server/           # Custom MCP server implementation with FastMCP
    │   ├── index.js          # Server entry point
    │   ├── tools/            # Domain-specific tools
    │   │   ├── index.js      # Tool registration
    │   │   └── monsters.js   # Monster-related operations
    │   └── utils/            # Helper utilities
    │       └── logger.js     # Logging functionality
    ├── llm.js                # LangChain integration for LLM
    └── public/               # Web interface files
        └── index.html        # Chat interface
```
## Features
- Custom MCP Server with FastMCP: High-performance domain-specific API for RAGmonsters data
- Optimized Queries: Pre-built efficient database operations
- Business Logic Layer: Domain rules and constraints embedded in the API
- Structured Response Format: Consistent JSON responses for LLM consumption
- Comprehensive Logging: Detailed logging for debugging and monitoring
- Test Suite: Scripts to verify server functionality
## Planned Features
- LangChain.js Integration: For LLM interactions
- Web Interface: Simple chat interface to interact with the data
- Deployment on Clever Cloud: Easy deployment instructions
## Benefits of This Approach
- Improved Performance: Optimized queries and caching strategies
- Better User Experience: More accurate and faster responses
- Reduced Token Usage: LLM doesn't need to process complex SQL or schema information
- Enhanced Security: No direct SQL access means reduced risk of injection attacks
- Maintainability: Changes to the database schema don't require retraining the LLM
- Scalability: Can handle larger and more complex databases
## Getting Started

### Installation

1. Clone this repository
2. Install dependencies:

   ```bash
   npm install
   ```

3. Copy `.env.example` to `.env` and configure your PostgreSQL connection string
4. Run the test script:

   ```bash
   node scripts/testMcpServer.js
   ```
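The authoritative variable names are in `.env.example`; a typical configuration might look like the following (the keys shown here are illustrative assumptions, not the project's actual names):

```
# Illustrative only - check .env.example for the real variable names
DATABASE_URL=postgres://user:password@localhost:5432/ragmonsters
OPENAI_API_KEY=your-api-key
PORT=8080
```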
### Available Tools

The MCP server provides the following tools:

- `getMonsters` - Get a list of monsters with optional filtering, sorting, and pagination
  - Parameters: filters (category, habitat, rarity), sort (field, direction), limit, offset
- `getMonsterById` - Get detailed information about a specific monster by ID
  - Parameters: monsterId
- `add` - Simple utility that adds two numbers (for testing)
  - Parameters: a, b
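Since tool results reach the LLM as text, wrapping every tool in a consistent JSON envelope lets the model parse success and failure the same way across all three tools. A small sketch of such a helper (the `ok`/`data`/`error` envelope fields are an assumption, not the server's documented format):

```javascript
// Wrap any tool's execute function so it always returns the same JSON envelope.
function toolResponse(fn) {
  return async (args) => {
    try {
      const data = await fn(args);
      return JSON.stringify({ ok: true, data });
    } catch (err) {
      // Surface a readable message to the LLM instead of a stack trace.
      return JSON.stringify({ ok: false, error: String(err.message ?? err) });
    }
  };
}

// Example: the test-only `add` tool wrapped in the envelope.
const add = toolResponse(async ({ a, b }) => a + b);
```

With this in place, a failed database call and a successful lookup produce structurally identical responses, which keeps the LLM's parsing logic trivial.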
### Prerequisites
- Node.js 23 or later
- PostgreSQL database with RAGmonsters data
- Access to an LLM API (e.g., OpenAI)
- FastMCP package (included in dependencies)
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Acknowledgments
- RAGmonsters for the sample dataset
- Model Context Protocol for the MCP specification
- FastMCP for the high-performance MCP implementation
- Clever Cloud for hosting capabilities