Personal Library MCP Server
A demo server that allows AI models to manage a personal reading list stored in a local SQLite database. It provides tools for searching, adding, and updating books while demonstrating core Model Context Protocol features like resources and tools.
Personal Library MCP Server Demo
What is this?
This is a functional Model Context Protocol (MCP) server built with the FastMCP framework. It provides a structured interface for an AI model to interact with a local SQLite database that tracks a personal reading list.
Why use it?
This project is a deliberately simple demo for seeing the Model Context Protocol (MCP) in action. It serves as a minimal, "Hello World"-style example covering the basics of:
- Resources: Exposing data (like a list of books) as readable URI-based resources.
- Tools: Providing actionable functions (like adding or searching books) that an AI can call.
- Client-Server Communication: Demonstrating how a client and server interact using the standard stdio transport.
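Under the hood, the stdio transport exchanges newline-delimited JSON-RPC 2.0 messages over the server process's stdin and stdout. As a rough sketch, a tools/call exchange for this demo's add_book tool might look like the following (the argument names and result text are assumptions, not the project's actual schema):

```python
import json

# The client invokes a tool by writing one JSON-RPC 2.0 request,
# serialized as a single line, to the server's stdin.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "add_book",  # tool name as registered in server.py
        "arguments": {"title": "Dune", "author": "Frank Herbert"},  # assumed schema
    },
}
wire_line = json.dumps(request) + "\n"  # newline-delimited framing

# The server answers on stdout with a response carrying the same id.
response_line = (
    '{"jsonrpc": "2.0", "id": 1, '
    '"result": {"content": [{"type": "text", "text": "Added Dune"}]}}'
)
response = json.loads(response_line)
print(response["result"]["content"][0]["text"])  # the tool's text result
```

The smoke test in main.py drives exactly this kind of exchange for you, so you never have to hand-craft these messages.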
Getting Started
Prerequisites
- uv installed on your system.
- Python 3.10 or later.
Setup and Running the Demo
1. Initialize the Database: Create the SQLite database and populate it with sample data:

   uv run python init_db.py

2. Run the Smoke Test: This script acts as a smoke test for your MCP server. It starts the server in the background and simulates how an AI model would interact with it (reading resources, calling tools) without needing an actual AI model connected:

   uv run python main.py
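The database setup is plain sqlite3. The schema below is an illustrative assumption rather than the project's actual one, but it matches the fields the sample prompts later exercise (genre, year, read/owned status):

```python
import sqlite3

def init_db(path: str = "books.db") -> sqlite3.Connection:
    """Create the books table and seed it with sample data (hypothetical schema)."""
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS books (
               id     INTEGER PRIMARY KEY,
               title  TEXT NOT NULL,
               author TEXT NOT NULL,
               genre  TEXT,
               year   INTEGER,
               status TEXT DEFAULT 'unread',   -- 'unread' | 'read'
               owned  INTEGER DEFAULT 0        -- 0 = wishlist, 1 = owned
           )"""
    )
    conn.executemany(
        "INSERT INTO books (title, author, genre, year) VALUES (?, ?, ?, ?)",
        [
            ("Dune", "Frank Herbert", "Sci-Fi", 1965),
            ("The Martian", "Andy Weir", "Sci-Fi", 2011),
        ],
    )
    conn.commit()
    return conn
```

Because everything lives in a single local file, deleting books.db and re-running init_db.py gives you a clean slate at any time.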
Project Structure
- server.py: The MCP server implementation using FastMCP.
- main.py: A smoke test script that demonstrates how to interact with the server.
- init_db.py: A setup script to create the local books.db SQLite database.
- pyproject.toml: Project configuration and dependencies (managed by uv).
- books.db: The local SQLite database (created after running init_db.py).
Using as a Tool with AI Assistants
You can connect this server to any MCP-compatible client. In the examples below, replace /Users/sanka/Documents/workspace/mcp-demo with the absolute path to your own clone of this project.
1. Gemini CLI
You can add the server automatically using the Gemini CLI:
gemini mcp add --scope project personal-library uv --directory $(pwd) run python server.py
Or manually add this to .gemini/settings.json:
{
  "mcpServers": {
    "personal-library": {
      "command": "uv",
      "args": ["--directory", "/Users/sanka/Documents/workspace/mcp-demo", "run", "python", "server.py"],
      "trust": true
    }
  }
}
2. Claude Desktop
Add this to your claude_desktop_config.json (typically in ~/Library/Application Support/Claude/ on macOS):
{
  "mcpServers": {
    "personal-library": {
      "command": "uv",
      "args": ["--directory", "/Users/sanka/Documents/workspace/mcp-demo", "run", "python", "server.py"]
    }
  }
}
3. Cline (VS Code Extension)
Open the MCP Settings in Cline or edit cline_mcp_settings.json:
{
  "mcpServers": {
    "personal-library": {
      "command": "uv",
      "args": ["--directory", "/Users/sanka/Documents/workspace/mcp-demo", "run", "python", "server.py"]
    }
  }
}
Sample Prompts for AI Agents
Once you've connected the server to your favorite AI assistant, try these prompts:
- List Resources: "What books are currently in my reading list?"
- Search: "Find 'The Martian' in my library." or "Do I have any books by Frank Herbert?"
- Add a Book: "Add 'Project Hail Mary' by Andy Weir to my library. It's a Sci-Fi book from 2021."
- Update Status: "I just finished reading 'Dune', can you mark it as read?" or "I just bought 'The Road', mark it as owned."
- Check Details: "Show me the full metadata for 'The Lord of the Rings'."
- Combined Task: "Look at my library and tell me which Sci-Fi books I haven't read yet."
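Prompts like these resolve to tool calls, which in turn run ordinary SQL against books.db. For example, a status-update tool behind "mark 'Dune' as read" might boil down to something like the sketch below (the mark_as_read helper and column names are hypothetical, not the project's actual code):

```python
import sqlite3

def mark_as_read(conn: sqlite3.Connection, title: str) -> int:
    """Flip a book's status to 'read'; returns the number of rows updated."""
    cur = conn.execute(
        "UPDATE books SET status = 'read' WHERE title = ?", (title,)
    )
    conn.commit()
    return cur.rowcount

# Tiny in-memory demo of the query an AI's tool invocation would trigger.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE books (title TEXT, status TEXT DEFAULT 'unread')")
conn.execute("INSERT INTO books (title) VALUES ('Dune')")
updated = mark_as_read(conn, "Dune")
```

Returning the row count lets the tool tell the AI whether the title was actually found, so the assistant can report "marked as read" versus "no such book" honestly.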
Naming Conventions
- Server Name (Configuration): The key used in settings.json (e.g., "personal-library") is a unique identifier for your AI client to manage multiple servers.
- Display Name (Code): The name passed to FastMCP("Personal Library Manager") in server.py is what appears in the UI of apps like Claude Desktop.
- Tool/Resource Names: These (e.g., add_book, library://...) must match exactly between server.py and main.py.
You don't need to use these names in your prompts! The AI assistant automatically discovers all available tools and resources once the server is connected.