
# Canteen MCP

A Model Context Protocol (MCP) server that provides access to the canteen's lunch menu via a simple API integration.

## Description

Canteen MCP is a FastMCP-based server that exposes a tool for retrieving daily lunch menus from the canteen. It connects to a menu API and provides a structured interface for querying menu data for specific dates.

## Features

- Get the lunch menu for any specific date
- httpStream-based transport for real-time communication
- Environment-based configuration
- Type-safe API with input validation
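As a sketch of the data flow, the server's call to the menu API might look like the following. This is illustrative only: the `buildMenuUrl`/`fetchLunchMenu` helper names and the `date` query parameter are assumptions, not the project's actual code.

```typescript
// Hypothetical helpers; the `date` query-parameter name is an assumption.
function buildMenuUrl(apiUrl: string, date: string): string {
  const url = new URL(apiUrl);
  url.searchParams.set("date", date);
  return url.toString();
}

// Fetch the menu and hand the raw JSON string back to the MCP client.
async function fetchLunchMenu(apiUrl: string, date: string): Promise<string> {
  const res = await fetch(buildMenuUrl(apiUrl, date));
  if (!res.ok) throw new Error(`Menu API returned ${res.status}`);
  return res.text();
}
```

The tool returns the response body as a string, so the MCP client decides how to parse or display the menu data.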
## Installation

```bash
npm install
```
## Configuration

Copy the example environment file and update it with your values:

```bash
cp .env.example .env
```

### Environment Variables

| Variable | Description | Example |
| --- | --- | --- |
| `API_URL` | URL of the lunch menu API | `https://lunch-menu-ai.vercel.app/api/v1/menu` |
| `PORT` | Port the MCP server listens on | `8080` |
| `ENDPOINT` | HTTP endpoint path the MCP server is served at | `/endpoint` |
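For illustration, resolving these variables with fallbacks could look like this sketch. The `loadConfig` helper and the default values (taken from the example column) are assumptions, not the project's actual code.

```typescript
// Hypothetical config loader for the variables above.
// Defaults mirror the example column and are illustrative only.
function loadConfig(env: Record<string, string | undefined>) {
  return {
    apiUrl: env.API_URL ?? "https://lunch-menu-ai.vercel.app/api/v1/menu",
    port: Number(env.PORT ?? 8080),
    endpoint: env.ENDPOINT ?? "/endpoint",
  };
}

console.log(loadConfig({ PORT: "3000", ENDPOINT: "/http" }).port); // 3000
```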
## Usage

Start the server:

```bash
npm start
```
### Available Tools

#### get_lunch_menu

Retrieves the lunch menu for a specific date.

- **Parameters:** `date` – string in `YYYY-MM-DD` format
- **Returns:** JSON string containing the menu data
- **Example:**

```js
const result = await tool.execute({ date: "2024-10-05" });
```
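Since the tool validates its input, a minimal sketch of how the `date` parameter could be checked is shown below. The `isValidMenuDate` helper is hypothetical and not part of the project; it illustrates rejecting both malformed strings and impossible calendar dates.

```typescript
// Hypothetical validator for the `date` parameter (YYYY-MM-DD).
function isValidMenuDate(date: string): boolean {
  if (!/^\d{4}-\d{2}-\d{2}$/.test(date)) return false;
  const parsed = new Date(`${date}T00:00:00Z`);
  // Round-trip check catches dates like 2024-02-30, which are either
  // rejected outright or rolled over by the Date constructor.
  return !Number.isNaN(parsed.getTime()) &&
         parsed.toISOString().slice(0, 10) === date;
}

console.log(isValidMenuDate("2024-10-05")); // true
console.log(isValidMenuDate("2024-02-30")); // false
```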
## Development

### Prerequisites

- Node.js >= 18
- npm

### Running in Development Mode

```bash
npm run dev
```
## Docker

### Building the Image

```bash
docker build -t canteen-mcp .
```

### Running the Container

```bash
docker run -d \
  -p 8080:3000 \
  -e API_URL=your_api_url \
  -e PORT=3000 \
  -e ENDPOINT=/http \
  --name canteen-mcp \
  canteen-mcp
```

Note that the host port (`8080`) is mapped to the container port set by `PORT` (`3000`).
### Using GitHub Container Registry

Pull the latest image:

```bash
docker pull ghcr.io/[your-username]/canteen-mcp:latest
```
## Deployment

### Deploying to Hetzner

1. SSH into your Hetzner server:

   ```bash
   ssh root@your-server-ip
   ```

2. Install Docker if it is not already installed:

   ```bash
   curl -fsSL https://get.docker.com | sh
   ```

3. Create a `docker-compose.yml` file:

   ```yaml
   version: '3.8'
   services:
     canteen-mcp:
       image: ghcr.io/c0dr/canteen-mcp:latest
       restart: always
       ports:
         - "8080:3000"
       environment:
         - API_URL=your_api_url
         - PORT=3000
         - ENDPOINT=/http
   ```

4. Start the service:

   ```bash
   docker-compose up -d
   ```
## License

This project is licensed under the MIT License - see the LICENSE file for details.

Based on https://github.com/punkpeye/fastmcp-boilerplate.