
LiuRen MCP Server
Enables traditional Chinese divination calculations through Liu Ren (六壬) chart generation. Provides tools for calculating Liu Ren divination charts based on specified dates and times, along with current time retrieval functionality.
Liu Ren (六壬) MCP Server
A Liu Ren chart-casting MCP server built on the kentang2017/kinliuren project.
Refactored with reference to the MCPQimenJson project, using a simple architecture and the standard MCP protocol.
Features
- ✅ Standard MCP (Model Context Protocol) server
- ✅ Communicates over the JSON-RPC 2.0 protocol
- ✅ Provides a Liu Ren chart calculation tool
- ✅ Provides a current-time retrieval tool
- ✅ Calls the kinliuren library directly and returns its raw output
- ✅ Simple architecture with no extra wrapper layers
Installing Dependencies
pip install -r requirements.txt
MCP Tools
1. get_current_time
Returns the current system time with detailed time information.
Input parameters: none
Output: detailed JSON describing the current time
2. calculate_liuren
Casts the Liu Ren chart for the specified time and returns the complete chart data.
Input parameters:
- datetime_str: a time string in the format YYYY-MM-DD HH:MM:SS
Output: a JSON result containing the complete Liu Ren chart data
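As a small illustration of the expected input format (this helper is not part of the server; it simply shows how a caller might validate the string before sending it), a minimal sketch:

from datetime import datetime

def parse_datetime_str(datetime_str: str) -> datetime:
    """Parse the 'YYYY-MM-DD HH:MM:SS' format expected by calculate_liuren."""
    return datetime.strptime(datetime_str, "%Y-%m-%d %H:%M:%S")

# A malformed string raises ValueError before it ever reaches the server.
parse_datetime_str("2024-01-15 14:30:00")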
Using It as an MCP Server
The tool is designed as an MCP server that communicates via JSON-RPC over standard input and output:
// Initialization request
{"jsonrpc": "2.0", "id": 1, "method": "initialize", "params": {}}
// Tool list request
{"jsonrpc": "2.0", "id": 2, "method": "tools/list", "params": {}}
// Tool call request
{"jsonrpc": "2.0", "id": 3, "method": "tools/call", "params": {
"name": "calculate_liuren",
"arguments": {"datetime_str": "2024-01-15 14:30:00"}
}}
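The following standalone Python sketch shows one way a client could drive the server with the requests above: it launches liuren_mcp.py as a subprocess and exchanges newline-delimited JSON-RPC messages over stdio. The newline framing and the script path are assumptions here; adjust them to match the actual server.

import json
import subprocess

# Hypothetical client: assumes liuren_mcp.py reads one JSON-RPC request per line on stdin
# and writes one response per line on stdout.
proc = subprocess.Popen(
    ["python", "liuren_mcp.py"],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)

def rpc(request):
    """Send one JSON-RPC request and read one line of response."""
    proc.stdin.write(json.dumps(request, ensure_ascii=False) + "\n")
    proc.stdin.flush()
    return json.loads(proc.stdout.readline())

print(rpc({"jsonrpc": "2.0", "id": 1, "method": "initialize", "params": {}}))
print(rpc({"jsonrpc": "2.0", "id": 2, "method": "tools/list", "params": {}}))
print(rpc({"jsonrpc": "2.0", "id": 3, "method": "tools/call", "params": {
    "name": "calculate_liuren",
    "arguments": {"datetime_str": "2024-01-15 14:30:00"},
}}))
print(rpc({"jsonrpc": "2.0", "id": 4, "method": "tools/call", "params": {
    "name": "get_current_time",
    "arguments": {},
}}))

proc.stdin.close()
proc.wait()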
Output Format
Liu Ren chart result
The tool returns the complete Liu Ren chart as JSON:
{
"success": true,
"data": {
"year": 2024,
"month": 1,
"day": 15,
"hour": 14,
"minute": 30,
"solar_term": "小寒",
"lunar_month": "腊",
"day_ganzhi": "癸丑",
"hour_ganzhi": "庚申",
"liuren_data": {
"格局": ["比用", "知一"],
"日馬": "寅",
"三傳": {
"初傳": ["戌", "合", "財", "丙"],
"中傳": ["巳", "陰", "子", "癸"],
"末傳": ["子", "龍", "父", "戊"]
},
"四課": {
"四課": ["戌卯", "合"],
"三課": ["卯申", "常"],
"二課": ["辰酉", "玄"],
"一課": ["酉甲", "雀"]
},
"天地盤": {
"天盤": ["午", "未", "申", "酉", "戌", "亥", "子", "丑", "寅", "卯", "辰", "巳"],
"地盤": ["亥", "子", "丑", "寅", "卯", "辰", "巳", "午", "未", "申", "酉", "戌"],
"天將": ["后", "貴", "蛇", "雀", "合", "勾", "龍", "空", "虎", "常", "玄", "陰"]
}
// ... additional Liu Ren chart data
}
}
}
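Assuming a result shaped like the example above, a client can read the main sections straight out of the parsed JSON. The sketch below only demonstrates that field access; it is not part of the server.

import json

def summarize_liuren(result_json: str) -> None:
    """Print the headline parts of a calculate_liuren result shaped like the example above."""
    result = json.loads(result_json)
    if not result.get("success"):
        raise RuntimeError("calculation failed")
    liuren = result["data"]["liuren_data"]
    print("格局:", liuren["格局"])
    print("三傳:", liuren["三傳"])
    print("四課:", liuren["四課"])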
Time retrieval result
{
"success": true,
"data": {
"year": 2024,
"month": 1,
"day": 15,
"hour": 14,
"minute": 30,
"second": 45,
"datetime_str": "2024-01-15 14:30:45",
"weekday": "Monday",
"weekday_cn": "星期一",
"timestamp": 1705323045
}
}
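The fields in this result map directly onto Python's standard datetime API. A minimal sketch of how such a payload could be assembled from the local clock (this mirrors the example above, not the server's actual code):

import json
from datetime import datetime

WEEKDAYS_CN = ["星期一", "星期二", "星期三", "星期四", "星期五", "星期六", "星期日"]

def current_time_payload() -> dict:
    """Build a get_current_time-style result from the local clock."""
    now = datetime.now()
    return {
        "success": True,
        "data": {
            "year": now.year,
            "month": now.month,
            "day": now.day,
            "hour": now.hour,
            "minute": now.minute,
            "second": now.second,
            "datetime_str": now.strftime("%Y-%m-%d %H:%M:%S"),
            "weekday": now.strftime("%A"),          # English weekday name (locale-dependent)
            "weekday_cn": WEEKDAYS_CN[now.weekday()],  # Monday == index 0
            "timestamp": int(now.timestamp()),
        },
    }

print(json.dumps(current_time_payload(), ensure_ascii=False, indent=2))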
Project Structure
MCPliuren/
├── liuren_mcp.py # main Liu Ren MCP server
├── config.py # configuration and helper functions
├── requirements.txt # project dependencies
└── README.md # documentation
Dependencies
- kinliuren: Liu Ren chart calculation library
- sxtwl: traditional Chinese astronomical calendar library
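A requirements.txt matching this list can be as small as the following (versions are left unpinned here; pin them as your environment requires):

kinliuren
sxtwl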
Technical Implementation
- MCP protocol support: implements the full JSON-RPC 2.0 protocol
- Simple architecture: calls the kinliuren library directly, with no complex wrappers
- Separated configuration: configuration logic lives in a standalone config.py
- Error handling: thorough exception handling and error responses
- Standardized output: returns structured JSON results
MCP Protocol Features
- initialize: server initialization
- tools/list: list the available tools
- tools/call: invoke a specific tool
- Error handling: standard JSON-RPC error responses
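To make the method list above concrete, here is a simplified stdio dispatch loop. It is a sketch under assumptions, not the project's actual implementation: the server name, response shapes, and newline framing are illustrative, the tool bodies are stubbed out, and only the JSON-RPC method names and standard error codes come from the spec.

import json
import sys

def call_tool(name: str, arguments: dict) -> dict:
    """Placeholder: the real server would dispatch to get_current_time / calculate_liuren here."""
    raise NotImplementedError(f"tool {name!r} is not wired up in this sketch")

def handle(request: dict) -> dict:
    """Route one JSON-RPC 2.0 request and build the matching response."""
    method = request.get("method")
    rid = request.get("id")
    try:
        if method == "initialize":
            result = {"serverInfo": {"name": "liuren-mcp-sketch", "version": "0.1"}}
        elif method == "tools/list":
            result = {"tools": [{"name": "get_current_time"}, {"name": "calculate_liuren"}]}
        elif method == "tools/call":
            params = request.get("params", {})
            result = call_tool(params["name"], params.get("arguments", {}))
        else:
            return {"jsonrpc": "2.0", "id": rid,
                    "error": {"code": -32601, "message": "Method not found: %s" % method}}
        return {"jsonrpc": "2.0", "id": rid, "result": result}
    except Exception as exc:
        # Standard JSON-RPC error response for anything that fails inside a handler.
        return {"jsonrpc": "2.0", "id": rid, "error": {"code": -32603, "message": str(exc)}}

def main() -> None:
    # One JSON-RPC message per line on stdin; one response per line on stdout.
    for line in sys.stdin:
        if line.strip():
            print(json.dumps(handle(json.loads(line)), ensure_ascii=False), flush=True)

if __name__ == "__main__":
    main()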
Notes
- Requires Python >= 3.6
- A network connection is needed to install the dependency packages
- Time calculations are based on the traditional Chinese calendar
- When run as an MCP server, it communicates over standard input and output