Kimi Coding MCP
Wraps the Kimi Coding Search and Fetch APIs into MCP tools for web searching and content retrieval. It enables LLMs to perform targeted searches and crawl web pages using standardized interfaces.
Kimi Coding MCP is a service that wraps the Kimi Coding Search / Fetch APIs into MCP tools. It is designed to be deployed on a server and consumed from your client as a remote MCP.
It provides two tools:
- kimi_search: calls POST https://api.kimi.com/coding/v1/search
- kimi_fetch: calls POST https://api.kimi.com/coding/v1/fetch
The core implementation is server.py; the container entry point is the Dockerfile.
1. Quick start
Install dependencies:
pip install -r requirements.txt
Start in remote mode:
python server.py --transport streamable-http --host 0.0.0.0 --port 8000
Default MCP address:
http://127.0.0.1:8000/mcp
If you deploy behind a domain with HTTPS, the public address is usually:
https://your-domain.com/mcp
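Under the hood, the streamable HTTP transport carries JSON-RPC 2.0 messages. As a rough illustration of what a client POSTs to the /mcp endpoint on first contact, here is a sketch of an initialize message; the field layout follows the MCP specification, and the protocolVersion and clientInfo values are placeholders, not anything shipped by this repo:

```python
import json

def build_initialize_request(request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 initialize message for an MCP endpoint.

    Field names follow the MCP spec; the clientInfo values are
    illustrative placeholders.
    """
    payload = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2025-03-26",
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.1.0"},
        },
    }
    return json.dumps(payload)
```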
2. Tools
kimi_search
Input parameters:
{
"text_query": "食贫道 最新视频 2025 2026",
"limit": 10,
"enable_page_crawling": false,
"timeout_seconds": 30
}
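These parameters map directly onto the search endpoint's request body. A sketch of how a caller might assemble that body, with the defaults shown above (the helper name and the validation bounds are invented here, not part of server.py):

```python
def build_search_body(text_query, limit=10, enable_page_crawling=False,
                      timeout_seconds=30):
    """Assemble the JSON body for POST https://api.kimi.com/coding/v1/search.

    Defaults mirror the example input above; validation is illustrative.
    """
    if not text_query:
        raise ValueError("text_query is required")
    if limit < 1:  # lower bound only; the API's real limits are undocumented here
        raise ValueError("limit must be positive")
    return {
        "text_query": text_query,
        "limit": limit,
        "enable_page_crawling": enable_page_crawling,
        "timeout_seconds": timeout_seconds,
    }
```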
kimi_fetch
Input parameters:
{
"url": "https://search.bilibili.com/all?keyword=食贫道"
}
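A sketch of preparing the corresponding HTTP request with the standard library. The Accept: text/markdown header matches the default behaviour described in section 7; the header composition and helper name here are illustrative, not the repo's actual code:

```python
import json
import urllib.request

def build_fetch_request(url: str, api_key: str) -> urllib.request.Request:
    """Prepare (but do not send) POST https://api.kimi.com/coding/v1/fetch."""
    body = json.dumps({"url": url}).encode("utf-8")
    return urllib.request.Request(
        "https://api.kimi.com/coding/v1/fetch",
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
            "Accept": "text/markdown",  # kimi_fetch returns text by default
        },
    )
```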
3. Docker deployment
Option A: no proxy, expose the port directly
This option suits intranet use, temporary testing, or setups where another gateway already handles external traffic.
Build the image:
docker build -t kimi-coding-mcp .
Single-tenant mode:
docker run -d \
--name kimi-coding-mcp \
-p 8000:8000 \
-e KIMI_API_KEY=sk-kimi-your-key \
kimi-coding-mcp
Multi-tenant mode (no server-side key; each client sends its own):
docker run -d \
--name kimi-coding-mcp \
-p 8000:8000 \
kimi-coding-mcp
After deployment, the public MCP address is usually:
http://your-server-ip:8000/mcp
Option B: behind a reverse proxy, accessed over an HTTPS domain
This option suits public-facing deployments. The repo already ships a compose.yaml and a Caddyfile.
- Copy the environment variable template:
cp .env.production.example .env.production
- Edit .env.production
Single-tenant mode:
KIMI_API_KEY=sk-kimi-your-key
APP_DOMAIN=kimi-mcp.example.com
Multi-tenant mode (leave the key empty):
KIMI_API_KEY=
APP_DOMAIN=kimi-mcp.example.com
- Make sure the domain already resolves to the server's public IP, and that ports 80 and 443 are open.
- Start:
docker compose up -d --build
After deployment, the public MCP address is usually:
https://kimi-mcp.example.com/mcp
4. Client configuration
Multi-tenant mode is recommended, so that each client carries its own Kimi key.
Multi-tenant mode:
{
"mcpServers": {
"kimi-coding-remote": {
"type": "streamable_http",
"url": "https://kimi-mcp.example.com/mcp",
"headers": {
"X-Kimi-Api-Key": "sk-kimi-替换成你的key"
}
}
}
}
Single-tenant mode:
{
"mcpServers": {
"kimi-coding-remote": {
"type": "streamable_http",
"url": "https://kimi-mcp.example.com/mcp"
}
}
}
In remote mode, the API key is resolved with this precedence:
- X-Kimi-Api-Key request header
- Authorization: Bearer sk-... request header
- server-side environment variable KIMI_API_KEY
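That precedence can be expressed as a small resolver. This is a sketch, not the actual server.py code; it assumes header names have been lowercased by the framework before lookup:

```python
import os

def resolve_api_key(headers):
    """Resolve the Kimi API key in the documented order of precedence.

    `headers` is assumed to be a dict with lowercased header names.
    """
    # 1. X-Kimi-Api-Key request header
    key = headers.get("x-kimi-api-key")
    if key:
        return key
    # 2. Authorization: Bearer sk-... request header
    auth = headers.get("authorization", "")
    if auth.startswith("Bearer "):
        return auth[len("Bearer "):]
    # 3. Server-side environment variable
    return os.environ.get("KIMI_API_KEY")
```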
5. Environment variables
Common environment variables:
KIMI_API_KEY=sk-kimi-your-key
KIMI_BASE_URL=https://api.kimi.com/coding/v1
KIMI_USER_AGENT=KimiCLI/1.24.0
KIMI_MSH_PLATFORM=kimi_cli
KIMI_MSH_VERSION=1.24.0
KIMI_DEVICE_NAME=YOUR-PC
KIMI_DEVICE_MODEL=Windows 11 AMD64
KIMI_OS_VERSION=10.0.26200
KIMI_DEVICE_ID=custom-device-id
KIMI_LOG_DIR=logs
KIMI_LOG_LEVEL=INFO
KIMI_LOG_MAX_BYTES=5242880
KIMI_LOG_BACKUP_COUNT=3
KIMI_LOG_PREVIEW_BYTES=100
MCP_TRANSPORT=streamable-http
MCP_HOST=0.0.0.0
MCP_PORT=8000
MCP_STREAMABLE_HTTP_PATH=/mcp
If KIMI_DEVICE_ID is not set explicitly, the service first tries to read ~/.kimi/device_id; only when that is unavailable does it derive a stable UUID from device_name.
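One standard way to get a "stable UUID derived from device_name" is a name-based UUID. This sketch assumes uuid5 over the DNS namespace, which may well differ from what server.py actually does; only the fallback order is taken from the text above:

```python
import uuid
from pathlib import Path

def resolve_device_id(device_name: str) -> str:
    """Mirror the documented fallback chain for KIMI_DEVICE_ID."""
    # 1. An explicit KIMI_DEVICE_ID env var would win (checked by the caller).
    # 2. Try the shared Kimi device id file.
    device_id_file = Path.home() / ".kimi" / "device_id"
    try:
        stored = device_id_file.read_text(encoding="utf-8").strip()
        if stored:
            return stored
    except OSError:
        pass
    # 3. Derive a stable, name-based UUID: same name -> same id on every run.
    return str(uuid.uuid5(uuid.NAMESPACE_DNS, device_name))
```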
Logs go to logs/server.log by default and are rotated by size. Each entry records:
- the endpoint called
- a preview of the input (at most 100 bytes by default)
- a preview of the response (at most 100 bytes by default)
- status code and elapsed time
For safety, sensitive fields such as api_key, authorization, token, and secret are redacted automatically.
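A minimal sketch of that kind of redaction; the key list comes from the sentence above, while the masking format and recursion are assumptions about how such a filter is typically written:

```python
SENSITIVE_KEYS = {"api_key", "authorization", "token", "secret"}

def redact(record: dict) -> dict:
    """Return a copy of a log record with sensitive values masked."""
    out = {}
    for key, value in record.items():
        if key.lower() in SENSITIVE_KEYS:
            out[key] = "***"  # mask the value, keep the key visible
        elif isinstance(value, dict):
            out[key] = redact(value)  # redact nested structures too
        else:
            out[key] = value
    return out
```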
For local runs, use .env.example as a reference; for the proxied Docker deployment, use .env.production.example.
6. Local debugging
To debug the stdio transport locally, run:
python server.py --transport stdio
To verify the container locally:
docker build -t kimi-coding-mcp .
docker run --rm -p 8000:8000 -e KIMI_API_KEY=sk-kimi-your-key kimi-coding-mcp
7. Notes
kimi_search calls the search endpoint with text_query, limit, enable_page_crawling, and timeout_seconds; if the response is JSON, it is formatted automatically. kimi_fetch calls the fetch endpoint with url and returns text, requesting Accept: text/markdown by default.
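The "formatted automatically if the response is JSON" behaviour can be approximated like this (a sketch, not the repo's actual implementation):

```python
import json

def format_response(body: str) -> str:
    """Pretty-print a response body if it parses as JSON, else pass it through."""
    try:
        parsed = json.loads(body)
    except json.JSONDecodeError:
        return body  # kimi_fetch typically returns markdown, not JSON
    return json.dumps(parsed, indent=2, ensure_ascii=False)
```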