<div align="center">

# Antigravity GLM MCP

**Gemini (Antigravity) ↔ GLM-4.5 model bridge MCP server**

Delegate complex coding tasks to GLM AI and automate them with 25 powerful tools covering coding, file management, and system operations, backed by a security-hardened architecture with sandboxing, persistent memory, and automatic file backups.

[Docs](#documentation) · [Quick Start](#quick-start-5-minutes) · [Tools](#full-tool-list-25) · [Security](#security-architecture)

</div>
## Why This Project?

| Problem | Solution |
|---|---|
| Complex Docker setup | Zero-Docker: start instantly with direct HTTPS calls |
| Risk of API key leakage | Security hardening: environment-variable filtering and sandboxing |
| No way to recover from file mistakes | Automatic backups: versioning on every edit and delete |
| Context lost between sessions | Persistent memory: JSON-based long-term memory store |
| Dangerous shell command execution | Whitelist: only approved commands are allowed |
## Architecture

```mermaid
flowchart LR
    subgraph "Gemini Desktop"
        A[Antigravity Agent]
    end
    subgraph "MCP Server"
        B[antigravity_glm_mcp]
        C[25 Tools]
        D[Security Layer]
    end
    subgraph "Cloud APIs"
        E[GLM-4.5 API]
        F[Web Search]
    end
    subgraph "Local Storage"
        G[Files & Git]
        H[Memory DB]
        I[Backups]
    end

    A <--> |MCP Protocol| B
    B --> C
    C --> D
    D --> E
    D --> F
    D --> G
    D --> H
    D --> I
```
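For orientation, the sketch below shows how an MCP server of this shape can expose a tool over stdio using the official `mcp` Python SDK (FastMCP). It is a minimal, illustrative stand-in, not the project's actual `src/server.py`; the tool body and path guard are assumptions.

```python
# Minimal sketch of an MCP server exposing one tool over stdio.
# Assumption: the official "mcp" Python SDK is installed; this is NOT
# the project's actual server.py.
import os
from pathlib import Path

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("antigravity_glm_mcp")

@mcp.tool()
def glm_file_read(path: str, encoding: str = "utf-8") -> str:
    """Read a file inside the project sandbox and return its contents."""
    root = Path(os.environ.get("PROJECT_ROOT", ".")).resolve()
    target = (root / path).resolve()
    if not target.is_relative_to(root):  # crude path-traversal guard
        raise ValueError("path escapes PROJECT_ROOT")
    return target.read_text(encoding=encoding)

if __name__ == "__main__":
    mcp.run()  # stdio transport, as used by Gemini's MCP client
```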
## Documentation

| Document | Description |
|---|---|
| [Architecture](docs/ARCHITECTURE.md) | System design, security layers, and communication flow in detail |
| [Tool Reference](docs/TOOLS.md) | Parameter and response specs for all 25 tools |
| [Quickstart](docs/QUICKSTART.md) | 5-minute install guide and first-use examples |
## Quick Start (5 Minutes)

### Prerequisites

- Python 3.11+ (virtual environment recommended)
- A Zhipu AI API key or a compatible endpoint

### Installation

```bash
# 1. Clone the repository
git clone https://github.com/coreline-ai/antigravity_glm_mcp.git
cd antigravity_glm_mcp

# 2. Create and activate a virtual environment
python3.11 -m venv .venv
source .venv/bin/activate  # Windows: .venv\Scripts\activate

# 3. Run the automated installer (recommended)
python scripts/install.py
```
<details>
<summary><b>Manual install (alternative)</b></summary>

```bash
# Install dependencies
pip install -r requirements.txt
```

Then add the server to your MCP settings file (e.g. `~/.gemini/settings.json`):

```json
{
  "mcpServers": {
    "antigravity_glm_mcp": {
      "command": "/path/to/.venv/bin/python",
      "args": ["/path/to/antigravity_glm_mcp/src/server.py"],
      "env": {
        "PROJECT_ROOT": "/your/workspace",
        "ZHIPU_API_KEY": "your-api-key",
        "GLM_MODEL": "GLM-4.5",
        "GLM_BASE_URL": "https://api.z.ai/api/coding/paas/v4",
        "PYTHONPATH": "/path/to/antigravity_glm_mcp"
      }
    }
  }
}
```

</details>
## Full Tool List (25)

### Intelligence Delegation

| Tool | Description | Key Parameters |
|---|---|---|
| `glm_cmd` | Delegate a complex task to GLM | task_description, context |
| `glm_bypass` | Send a raw prompt directly | prompt |
| `glm_image_analyze` | Image analysis (vision) | image_path, prompt |
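As a rough illustration of what delegation involves under the hood, the sketch below forwards a task to a GLM chat-completions endpoint. The endpoint path, payload shape, and helper name are assumptions based on the OpenAI-compatible style of the configured base URLs; the project's actual client lives in `src/core/glm_client.py`.

```python
# Illustrative sketch only: forwarding a delegated task to a GLM endpoint.
# Assumptions: an OpenAI-compatible /chat/completions route under GLM_BASE_URL
# and the "requests" package; not the project's actual glm_client.py.
import os
import requests

def delegate_to_glm(task_description: str, context: str = "") -> str:
    base_url = os.environ.get("GLM_BASE_URL", "https://open.bigmodel.cn/api/paas/v4")
    resp = requests.post(
        f"{base_url}/chat/completions",
        headers={"Authorization": f"Bearer {os.environ['ZHIPU_API_KEY']}"},
        json={
            "model": os.environ.get("GLM_MODEL", "glm-4-plus"),
            "messages": [
                {"role": "system", "content": "You are a coding assistant."},
                {"role": "user", "content": f"{task_description}\n\nContext:\n{context}"},
            ],
        },
        timeout=float(os.environ.get("GLM_TIMEOUT", "120")),
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]
```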
### File System

| Tool | Description | Key Parameters |
|---|---|---|
| `glm_file_read` | Read file contents | path, encoding |
| `glm_file_create` | Create a new file | path, content, overwrite |
| `glm_file_edit` | Edit by string replacement | path, old_string, new_string |
| `glm_file_delete` | Delete a file (backup kept) | path |
| `glm_file_rollback` | Restore a previous version | path, version |
| `glm_dir_list` | List a directory | path, recursive |
| `glm_grep` | Regex search across files | pattern, path, case_sensitive |
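Destructive file tools are paired with the automatic backup system, so every edit or delete leaves a restorable version behind. The snippet below is a minimal sketch of that idea, assuming timestamped copies under a `data/backups/` directory; the real versioning logic lives in `src/core/backup.py` and may differ.

```python
# Minimal sketch of pre-edit/pre-delete backups with restorable versions.
# Assumption: backups are timestamped copies under data/backups/; the actual
# scheme is implemented in src/core/backup.py.
import shutil
import time
from pathlib import Path

BACKUP_DIR = Path("data/backups")  # hypothetical location

def backup(path: Path) -> Path:
    """Copy the file aside before a destructive operation."""
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    version = time.strftime("%Y%m%d-%H%M%S")
    dest = BACKUP_DIR / f"{path.name}.{version}"
    shutil.copy2(path, dest)
    return dest

def rollback(path: Path, backup_file: Path) -> None:
    """Restore a previously backed-up version over the current file."""
    shutil.copy2(backup_file, path)
```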
### Code Execution

| Tool | Description | Key Parameters |
|---|---|---|
| `glm_code_run` | Run Python in a sandbox | code, timeout |
| `glm_shell_exec` | Run whitelisted shell commands | command, cwd |
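The core of sandboxed execution is running code in a separate process with a timeout and a scrubbed environment, so secrets such as `ZHIPU_API_KEY` never reach user code. Below is a minimal sketch of that pattern; it is not the project's actual `src/tools/code_ops.py`.

```python
# Sketch of sandboxed code execution: subprocess + timeout + filtered env.
# Hypothetical helper; the real implementation is src/tools/code_ops.py.
import os
import subprocess
import sys

SENSITIVE = {"ZHIPU_API_KEY"}  # secrets that must never leak into user code

def run_python_sandboxed(code: str, timeout: int = 30) -> str:
    env = {k: v for k, v in os.environ.items() if k not in SENSITIVE}
    result = subprocess.run(
        [sys.executable, "-c", code],
        capture_output=True,
        text=True,
        timeout=timeout,          # raises TimeoutExpired on runaway code
        env=env,
        cwd=os.environ.get("PROJECT_ROOT", "."),
    )
    return result.stdout if result.returncode == 0 else result.stderr
```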
### Git Collaboration

| Tool | Description | Key Parameters |
|---|---|---|
| `glm_git_status` | Show repository status | repo_path |
| `glm_git_commit` | Commit changes | message, add_all |
| `glm_git_log` | Show commit history | n, oneline |
| `glm_git_diff` | Compare changes | stat_only, commit |
### Network

| Tool | Description | Key Parameters |
|---|---|---|
| `glm_http_request` | HTTP request (SSRF-protected) | url, method, body |
| `glm_web_search` | DuckDuckGo web search | query, max_results |
### Memory & Data

| Tool | Description | Key Parameters |
|---|---|---|
| `glm_memory_save` | Save to long-term memory | key, value, category |
| `glm_memory_get` | Retrieve a memory | key |
| `glm_memory_list` | List all memories | category, limit |
| `glm_memory_delete` | Delete a memory | key |
| `glm_db_query` | Run an SQLite query | query, db_path, read_only |
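The persistent memory tools boil down to a key-value store serialized as JSON under `data/memory/`, which is what lets context survive across sessions. The sketch below shows the general shape under that assumption; the file name is hypothetical and the actual layout in `src/tools/memory_ops.py` may differ.

```python
# Sketch of a JSON-backed long-term memory store (key/value + category).
# Assumed layout: one JSON file under data/memory/; see src/tools/memory_ops.py
# for the real implementation.
import json
from pathlib import Path

MEMORY_FILE = Path("data/memory/memory.json")  # hypothetical file name

def _load() -> dict:
    return json.loads(MEMORY_FILE.read_text()) if MEMORY_FILE.exists() else {}

def memory_save(key: str, value: str, category: str = "general") -> None:
    data = _load()
    data[key] = {"value": value, "category": category}
    MEMORY_FILE.parent.mkdir(parents=True, exist_ok=True)
    MEMORY_FILE.write_text(json.dumps(data, ensure_ascii=False, indent=2))

def memory_get(key: str) -> str | None:
    entry = _load().get(key)
    return entry["value"] if entry else None
```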
### Management & Logging

| Tool | Description | Key Parameters |
|---|---|---|
| `glm_schedule_task` | Schedule tasks (cron) | action, cron, command |
| `glm_action_log` | Query the agent action log | limit, action_filter |
## Security Architecture

This project implements four layers of security:

```mermaid
flowchart TB
    subgraph "Layer 1: Sandbox"
        A[PROJECT_ROOT path validation]
    end
    subgraph "Layer 2: Network"
        B[SSRF protection - block internal IPs]
        C[DNS rebinding defense]
    end
    subgraph "Layer 3: Execution"
        D[RCE protection - env var filtering]
        E[Shell whitelist]
    end
    subgraph "Layer 4: Data"
        F[Automatic backups]
        G[API key isolation]
    end

    A --> B --> D --> F
```
| Protected Asset | Threat | Mitigation |
|---|---|---|
| File system | Path traversal | Block access outside PROJECT_ROOT |
| Network | SSRF attacks | Filter private-range IPs (10.x.x.x, 172.16-31.x.x, 192.168.x.x) |
| Code execution | RCE / key leakage | Block sensitive env vars such as ZHIPU_API_KEY |
| Shell commands | System destruction | Block dangerous commands such as rm -rf and sudo |
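A typical way to enforce the network rule above is to resolve the target host before connecting and reject private, loopback, and link-local addresses; resolving first also blunts DNS-rebinding tricks. The following is a hedged sketch of that check, not the project's actual `src/tools/http_ops.py`.

```python
# Sketch of an SSRF guard: resolve the hostname, then reject private,
# loopback, link-local, and reserved addresses before making the request.
# Illustrative only; the real guard lives in src/tools/http_ops.py.
import ipaddress
import socket
from urllib.parse import urlparse

def assert_public_url(url: str) -> None:
    host = urlparse(url).hostname
    if host is None:
        raise ValueError("URL has no hostname")
    for info in socket.getaddrinfo(host, None):
        ip = ipaddress.ip_address(info[4][0])
        if ip.is_private or ip.is_loopback or ip.is_link_local or ip.is_reserved:
            raise ValueError(f"blocked internal address: {ip}")

# assert_public_url("http://169.254.169.254/latest/meta-data")  # would raise
```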
> [!WARNING]
> `glm_shell_exec` only runs safe commands registered on the whitelist. Requests for `rm`, `sudo`, `chmod 777`, and similar commands are blocked.
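In practice the whitelist check can be as simple as comparing the command's first token against an allow-list before execution. The sketch below illustrates that idea with an assumed example allow-list; the actual set of permitted commands is defined in `src/tools/shell_ops.py`.

```python
# Sketch of a shell-command whitelist: only the first token of the command
# is matched against an allow-list. The real list is in src/tools/shell_ops.py.
import shlex
import subprocess

ALLOWED = {"ls", "cat", "grep", "git", "python"}  # assumed example allow-list

def shell_exec(command: str, cwd: str = ".") -> str:
    tokens = shlex.split(command)
    if not tokens or tokens[0] not in ALLOWED:
        raise PermissionError(f"command not whitelisted: {tokens[:1]}")
    result = subprocess.run(tokens, capture_output=True, text=True, cwd=cwd)
    return result.stdout or result.stderr
```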
## Environment Variables

| Variable | Required | Description | Default |
|---|---|---|---|
| `ZHIPU_API_KEY` | Yes | GLM API authentication key | - |
| `PROJECT_ROOT` | No | Working directory for all tools | Current directory |
| `GLM_MODEL` | No | Model to use | glm-4-plus |
| `GLM_BASE_URL` | No | API endpoint | https://open.bigmodel.cn/api/paas/v4 |
| `GLM_TIMEOUT` | No | Request timeout (seconds) | 120 |
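These variables are read at startup, with missing optional values falling back to the defaults listed above. The snippet below is a small sketch of how that can look; it is a hypothetical stand-in for `src/core/config.py`, not its actual contents.

```python
# Sketch of startup configuration from environment variables, mirroring the
# defaults in the table above. Hypothetical stand-in for src/core/config.py.
import os
from dataclasses import dataclass, field

@dataclass
class Config:
    zhipu_api_key: str = field(default_factory=lambda: os.environ["ZHIPU_API_KEY"])
    project_root: str = field(default_factory=lambda: os.getenv("PROJECT_ROOT", os.getcwd()))
    glm_model: str = field(default_factory=lambda: os.getenv("GLM_MODEL", "glm-4-plus"))
    glm_base_url: str = field(
        default_factory=lambda: os.getenv("GLM_BASE_URL", "https://open.bigmodel.cn/api/paas/v4")
    )
    glm_timeout: float = field(default_factory=lambda: float(os.getenv("GLM_TIMEOUT", "120")))

config = Config()  # raises KeyError early if ZHIPU_API_KEY is not set
```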
## Project Structure

```
antigravity_glm_mcp/
├── README.md                  # This document
├── requirements.txt           # Python dependencies
├── pyproject.toml             # Package metadata
│
├── src/                       # Source code
│   ├── server.py              # MCP server entry point
│   ├── models.py              # Shared models (ToolResponse, etc.)
│   │
│   ├── core/                  # Core infrastructure
│   │   ├── config.py          # Configuration manager
│   │   ├── glm_client.py      # GLM API client
│   │   ├── sandbox.py         # Path security validation
│   │   └── backup.py          # Automatic backup system
│   │
│   └── tools/                 # 25 tool implementations
│       ├── glm_cmd.py         # Intelligence delegation (cmd, bypass)
│       ├── file_ops.py        # File CRUD
│       ├── dir_ops.py         # Directory operations
│       ├── grep_ops.py        # File search
│       ├── code_ops.py        # Code execution
│       ├── shell_ops.py       # Shell execution
│       ├── git_ops.py         # Git collaboration
│       ├── http_ops.py        # HTTP requests
│       ├── web_ops.py         # Web search
│       ├── db_ops.py          # DB queries
│       ├── memory_ops.py      # Memory management
│       ├── image_ops.py       # Image analysis
│       ├── schedule_ops.py    # Task scheduling
│       └── reporting.py       # Log management
│
├── scripts/                   # Utility scripts
│   └── install.py             # Automated install script
│
├── docs/                      # Documentation
│   ├── ARCHITECTURE.md        # Architecture details
│   ├── TOOLS.md               # Tool reference
│   └── QUICKSTART.md          # Quick start guide
│
├── tests/                     # Test code
│   ├── local_tools_test.py    # Local tool integration tests
│   └── simple_test.py         # Intelligence tool tests
│
└── data/                      # Runtime data (ignored by Git)
    ├── memory/                # Persistent memory store
    └── action_logs.jsonl      # Agent action log
```
## Testing

```bash
# After activating the virtual environment:

# 1. Create a .env file (recommended)
cp .env.sample .env
# Open .env and set ZHIPU_API_KEY.

# 2. Local tool tests (no API key required)
./.venv/bin/python tests/local_tools_test.py

# 3. Intelligence tool tests (API key required)
# If you are not using a .env file, export the key directly:
export ZHIPU_API_KEY="your-key"
./.venv/bin/python tests/simple_test.py
```
## License

This project is distributed under the MIT License. You are free to use, modify, and distribute it.

<div align="center">
Made with ❤️ for Gemini × GLM Collaboration
</div>