figma-compaction-mcp
Current version: 3.0.0
figma-compaction-mcp is an MCP server for Figma-link workflows. It fetches upstream Figma design context internally, prunes it into compact plain-text context, and returns that reduced result to the calling agent instead of the full upstream payload.
What It Is
This project is for teams that want agents to work from Figma node URLs without pushing raw upstream Figma MCP output into the caller's model context. Whenever the bridge can safely handle a request, only the compacted result reaches the agent.
The intended flow is simple:
- A user gives an agent a Figma node URL.
- The agent calls get_figma_compact_context.
- This server fetches upstream Figma context internally.
- The server compacts the upstream result into a small line-based DSL.
- The agent receives compact implementation context and works from that output.
Why Use It
The main reason to use this server is token reduction without losing implementation-critical facts.
Raw Figma MCP responses can be large enough to consume a meaningful part of the caller model context before implementation even begins. This bridge keeps that upstream payload inside the server whenever possible, compacts it first, and only returns the reduced result to the agent.
- Lower token usage for Figma-link prompts
- Smaller model-context footprint before implementation starts
- Cleaner implementation input for agents
- Less raw upstream noise in caller context
- Traceable output with node ids, typography tokens, asset refs, warnings, and fallback hints
- A built-in fallback path when the bridge cannot safely complete
How It Works
This server sits between your agent and the local Figma Desktop MCP server.
User prompt with Figma link
-> Agent calls get_figma_compact_context
-> figma-compaction-mcp connects to local Figma Desktop MCP
-> get_design_context / get_metadata
-> internal compaction
-> compact plain-text context returned to the agent
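The pipeline above can be sketched in a few lines. All function names and payloads here are illustrative stand-ins, not the server's real API:

```python
# Illustrative sketch of the bridge pipeline; every name below is a
# hypothetical stand-in, not figma-compaction-mcp's actual implementation.

def fetch_upstream_context(figma_url: str) -> dict:
    """Stand-in for the internal call to the local Figma Desktop MCP
    (get_design_context / get_metadata). Returns a large raw payload."""
    return {
        "node": {"id": "4:5100", "name": "Example screen", "type": "FRAME"},
        "children": [{"id": f"4:{i}", "type": "RECTANGLE"} for i in range(500)],
    }

def compact(raw: dict) -> str:
    """Stand-in for internal compaction: keep only implementation-relevant
    facts and emit the line-based DSL."""
    node = raw["node"]
    return f"sum|{node['name']}|{node['type'].lower()}"

def get_figma_compact_context(figma_url: str) -> str:
    """The agent only ever receives the compacted result, never the raw payload."""
    raw = fetch_upstream_context(figma_url)
    return compact(raw)
```

The point of the shape is that the large `fetch_upstream_context` result stays inside the server; only the short `compact` output crosses back to the calling agent.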
The public entrypoint is get_figma_compact_context.
- figma_url: required full Figma node URL
- mode: optional compaction mode, one of minimal, balanced, debug
- task: optional intent hint, one of implement, inspect, summarize
- include_assets: optional, default true
- include_text_specs: optional, default true
- include_trace_ids: optional, default true
- include_metadata: optional, default true
- max_output_chars: optional explicit output budget
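As an illustration, a caller might assemble the tool arguments like this. Parameter names and defaults are the documented ones; the overall payload shape is a plausible sketch, not a verified schema:

```python
# Sketch of a get_figma_compact_context argument payload.
args = {
    "figma_url": "https://www.figma.com/design/FILE_KEY/FILE_NAME?node-id=4-5100&m=dev",
    "mode": "balanced",          # one of: minimal, balanced, debug
    "task": "implement",         # one of: implement, inspect, summarize
    "include_assets": True,      # default true
    "include_text_specs": True,  # default true
    "include_trace_ids": True,   # default true
    "include_metadata": True,    # default true
    # "max_output_chars": 4000,  # only set when you want a hard output budget
}
```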
When the bridge succeeds, it returns compact plain-text context plus structured fields for stats, traceability, warnings, and diagnostics. When the bridge cannot safely fetch or compact the node, it returns a fallback handoff so the agent can continue with standard Figma MCP tools directly.
Example compact output:
src|figma|get_design_context|4:5100|FILE_KEY
sum|Example screen|frame|375x876|535,258
el|4:5107|field_card|w343;layout:column;r20;p:16,20,20,20;bg:#ffffff
tx|4:5106|Section title|t1
ty|t1|Inter|600|20|24|#333333
as|imgAsset|asset|4:5107|asset_slot|/assets/example-image.png
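The output is line-oriented with |-separated fields, so it is trivial to parse. A minimal parser for the sample above (the field layout is inferred from this one example, not from a published grammar):

```python
# Parse the sample compact output shown above. The field positions are
# inferred from the example only; treat them as illustrative.
sample = """\
src|figma|get_design_context|4:5100|FILE_KEY
sum|Example screen|frame|375x876|535,258
el|4:5107|field_card|w343;layout:column;r20;p:16,20,20,20;bg:#ffffff
tx|4:5106|Section title|t1
ty|t1|Inter|600|20|24|#333333
as|imgAsset|asset|4:5107|asset_slot|/assets/example-image.png"""

records = {}
for line in sample.splitlines():
    tag, *fields = line.split("|")
    records.setdefault(tag, []).append(fields)

# Typography token t1 appears to carry: family, weight, size, line-height, color.
family, weight, size, line_height, color = records["ty"][0][1:]
```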
Example URL shape:
https://www.figma.com/design/FILE_KEY/FILE_NAME?node-id=NODE_ID&m=dev
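Assuming that URL shape, the file key and node id can be extracted with standard URL parsing (a sketch; the server's own URL handling may differ):

```python
from urllib.parse import urlparse, parse_qs

def parse_figma_url(url: str) -> tuple[str, str]:
    """Extract (file_key, node_id) from a Figma design URL of the shape
    https://www.figma.com/design/FILE_KEY/FILE_NAME?node-id=NODE_ID"""
    parsed = urlparse(url)
    # Path looks like /design/FILE_KEY/FILE_NAME
    parts = parsed.path.strip("/").split("/")
    file_key = parts[1]
    node_id = parse_qs(parsed.query)["node-id"][0]
    return file_key, node_id

key, node = parse_figma_url(
    "https://www.figma.com/design/abc123/My-File?node-id=4-5100&m=dev"
)
```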
Requirements
To use the Figma-link bridge flow, you need:
- Figma Desktop
- Dev Mode enabled in Figma Desktop
- Desktop MCP server enabled in Figma Desktop
- Node.js 18+
Default upstream Figma MCP endpoint:
http://127.0.0.1:3845/mcp
Override with:
FIGMA_MCP_URL
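The override follows the usual environment-variable-with-default pattern; a Python-flavored sketch of the idea (the server itself is Node, so this is purely illustrative):

```python
import os

# Default matches the endpoint documented above; FIGMA_MCP_URL overrides it.
DEFAULT_FIGMA_MCP_URL = "http://127.0.0.1:3845/mcp"

def resolve_upstream_url() -> str:
    return os.environ.get("FIGMA_MCP_URL", DEFAULT_FIGMA_MCP_URL)
```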
Installation
Install globally:
npm install -g figma-compaction-mcp
Or run with npx:
npx figma-compaction-mcp
MCP Client Registration
Register this server in your MCP client.
Example using npx:
{
"mcpServers": {
"figma-compaction": {
"command": "npx",
"args": ["-y", "figma-compaction-mcp"]
}
}
}
Example using a global install:
{
"mcpServers": {
"figma-compaction": {
"command": "figma-compaction-mcp",
"args": []
}
}
}
Your client may use JSON, TOML, or another config format, but the command registration model is the same.
How To Use It
- Open Figma Desktop and enable Dev Mode and the desktop MCP server.
- Register figma-compaction-mcp in your MCP client.
- Give your agent a Figma node URL.
- Have the agent call get_figma_compact_context first.
- Use the returned compact context for implementation, inspection, or summarization.
- If the server returns a fallback handoff, continue with the standard Figma MCP tools for the same node.
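The fallback branch in the last step can be sketched as agent-side logic. Response field names like "fallback" and "compact_context" are hypothetical here; the README only guarantees that a fallback handoff is distinguishable from normal compact output:

```python
# Hypothetical agent-side handling of the bridge response.
# Field names ("fallback", "compact_context") are illustrative, not a schema.

def handle_bridge_response(response: dict) -> str:
    if response.get("fallback"):
        # Bridge could not safely complete: continue with the standard
        # Figma MCP tools for the same node (stubbed out below).
        return call_standard_figma_tools(response["figma_url"])
    return response["compact_context"]

def call_standard_figma_tools(figma_url: str) -> str:
    """Stand-in for falling back to get_design_context / get_metadata directly."""
    return f"raw context for {figma_url}"
```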
In practice:
- Small and medium components usually return compact context directly.
- Large screens can still return larger output when the retained structure, text, and assets matter.
- balanced mode is the default for normal implementation work.
- Only set max_output_chars when you intentionally want a hard output budget.
Limitations
- Final tool routing still depends on the MCP host or agent. This server can strongly guide usage, but it cannot forcibly override host-side routing.
- When the bridge cannot safely complete a request, it returns a compact fallback handoff instead of passing raw upstream payloads through this server response.
- Compaction is optimized for implementation relevance, so purely decorative wrappers and chrome-like nodes may be pruned outside inspect-oriented flows.
Other Information
- Release history: CHANGELOG.md
- Compact contract draft: SPEC.md
- Source repository: https://github.com/s9hn/figma-compaction-mcp
- Contributions: issues and pull requests are welcome on GitHub
- Issues: GitHub Issues
- License: MIT