Figma MCP Server
Enables AI assistants to read Figma design files and automatically map responsive relationships between mobile and desktop screens to generate accurate frontend code. Eliminates manual copy-pasting by providing direct access to design tokens, dimensions, and screen layouts within AI-powered IDEs.
This is a custom Model Context Protocol (MCP) server for Figma. It connects your Figma designs directly to AI-powered IDEs (like Cursor, Windsurf, or any editor that supports MCP).
Instead of manually copy-pasting CSS, hex codes, or dimensions from Figma into your prompts, you can just give the AI your Figma file ID. The AI will use this server to read your designs, figure out how screens scale from desktop to mobile, and write accurate frontend code.
How it works
The project has two ways to run:
- MCP Agent (mcp-agent.js): This is the main use case. It connects directly to your IDE over stdio, giving your AI assistant custom tools to query your Figma files.
- Standard Web Server (server.js): It also includes an Express server. Run this if you just want a standard REST API to pull your Figma data into other scripts or apps.
Under the hood, the server handles all the annoying parts of dealing with the Figma API. It batches requests to avoid hitting rate limits, caches responses so your AI doesn't wait around for data, and handles background retries if the network drops.
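The caching-and-retry pattern described above can be sketched roughly like this (a simplified illustration, not the server's actual code — the function names and backoff values are assumptions):

```javascript
// Retry a flaky async call with simple linear backoff.
async function withRetry(fn, { retries = 3, baseDelayMs = 500 } = {}) {
  for (let attempt = 1; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= retries) throw err; // out of attempts, give up
      await new Promise((r) => setTimeout(r, attempt * baseDelayMs));
    }
  }
}

// In-memory cache keyed by URL, so repeated tool calls return instantly
// instead of re-hitting the Figma API (and its rate limits).
const cache = new Map();

async function cachedFigmaGet(url, fetchFn) {
  if (cache.has(url)) return cache.get(url); // serve from cache
  const data = await withRetry(() => fetchFn(url));
  cache.set(url, data);
  return data;
}
```

The real server layers request batching on top of this as well, but the core idea is the same: only the first request for a given file pays the network cost.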
Tools it gives your AI
Once connected, your AI assistant will be able to use three new tools:
- fetch_file: Pulls the raw JSON structure of a Figma file.
- list_screens: Finds all the frames in the file and groups them by page.
- build_responsive_screens: The most useful one. It looks at your design and figures out which mobile screens belong to which desktop screens, so the AI knows exactly how to write the responsive CSS.
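The pairing idea behind build_responsive_screens can be sketched like this (the field names, width threshold, and matching-by-name heuristic here are illustrative assumptions, not the server's actual output shape):

```javascript
// Group frames that share a name, then classify each variant by width.
// Frames at or under 480px wide are treated as the mobile variant.
function pairScreens(frames) {
  const byName = {};
  for (const frame of frames) {
    const key = frame.name.toLowerCase();
    byName[key] ??= { name: frame.name };
    if (frame.width <= 480) byName[key].mobile = frame;
    else byName[key].desktop = frame;
  }
  return Object.values(byName);
}

const pairs = pairScreens([
  { name: "Home", width: 1440 },
  { name: "home", width: 375 },
]);
// pairs[0] now holds both the desktop and mobile variant of "Home"
```

With pairs like these in hand, the AI can emit a single component per screen plus the media queries that bridge the two layouts.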
Setup Instructions
1. What you need first
- Node.js installed (v18 or newer should be fine).
- A Figma Personal Access Token (you can get this from your Figma profile settings).
2. Install dependencies
Clone the repo and install the packages:
npm install
3. Add your Figma token
Create a .env file in the root folder by copying example.env, then add your token:
FIGMA_TOKEN=your_token_here
4. Running the server
For AI IDEs (MCP Mode): To plug this into your IDE, run:
npm run mcp
To connect it, open your editor's MCP settings, add a new MCP server, choose the command option, and point it at node with the absolute path to the mcp-agent.js file in this folder.
Here's an example of how to connect it in Antigravity:
{
  "mcpServers": {
    "figma-mcp-v2": {
      "command": "node",
      "args": ["/absolute/path/to/figma_mcp/mcp-agent.js"],
      "env": {
        "FIGMA_TOKEN": "your_token_here",
        "NODE_ENV": "production",
        "DOTENV_QUIET": "true"
      }
    }
  }
}
For the REST API: If you just want the local web server:
npm run dev
It runs on http://localhost:3010 by default.
Example Prompt
Use figma-mcp-v2 tool: call build_responsive_screens with fileId "YOUR_FIGMA_FILE_ID_HERE"
Then generate clean responsive React + Tailwind code for all returned screens, one by one.
Where to get the file ID
- Go to figma.com in your web browser.
- Open any of your Figma files.
- Look at the URL in the address bar. It will look something like this: https://www.figma.com/file/FILE_ID_HERE/file-name
- The FILE_ID_HERE part is the file ID you need.
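If you want to pull the ID out programmatically, a small helper like this works (a hypothetical snippet, not part of the server; note that Figma uses both /file/ and /design/ in its share URLs):

```javascript
// Extract the file ID from a Figma share URL, or return null if
// the URL doesn't look like a Figma file link.
function extractFileId(url) {
  const match = url.match(/figma\.com\/(?:file|design)\/([A-Za-z0-9]+)/);
  return match ? match[1] : null;
}

console.log(extractFileId("https://www.figma.com/file/AbC123xyz/my-design"));
// → AbC123xyz
```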