Figma MCP Proxy Server
A Model Context Protocol (MCP) server that provides secure access to your Figma designs through Cursor IDE. This server authenticates with Figma using your personal access token and exposes your designs to your development team without requiring individual Figma accounts.
Architecture
Developer Cursor IDEs → Your MCP Server (Cloud) → Figma API (Your Account)
- Your developers connect their Cursor IDE to your hosted MCP server
- Your MCP server authenticates with Figma using your credentials
- All Figma requests go through your server (developers never touch Figma directly)
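To make the flow concrete, here is a minimal sketch of how such a proxy can call Figma on the developers' behalf, assuming Node 20+ (built-in fetch) and Figma's standard X-Figma-Token header; the function names are illustrative and not taken from this repository's figma-client.ts.

```typescript
// Minimal proxy sketch: the server alone holds the Figma token and makes
// every API call itself. Assumes Node 20+ (global fetch); names are illustrative.
const FIGMA_API_BASE = "https://api.figma.com/v1";

async function figmaRequest<T>(path: string): Promise<T> {
  const response = await fetch(`${FIGMA_API_BASE}${path}`, {
    headers: { "X-Figma-Token": process.env.FIGMA_ACCESS_TOKEN ?? "" },
  });
  if (!response.ok) {
    throw new Error(`Figma API error ${response.status}: ${await response.text()}`);
  }
  return (await response.json()) as T;
}

// Example: fetch a file's document tree (GET /v1/files/:file_key) for a developer
// who only ever talks to this server, never to Figma directly.
export async function getFile(fileKey: string): Promise<unknown> {
  return figmaRequest(`/files/${fileKey}`);
}
```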
Features
Resources
- figma://files - List all your Figma files
- figma://file/{file_id} - Get specific file content (frames, layers, components)
- figma://team/{team_id} - List files and projects for a team
- figma://project/{project_id} - Project information
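For orientation, a hedged sketch of how resources like these might be registered with the MCP TypeScript SDK (@modelcontextprotocol/sdk); the listFiles helper is hypothetical, and the real handlers live in src/handlers/resources.ts.

```typescript
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import {
  ListResourcesRequestSchema,
  ReadResourceRequestSchema,
} from "@modelcontextprotocol/sdk/types.js";

// Hypothetical helper that wraps the Figma API (see the proxy sketch above).
declare function listFiles(): Promise<{ key: string; name: string }[]>;

export const server = new Server(
  { name: "figma-proxy", version: "1.0.0" },
  { capabilities: { resources: {}, tools: {} } }
);

// Advertise the figma:// resources to the MCP client (Cursor).
server.setRequestHandler(ListResourcesRequestSchema, async () => ({
  resources: [
    { uri: "figma://files", name: "Figma files", mimeType: "application/json" },
  ],
}));

// Resolve a figma:// URI into JSON content.
server.setRequestHandler(ReadResourceRequestSchema, async (request) => {
  if (request.params.uri === "figma://files") {
    const files = await listFiles();
    return {
      contents: [
        {
          uri: "figma://files",
          mimeType: "application/json",
          text: JSON.stringify(files, null, 2),
        },
      ],
    };
  }
  throw new Error(`Unknown resource: ${request.params.uri}`);
});
```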
Tools
- export_asset - Export images from designs (PNG, JPG, SVG, PDF)
- get_design_tokens - Extract design tokens (colors, typography, spacing)
- search_files - Search across your Figma files by name
- get_components - List components from a specific file
- get_styles - Get published styles (colors, text styles)
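Similarly, a sketch of how a tool such as export_asset might be declared and dispatched, continuing the server instance from the resource sketch above; exportAsset is a hypothetical wrapper, not code from this repository.

```typescript
import type { Server } from "@modelcontextprotocol/sdk/server/index.js";
import {
  CallToolRequestSchema,
  ListToolsRequestSchema,
} from "@modelcontextprotocol/sdk/types.js";

// `server` is the MCP Server instance created in the resource sketch above.
declare const server: Server;

// Hypothetical wrapper around Figma's image export endpoint (GET /v1/images/:file_key);
// it would return a URL to the rendered asset.
declare function exportAsset(fileId: string, nodeId: string, format: string): Promise<string>;

// Declare the tool and its input schema so Cursor knows how to call it.
server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [
    {
      name: "export_asset",
      description: "Export images from designs (PNG, JPG, SVG, PDF)",
      inputSchema: {
        type: "object",
        properties: {
          file_id: { type: "string" },
          node_id: { type: "string" },
          format: { type: "string", enum: ["png", "jpg", "svg", "pdf"] },
        },
        required: ["file_id", "node_id"],
      },
    },
  ],
}));

// Dispatch tool calls by name and return text content to the client.
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  if (request.params.name === "export_asset") {
    const args = request.params.arguments as {
      file_id: string;
      node_id: string;
      format?: string;
    };
    const url = await exportAsset(args.file_id, args.node_id, args.format ?? "png");
    return { content: [{ type: "text", text: url }] };
  }
  throw new Error(`Unknown tool: ${request.params.name}`);
});
```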
Setup Instructions
1. Get Your Figma Personal Access Token
- Log in to your Figma account
- Click your profile picture → Settings
- Go to Account tab → Personal Access Tokens
- Click Generate new token
- Give it a name (e.g., "MCP Proxy Server")
- Copy the token (you won't see it again!)
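Before deploying, you can sanity-check the token against Figma's GET /v1/me endpoint; a quick Node 20+ sketch follows (the file name is arbitrary, run it with something like npx tsx check-token.ts).

```typescript
// check-token.ts (name is arbitrary): verify the personal access token before deploying.
// Figma's REST API authenticates personal access tokens via the X-Figma-Token header.
const token = process.env.FIGMA_ACCESS_TOKEN;
if (!token) throw new Error("Set FIGMA_ACCESS_TOKEN first");

const res = await fetch("https://api.figma.com/v1/me", {
  headers: { "X-Figma-Token": token },
});

if (res.ok) {
  const me = (await res.json()) as { handle?: string; email?: string };
  console.log(`Token OK, authenticated as ${me.email ?? me.handle}`);
} else {
  console.error(`Token check failed: HTTP ${res.status}`);
}
```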
2. Deploy the Server
Option A: Deploy to Render with Docker (Recommended)
- Fork or push this repository to GitHub
- Sign up at render.com
- Create a new Web Service
- Connect your GitHub repository
- Configure:
  - Runtime: Docker
  - Dockerfile Path: Dockerfile (auto-detected)
  - Environment Variables:
    - FIGMA_ACCESS_TOKEN = (your token from step 1)
    - NODE_ENV=production
    - PORT=10000 (Render sets this automatically)
- Deploy!
Detailed steps: See DEPLOYMENT.md for a complete walkthrough.
Your server URL will be: https://your-service-name.onrender.com
Option B: Deploy to Railway
- Push to GitHub
- Sign up at railway.app
- Create new project from GitHub repo
- Add environment variable: FIGMA_ACCESS_TOKEN
- Deploy!
Option C: Deploy to Other Platforms
Any Node.js hosting platform works:
- Fly.io: Use fly.toml configuration
- AWS ECS/Lambda: Containerize and deploy
- DigitalOcean App Platform: Similar to Render
- Heroku: Standard Node.js buildpack
3. Configure Cursor IDE (For Each Developer)
Each developer needs to add the MCP server to their Cursor configuration:
Mac
Edit: ~/Library/Application Support/Cursor/mcp.json
Windows
Edit: %APPDATA%\Cursor\mcp.json
Linux
Edit: ~/.config/Cursor/mcp.json
Configuration:
{
"mcpServers": {
"figma-proxy": {
"transport": "sse",
"url": "https://your-service-name.onrender.com/sse"
}
}
}
Replace your-service-name.onrender.com with your actual server URL.
Note: If your server requires authentication, you may need to add headers or use a different transport method.
4. Restart Cursor
After adding the configuration, restart Cursor IDE completely. The MCP server should now be available.
Usage Examples
Once configured, developers can use natural language in Cursor to interact with Figma:
List Files
"Show me all my Figma files"
"List the files in my Figma account"
View File Content
"Show me the design for file [file-id]"
"Get the components from file [file-id]"
Export Assets
"Export the logo from file [file-id] as PNG"
"Get a 2x scale export of node [node-id]"
Get Design Tokens
"Extract design tokens from file [file-id]"
"Show me the colors and typography from [file-id]"
Search Files
"Search for files named 'dashboard'"
"Find all files with 'mobile' in the name"
Local Development
To run the server locally for testing:
# Install dependencies
npm install
# Create .env file
echo "FIGMA_ACCESS_TOKEN=your_token_here" > .env
echo "PORT=3000" >> .env
# Build
npm run build
# Run
npm start
The server will be available at http://localhost:3000/sse
API Endpoints
- GET /health - Health check endpoint
- GET /sse - SSE endpoint for MCP protocol (used by Cursor)
- POST /message - Message endpoint for client-to-server communication
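For orientation, a hedged sketch of how these three endpoints are commonly wired together with Express and the MCP SDK's SSEServerTransport; this is an assumption about the shape of src/index.ts, not a copy of it.

```typescript
import express from "express";
import { SSEServerTransport } from "@modelcontextprotocol/sdk/server/sse.js";
import type { Server } from "@modelcontextprotocol/sdk/server/index.js";

// `server` is the MCP Server instance with the resource/tool handlers registered.
declare const server: Server;

const app = express();
const transports = new Map<string, SSEServerTransport>();

// GET /health - liveness probe used by hosting platforms and the troubleshooting steps.
app.get("/health", (_req, res) => {
  res.json({ status: "ok" });
});

// GET /sse - Cursor opens a long-lived SSE stream; server-to-client messages flow here.
app.get("/sse", async (_req, res) => {
  const transport = new SSEServerTransport("/message", res);
  transports.set(transport.sessionId, transport);
  res.on("close", () => transports.delete(transport.sessionId));
  await server.connect(transport);
});

// POST /message - client-to-server messages, routed to the matching SSE session.
app.post("/message", async (req, res) => {
  const transport = transports.get(String(req.query.sessionId));
  if (!transport) {
    res.status(404).send("Unknown session");
    return;
  }
  await transport.handlePostMessage(req, res);
});

app.listen(Number(process.env.PORT ?? 3000));
```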
Security Considerations
- Token Security: Your Figma token is stored in environment variables only - never commit it to git
- HTTPS: Always use HTTPS in production (Render provides this automatically)
- Access Control: Currently, anyone with the server URL can access it. For production, consider:
- Adding API key authentication
- IP whitelisting
- User authentication
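As one example of the first option, a shared API key can be enforced with a small Express middleware; the x-api-key header and PROXY_API_KEY variable below are illustrative choices, not part of this project.

```typescript
import type { NextFunction, Request, Response } from "express";

// Illustrative only: require a shared secret in an x-api-key header.
// The header name and PROXY_API_KEY variable are assumptions, not project settings.
export function requireApiKey(req: Request, res: Response, next: NextFunction): void {
  const expected = process.env.PROXY_API_KEY;
  if (!expected) {
    next(); // auth disabled when no key is configured
    return;
  }
  if (req.header("x-api-key") === expected) {
    next();
    return;
  }
  res.status(401).json({ error: "Invalid or missing API key" });
}

// Usage: app.use(requireApiKey) before registering the /sse and /message routes.
// Each developer's Cursor config would then need to send the same header.
```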
Rate Limits
The Figma API enforces rate limits:
- 120 requests per minute
- 24,000 requests per day
The server implements caching for file lists (5-minute cache) to reduce API calls.
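A five-minute in-memory cache along the following lines keeps repeated file-list requests from eating into the quota; this is a generic sketch rather than the repository's actual cache code.

```typescript
// Generic five-minute TTL cache (a sketch, not the repository's actual implementation).
const CACHE_TTL_MS = 5 * 60 * 1000;
const cache = new Map<string, { value: unknown; expiresAt: number }>();

export async function cached<T>(key: string, load: () => Promise<T>): Promise<T> {
  const hit = cache.get(key);
  if (hit && hit.expiresAt > Date.now()) {
    return hit.value as T; // fresh entry: skip the Figma API entirely
  }
  const value = await load();
  cache.set(key, { value, expiresAt: Date.now() + CACHE_TTL_MS });
  return value;
}

// Example: however many developers ask to list files, at most one upstream
// request is made per five-minute window.
// const files = await cached("file-list", () => listFiles());
```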
Troubleshooting
Server won't start
- Check that FIGMA_ACCESS_TOKEN is set correctly
- Verify Node.js version (requires Node 20+)
- Check server logs for errors
Cursor can't connect
- Verify the server URL is correct
- Check that the server is running (/health endpoint)
- Ensure Cursor config file syntax is valid JSON
- Restart Cursor completely after config changes
No files showing
- Verify your Figma token has access to files
- Check server logs for API errors
- Test the Figma API directly with your token
Project Structure
proxy_mcp_fogma/
├── src/
│   ├── index.ts             # Main server with HTTP/SSE transport
│   ├── figma-client.ts      # Figma API wrapper
│   ├── types.ts             # TypeScript types
│   ├── handlers/
│   │   ├── resources.ts     # Resource handlers
│   │   └── tools.ts         # Tool handlers
│   └── transport/
│       └── sse-transport.ts # SSE transport implementation
├── package.json
├── tsconfig.json
├── render.yaml              # Render deployment config
└── README.md
License
MIT
Support
For issues or questions:
- Check the troubleshooting section
- Review server logs
- Test Figma API access directly
- Verify Cursor MCP configuration