graphql-to-mcp
Turn any GraphQL API into MCP tools — zero config, zero code.
Point graphql-to-mcp at a GraphQL endpoint and it auto-generates one MCP tool per query/mutation via introspection. Works with Claude Desktop, Cursor, Windsurf, and any MCP client.
Quick Start
Try it now — no install needed:
```bash
npx graphql-to-mcp https://countries.trevorblades.com/graphql
```
Or add to Claude Desktop / Cursor config:
```json
{
  "mcpServers": {
    "countries": {
      "command": "npx",
      "args": ["-y", "graphql-to-mcp", "https://countries.trevorblades.com/graphql"]
    }
  }
}
```
That's it. Claude can now query countries, continents, and languages.
Features
- Zero config — just provide a GraphQL endpoint URL
- Auto-introspection — discovers all queries and mutations automatically
- Flat parameter schemas — nested `InputObject` types are flattened for better LLM accuracy
- Smart truncation — large responses are intelligently pruned (array slicing + depth limiting)
- Auth support — Bearer tokens, API keys (header or query)
- Retry logic — automatic retries on 429/5xx with exponential backoff
- Include/exclude filters — expose only the operations you want
Usage
CLI
```bash
# Public API (no auth)
npx graphql-to-mcp https://countries.trevorblades.com/graphql

# With bearer token
npx graphql-to-mcp https://api.github.com/graphql --bearer ghp_xxxxx

# With API key
npx graphql-to-mcp https://api.example.com/graphql --api-key "X-API-Key:your-key:header"

# Filter operations
npx graphql-to-mcp https://api.example.com/graphql --include "get*" --exclude "internal*"

# With prefix (avoid name collisions when using multiple APIs)
npx graphql-to-mcp https://api.example.com/graphql --prefix myapi
```
Claude Desktop / Cursor Config
```json
{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": [
        "-y", "graphql-to-mcp",
        "https://api.github.com/graphql",
        "--bearer", "ghp_xxxxx",
        "--prefix", "github"
      ]
    }
  }
}
```
Programmatic
```typescript
import { createServer } from "graphql-to-mcp";

const server = await createServer({
  endpoint: "https://api.example.com/graphql",
  auth: { type: "bearer", token: "xxx" },
  include: ["getUser", "listUsers"],
});
```
How It Works
- Introspect — Fetches the GraphQL schema via introspection query
- Flatten — nested `InputObject` types are flattened into simple key-value parameters (e.g., `input.name` → `input_name`)
- Generate — each query/mutation becomes an MCP tool with a flat JSON Schema
- Execute — When an LLM calls a tool, the flat args are reconstructed into proper GraphQL variables and sent to your endpoint
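The flatten/reconstruct round trip in steps 2 and 4 can be sketched roughly as below. This is an illustration, not the library's actual internals: the `_` separator is inferred from the `input.name` → `input_name` example, the helper names are made up, and a real implementation would use the introspected schema (rather than naive `_`-splitting) to handle field names that themselves contain underscores.

```typescript
type Flat = Record<string, unknown>;

// Flatten nested variables into flat keys: { input: { name } } -> { input_name }
function flatten(obj: Record<string, unknown>, prefix = ""): Flat {
  const out: Flat = {};
  for (const [key, value] of Object.entries(obj)) {
    const path = prefix ? `${prefix}_${key}` : key;
    if (value !== null && typeof value === "object" && !Array.isArray(value)) {
      Object.assign(out, flatten(value as Record<string, unknown>, path));
    } else {
      out[path] = value;
    }
  }
  return out;
}

// Reconstruct GraphQL variables from flat tool args: { input_name } -> { input: { name } }
function unflatten(flat: Flat): Record<string, unknown> {
  const out: Record<string, unknown> = {};
  for (const [path, value] of Object.entries(flat)) {
    const parts = path.split("_");
    let node = out;
    for (let i = 0; i < parts.length - 1; i++) {
      node = (node[parts[i]] ??= {}) as Record<string, unknown>;
    }
    node[parts[parts.length - 1]] = value;
  }
  return out;
}
```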
Why Flat Schemas?
LLMs are significantly better at filling flat key-value parameters than deeply nested JSON objects. By flattening `InputObject` types, we get:
- Higher accuracy in parameter filling
- Fewer hallucinated nested structures
- Better compatibility across different LLM providers
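Concretely (the field names here are hypothetical), a nested input type's JSON Schema such as:

```json
{
  "input": {
    "type": "object",
    "properties": {
      "name": { "type": "string" },
      "email": { "type": "string" }
    }
  }
}
```

is exposed to the LLM as flat top-level parameters instead:

```json
{
  "input_name": { "type": "string" },
  "input_email": { "type": "string" }
}
```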
Options
| Option | Description | Default |
|---|---|---|
| `--bearer <token>` | Bearer token auth | — |
| `--api-key <name:value:in>` | API key auth | — |
| `-H, --header <name:value>` | Custom header (repeatable) | — |
| `--include <pattern>` | Include only matching operations | all |
| `--exclude <pattern>` | Exclude matching operations | none |
| `--prefix <name>` | Tool name prefix | — |
| `--timeout <ms>` | Request timeout | `30000` |
| `--max-retries <n>` | Retry on 429/5xx | `3` |
| `--transport <stdio\|sse>` | MCP transport | `stdio` |
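The retry behavior behind `--max-retries` can be sketched as follows. Only the trigger conditions (429/5xx), the exponential backoff, and the default of 3 retries come from this README; the base delay, doubling schedule, and function names are assumptions for illustration.

```typescript
// Delay before retry attempt n: base, 2*base, 4*base, ... (assumed schedule)
function backoffDelay(attempt: number, baseDelayMs = 500): number {
  return baseDelayMs * 2 ** attempt;
}

// Retry a request on 429 or any 5xx, up to maxRetries extra attempts.
async function fetchWithRetry(
  url: string,
  init: RequestInit,
  maxRetries = 3
): Promise<Response> {
  for (let attempt = 0; ; attempt++) {
    const res = await fetch(url, init);
    const retryable = res.status === 429 || res.status >= 500;
    if (!retryable || attempt >= maxRetries) return res;
    await new Promise((resolve) => setTimeout(resolve, backoffDelay(attempt)));
  }
}
```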
Smart Truncation
GraphQL APIs can return large payloads that overwhelm LLM context windows. graphql-to-mcp automatically:
- Slices arrays to 20 items (with metadata showing total count)
- Prunes depth beyond 5 levels (with object/array summaries)
- Hard truncates at 50K characters as a safety net
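A rough sketch of those three rules (the limits are taken from the list above; the function names and the exact shape of the truncation metadata are hypothetical):

```typescript
const MAX_ITEMS = 20;   // array slice size
const MAX_DEPTH = 5;    // nesting depth limit
const MAX_CHARS = 50_000; // hard safety net

function truncate(value: unknown, depth = 0): unknown {
  if (Array.isArray(value)) {
    if (depth >= MAX_DEPTH) return `[array of ${value.length} items]`;
    const slice = value.slice(0, MAX_ITEMS).map((v) => truncate(v, depth + 1));
    // Append metadata so the LLM knows the true total count
    return value.length > MAX_ITEMS
      ? [...slice, { _truncated: true, _totalItems: value.length }]
      : slice;
  }
  if (value !== null && typeof value === "object") {
    if (depth >= MAX_DEPTH) return `{object with ${Object.keys(value).length} keys}`;
    return Object.fromEntries(
      Object.entries(value).map(
        ([k, v]) => [k, truncate(v, depth + 1)] as [string, unknown]
      )
    );
  }
  return value;
}

function truncateResponse(data: unknown): string {
  const text = JSON.stringify(truncate(data));
  return text.length > MAX_CHARS ? text.slice(0, MAX_CHARS) + "…[truncated]" : text;
}
```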
Use with REST APIs Too
Pair with mcp-openapi to give Claude access to both REST and GraphQL APIs:
```json
{
  "mcpServers": {
    "github-graphql": {
      "command": "npx",
      "args": ["-y", "graphql-to-mcp", "https://api.github.com/graphql", "--bearer", "ghp_xxx", "--prefix", "gh"]
    },
    "petstore-rest": {
      "command": "npx",
      "args": ["-y", "mcp-openapi", "https://petstore3.swagger.io/api/v3/openapi.json"]
    }
  }
}
```
License
MIT