# @g7aro/tanstack-mcp

MCP server that wraps the TanStack CLI to provide programmatic access to TanStack documentation, libraries, add-ons, ecosystem partners, and project scaffolding.

Built as a drop-in replacement after TanStack removed the built-in MCP server from `@tanstack/cli`.
## Tools

| Tool | Description |
|---|---|
| `listTanStackAddOns` | List all available TanStack Start add-ons for a given framework |
| `getAddOnDetails` | Get detailed info about a specific add-on (files, deps, options, routes) |
| `createTanStackApplication` | Scaffold a new TanStack Start app with add-ons and options |
| `tanstack_list_libraries` | List all TanStack libraries with descriptions and links |
| `tanstack_doc` | Fetch the full content of a TanStack documentation page |
| `tanstack_search_docs` | Search across TanStack documentation |
| `tanstack_ecosystem` | List TanStack ecosystem partners with optional filters |
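For orientation, an MCP client invokes any of these tools with a standard `tools/call` JSON-RPC request. The example below is illustrative only; the `query` argument name is an assumption, not taken from the server's published tool schema:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "tanstack_search_docs",
    "arguments": { "query": "router loaders" }
  }
}
```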
## Quick Install

Auto-detect installed AI clients and register the MCP server in one command:

```sh
npx @g7aro/tanstack-mcp --install
```

This will detect and configure all supported clients on your machine.
### Options

```sh
npx @g7aro/tanstack-mcp --install               # Interactive — pick which clients
npx @g7aro/tanstack-mcp --install --all         # Install into all detected clients
npx @g7aro/tanstack-mcp --install cursor codex  # Install into specific clients only
npx @g7aro/tanstack-mcp --uninstall             # Remove from all clients
```
### Supported clients

| Client | Detection | Config method |
|---|---|---|
| Claude Code | `claude` CLI | `claude mcp add` |
| Codex (OpenAI) | `codex` CLI | `codex mcp add` |
| Cursor | `~/.cursor/` dir | `~/.cursor/mcp.json` |
| Windsurf | `~/.windsurf/` dir | `~/.windsurf/mcp.json` |
| Trae | `~/.trae/` dir | `~/.trae/mcp.json` |
| Antigravity | `~/.gemini/antigravity/` dir | `mcp_config.json` |
| OpenCode | `~/.config/opencode/` dir | `opencode.json` |
| Zed | `~/.config/zed/settings.json` | `settings.json` |
| VS Code (Copilot) | `settings.json` / `code` CLI | `settings.json` |
## Manual Setup

If you prefer to configure manually, add this to your client's MCP config:

```json
{
  "mcpServers": {
    "tanstack": {
      "command": "npx",
      "args": ["-y", "@g7aro/tanstack-mcp"]
    }
  }
}
```
## Prerequisites

- Node.js >= 18
- `npx` available in PATH (ships with npm)
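A quick preflight check for both prerequisites (a sketch; it only verifies the two requirements above):

```sh
# Fail if the Node.js major version is below 18.
node -e 'const major = Number(process.versions.node.split(".")[0]); if (major < 18) { console.error("Node.js >= 18 required, found " + process.versions.node); process.exit(1); }'
# Confirm npx is on PATH.
npx --version
```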
## How it works

Each MCP tool maps to a `@tanstack/cli` command with `--json` output:

```
listTanStackAddOns        -> tanstack create --list-add-ons --framework <f> --json
getAddOnDetails           -> tanstack create --addon-details <id> --framework <f> --json
createTanStackApplication -> tanstack create <name> --framework <f> --add-ons <a,b> ...
tanstack_list_libraries   -> tanstack libraries --json
tanstack_doc              -> tanstack doc <library> <path> --json
tanstack_search_docs      -> tanstack search-docs "<query>" --json
tanstack_ecosystem        -> tanstack ecosystem --json
```

The server spawns `npx @tanstack/cli` for each invocation, parses the JSON output, and returns it through the MCP protocol over stdio.
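The spawn-and-parse step can be sketched as follows. This is a minimal illustration, not the actual implementation: `runCliJson` and `extractJson` are hypothetical names, and the banner-stripping heuristic is an assumption about what the CLI may print before its JSON payload.

```typescript
import { execFile } from "node:child_process";
import { promisify } from "node:util";

const execFileAsync = promisify(execFile);

// The CLI may print log lines before the JSON payload; keep everything
// from the first "{" or "[" onward and parse that slice as JSON.
export function extractJson(stdout: string): unknown {
  const start = stdout.search(/[\[{]/);
  if (start === -1) throw new Error("no JSON found in CLI output");
  return JSON.parse(stdout.slice(start));
}

// Run a @tanstack/cli subcommand with --json and return the parsed result.
export async function runCliJson(args: string[]): Promise<unknown> {
  const { stdout } = await execFileAsync("npx", ["-y", "@tanstack/cli", ...args, "--json"]);
  return extractJson(stdout);
}

// Example: runCliJson(["libraries"]) would back the tanstack_list_libraries tool.
```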
## Development

```sh
npm install
npm run build  # compile TypeScript -> dist/
npm start      # run the server (stdio)
npm run dev    # watch mode
```
## License

MIT