ChatGPT Apps SDK Next.js Starter

A minimal Next.js application demonstrating how to build an OpenAI Apps SDK-compatible MCP server with widget rendering in ChatGPT.

Overview

This project shows how to integrate a Next.js application with the ChatGPT Apps SDK using the Model Context Protocol (MCP). It includes a working MCP server that exposes tools and resources ChatGPT can call, with tool responses rendered natively as widgets in the conversation.

Key Components

1. MCP Server Route (app/mcp/route.ts)

The core MCP server implementation that exposes tools and resources to ChatGPT.

Key features:

  • Tool registration with OpenAI-specific metadata
  • Resource registration that serves HTML content for iframe rendering
  • Cross-linking between tools and resources via templateUri

OpenAI-specific metadata:

{
  "openai/outputTemplate": widget.templateUri,      // Links to resource
  "openai/toolInvocation/invoking": "Loading...",   // Loading state text
  "openai/toolInvocation/invoked": "Loaded",        // Completion state text
  "openai/widgetAccessible": false,                 // Widget visibility
  "openai/resultCanProduceWidget": true            // Enable widget rendering
}

Full configuration options: OpenAI Apps SDK MCP Documentation
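
To make the cross-linking concrete, here is a condensed sketch of tool and resource registration. The names, widget URI, HTML, mime type, and handler body are illustrative assumptions rather than the starter's actual code, and wiring the server into the /mcp route handler is omitted:

import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

// Illustrative only: the starter's real names, URI, and HTML differ.
const server = new McpServer({ name: "nextjs-starter", version: "1.0.0" });
const widget = { templateUri: "ui://widget/hello.html" };

// Resource serving the HTML that ChatGPT loads into the widget iframe
server.registerResource("hello-widget", widget.templateUri, {}, async () => ({
  contents: [
    {
      uri: widget.templateUri,
      mimeType: "text/html",
      text: '<!doctype html><html><body><div id="root"></div></body></html>',
    },
  ],
}));

// Tool whose metadata cross-links to the resource above via templateUri
server.registerTool(
  "say-hello",
  {
    title: "Say hello",
    inputSchema: { name: z.string() },
    _meta: {
      "openai/outputTemplate": widget.templateUri,
      "openai/toolInvocation/invoking": "Loading...",
      "openai/toolInvocation/invoked": "Loaded",
      "openai/widgetAccessible": false,
      "openai/resultCanProduceWidget": true,
    },
  },
  async ({ name }) => ({
    content: [{ type: "text", text: `Hello, ${name}!` }],
    structuredContent: { name }, // data the rendered widget can read
  }),
);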

2. Asset Configuration (next.config.ts)

Critical: Set assetPrefix to ensure /_next/ static assets are fetched from the correct origin:

import type { NextConfig } from "next";
import { baseURL } from "./baseUrl"; // baseUrl.ts helper described under Deployment; path assumed
const nextConfig: NextConfig = {
  assetPrefix: baseURL,  // Prevents 404s on /_next/ files in iframe
};
export default nextConfig;

Without this, Next.js will attempt to load assets from the iframe's URL, causing 404 errors.

3. CORS Middleware (middleware.ts)

Handles browser OPTIONS preflight requests required for cross-origin RSC (React Server Components) fetching during client-side navigation:

export function middleware(request: NextRequest) {
  if (request.method === "OPTIONS") {
    // Return 204 with CORS headers
  }
  // Add CORS headers to all responses
}
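
A minimal sketch of such a middleware, assuming a permissive wildcard policy; the starter's actual header values and matcher may differ:

import { NextRequest, NextResponse } from "next/server";

const CORS_HEADERS = {
  "Access-Control-Allow-Origin": "*",   // tighten to the ChatGPT origin if desired
  "Access-Control-Allow-Methods": "GET, POST, OPTIONS",
  "Access-Control-Allow-Headers": "*",
};

export function middleware(request: NextRequest) {
  // Answer preflight requests directly with 204 and CORS headers
  if (request.method === "OPTIONS") {
    return new NextResponse(null, { status: 204, headers: CORS_HEADERS });
  }

  // Add CORS headers to every other response, including RSC payload fetches
  const response = NextResponse.next();
  for (const [key, value] of Object.entries(CORS_HEADERS)) {
    response.headers.set(key, value);
  }
  return response;
}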

4. SDK Bootstrap (app/layout.tsx)

The <NextChatSDKBootstrap> component patches browser APIs to work correctly within the ChatGPT iframe:

What it patches:

  • history.pushState / history.replaceState - Prevents full-origin URLs in history
  • window.fetch - Rewrites same-origin requests to use the correct base URL
  • <html> attribute observer - Prevents ChatGPT from modifying the root element

Required configuration:

<html lang="en" suppressHydrationWarning>
  <head>
    <NextChatSDKBootstrap baseUrl={baseURL} />
  </head>
  <body>{children}</body>
</html>

Note: suppressHydrationWarning is currently required because ChatGPT modifies the initial HTML before the Next.js app hydrates, causing hydration mismatches.
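
For intuition only, the snippet below sketches the general idea behind the pushState and fetch patches listed above. It is not the component's actual source; the example base URL stands in for the baseUrl prop:

// Illustrative sketch of the patching idea; not NextChatSDKBootstrap's real source.
const baseUrl = "https://your-app.vercel.app"; // in the component this comes from the baseUrl prop
const appOrigin = new URL(baseUrl).origin;

// Keep history entries relative so full-origin URLs never land in the iframe's history.
const originalPushState = history.pushState.bind(history);
history.pushState = (state: unknown, title: string, url?: string | URL | null) => {
  const resolved = url ? new URL(url, appOrigin) : null;
  originalPushState(state, title, resolved ? resolved.pathname + resolved.search : url);
};

// Route same-origin requests (e.g. RSC payloads under /_next/) to the real app origin.
const originalFetch = window.fetch.bind(window);
window.fetch = (input: RequestInfo | URL, init?: RequestInit) =>
  typeof input === "string" && input.startsWith("/")
    ? originalFetch(new URL(input, appOrigin), init)
    : originalFetch(input, init);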

Getting Started

Installation

npm install
# or
pnpm install

Development

npm run dev
# or
pnpm dev

Open http://localhost:3000 to see the app.

Testing the MCP Server

The MCP server is available at:

http://localhost:3000/mcp
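
For a quick local check you can point the MCP Inspector at that URL; this is just one testing option, not something the starter ships with:

# Launch the Inspector UI, then connect it to http://localhost:3000/mcp
npx @modelcontextprotocol/inspector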

Connecting from ChatGPT

  1. Deploy your app to Vercel
  2. In ChatGPT, navigate to Settings → Connectors → Create and add your MCP server URL with the /mcp path (e.g., https://your-app.vercel.app/mcp)

Note: Connecting MCP servers to ChatGPT requires developer mode access. See the connection guide for setup instructions.

Project Structure

app/
├── mcp/
│   └── route.ts          # MCP server with tool/resource registration
├── layout.tsx            # Root layout with SDK bootstrap
├── page.tsx              # Homepage content
└── globals.css           # Global styles
middleware.ts             # CORS handling for RSC
next.config.ts            # Asset prefix configuration

How It Works

  1. Tool Invocation: ChatGPT calls a tool registered in app/mcp/route.ts
  2. Resource Reference: Tool response includes templateUri pointing to a registered resource
  3. Widget Rendering: ChatGPT fetches the resource HTML and renders it in an iframe
  4. Client Hydration: Next.js hydrates the app inside the iframe with patched APIs
  5. Navigation: Client-side navigation uses patched fetch to load RSC payloads

Deployment

This project is designed to deploy seamlessly to Vercel. The baseUrl.ts configuration automatically detects Vercel environment variables and sets the correct asset URLs.

Deploy with Vercel

The configuration automatically handles:

  • Production URLs via VERCEL_PROJECT_PRODUCTION_URL
  • Preview/branch URLs via VERCEL_BRANCH_URL
  • Asset prefixing for correct resource loading in iframes
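
A rough sketch of what such a baseUrl.ts helper can look like, assuming a production/preview switch on VERCEL_ENV and a localhost fallback; the starter's actual logic may differ:

// Sketch of baseUrl.ts; the real helper's logic may differ.
const vercelHost =
  process.env.VERCEL_ENV === "production"
    ? process.env.VERCEL_PROJECT_PRODUCTION_URL
    : process.env.VERCEL_BRANCH_URL;

// Vercel env vars hold bare hostnames, so prepend the protocol; fall back to localhost in dev.
export const baseURL = vercelHost ? `https://${vercelHost}` : "http://localhost:3000";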
