Surfline MCP Server

A Model Context Protocol (MCP) server that provides comprehensive surf forecasts from Surfline's API. Access detailed surf conditions, swell analysis, forecaster insights, tides, and more directly through Claude or any MCP-compatible client.

Features

🌊 Comprehensive Surf Data

  • Current conditions for 11 Santa Cruz spots (easily extensible to other regions)
  • Detailed swell breakdown (height, period, direction, power for each swell component)
  • 8-hour hourly forecasts showing how conditions evolve
  • Expert forecaster observations with AM/PM specific timing advice
  • Wind conditions (speed, direction, offshore/onshore classification)
  • Quality ratings (1-5 stars)

🌅 Timing Information

  • Sunrise, sunset, dawn, and dusk times
  • Tide schedule with high/low times and heights
  • All times converted to the Pacific time zone

🔐 Secure Authentication

  • Google OAuth integration for secure access
  • Works seamlessly with claude.ai web and mobile
  • No Surfline API keys required (uses public endpoints)

Quick Start

Prerequisites

  • Node.js 18+
  • A Cloudflare account (free tier works)
  • A Google Cloud project for OAuth (free)

Installation

  1. Clone and install dependencies:

     ```bash
     cd surfline-mcp-server
     npm install
     ```

  2. Set up Google OAuth:

    • Go to Google Cloud Console
    • Create a new OAuth 2.0 Client ID (Web application type)
    • Add authorized redirect URIs:
      • `https://your-worker-name.your-subdomain.workers.dev/callback`
      • `https://claude.ai/api/mcp/auth_callback`
    • Note your Client ID and Client Secret
  3. Create a KV namespace:

     ```bash
     npx wrangler kv namespace create OAUTH_KV
     ```

     Update `wrangler.jsonc` with the returned KV ID.

  4. Set secrets:

     ```bash
     echo 'YOUR_GOOGLE_CLIENT_ID' | npx wrangler secret put GOOGLE_CLIENT_ID
     echo 'YOUR_GOOGLE_CLIENT_SECRET' | npx wrangler secret put GOOGLE_CLIENT_SECRET
     openssl rand -hex 32 | npx wrangler secret put COOKIE_ENCRYPTION_KEY
     ```

  5. Deploy:

     ```bash
     npm run deploy
     ```

Connect to Claude

  1. Go to claude.ai
  2. Navigate to Settings → Integrations
  3. Add your deployed worker URL: `https://your-worker-name.your-subdomain.workers.dev/mcp`
  4. Authenticate with Google
  5. Ask Claude: "How's the surf in Santa Cruz?"

Available Tools

`get_complete_surf_report`

The primary tool; it returns everything in one call (see the example after this list):

  • Forecaster notes with expert observations
  • Sunrise/sunset times
  • Tide schedule
  • Current conditions for all spots
  • Swell breakdown
  • 8-hour forecasts
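
For reference, here is a minimal sketch of calling this tool programmatically with the MCP TypeScript SDK. The worker URL is the placeholder from the setup steps, and the Google OAuth handshake the server requires is omitted for brevity, so treat this as an illustration rather than a ready-to-run client:

```typescript
// Minimal sketch: connect to the deployed worker and call the primary tool.
// The URL below is the placeholder from the deployment steps; the OAuth
// flow that the server requires is omitted here.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

const client = new Client({ name: "surf-report-demo", version: "1.0.0" });
await client.connect(
  new StreamableHTTPClientTransport(
    new URL("https://your-worker-name.your-subdomain.workers.dev/mcp")
  )
);

// The primary tool takes no arguments and returns the full report.
const report = await client.callTool({
  name: "get_complete_surf_report",
  arguments: {},
});
console.log(report.content);
```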

Secondary Tools

Individual data fetchers are available if you need specific information (see the example after this list):

  • `get_surf_forecast` - Basic spot conditions only
  • `get_forecaster_notes` - Human observations only
  • `get_tides` - Tide information only
  • `get_best_spot` - Ranked recommendations
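
Continuing the client sketch above, a secondary tool can be called on its own. The parameter name used here is an assumption, not taken from the server's actual schema; check the tool's input schema (for example via the MCP Inspector) before relying on it:

```typescript
// Hypothetical call to a secondary tool; the "spot" argument name is assumed.
const steamerLane = await client.callTool({
  name: "get_surf_forecast",
  arguments: { spot: "Steamer Lane" },
});
console.log(steamerLane.content);
```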

Spots Covered

North County: Davenport, Waddell Creek, Four Mile, Three Mile
Central: Steamer Lane, Cowells, 26th Ave
East Side: Pleasure Point, Jack's, The Hook
South: Capitola

Data Source

This server uses Surfline's undocumented public API endpoints, the same ones their website uses. No API keys or authentication are required for basic forecast data. The endpoints have been stable for years and are widely used by the surf community.

Important: Webcams and premium features are not available through these endpoints.
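
To make this concrete, here is a hedged sketch of the style of request involved. The host, path, query parameters, and response shape are the ones commonly documented by the surf community rather than an official contract, so treat them as assumptions that may change:

```typescript
// Community-documented wave forecast endpoint (assumed, not official).
const spotId = "spotIdFromSurfline"; // replace with a real spot ID
const url =
  "https://services.surfline.com/kbyg/spots/forecasts/wave" +
  `?spotId=${spotId}&days=1&intervalHours=1`;

const response = await fetch(url);
if (!response.ok) {
  throw new Error(`Surfline request failed: ${response.status}`);
}
const payload = await response.json();
// The response is generally shaped as { data: { wave: [...] } }, where each
// entry covers one interval with surf height and swell components.
console.log(payload.data?.wave?.[0]);
```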

Extending to Other Regions

To add more spots, edit `src/index.ts` and add to the `SANTA_CRUZ_SPOTS` object:

```typescript
const SANTA_CRUZ_SPOTS: Record<string, string> = {
  "Your Spot Name": "spotIdFromSurfline",
  // ...
};
```

Find spot IDs by inspecting network requests on surfline.com.
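
As a sketch of how a new entry feeds the rest of the code, the hypothetical helper below resolves a human-readable spot name from the map above to its Surfline ID; the helper name and error handling are illustrative, not part of the existing source:

```typescript
// Hypothetical helper: case-insensitive lookup of a spot name in the map above.
function resolveSpotId(name: string): string {
  const match = Object.entries(SANTA_CRUZ_SPOTS).find(
    ([spotName]) => spotName.toLowerCase() === name.toLowerCase()
  );
  if (!match) {
    const known = Object.keys(SANTA_CRUZ_SPOTS).join(", ");
    throw new Error(`Unknown spot "${name}". Known spots: ${known}`);
  }
  return match[1];
}
```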

Architecture

  • Cloudflare Workers: Serverless hosting (free tier: 100k requests/day)
  • Durable Objects: OAuth state management
  • KV Storage: Token persistence
  • Google OAuth: Secure authentication
  • MCP Protocol: Standard tool interface for AI assistants

Development

Run locally:

```bash
npm run dev
```

The server will be available at `http://localhost:8788`

Test with MCP Inspector:

```bash
npx @modelcontextprotocol/inspector
```

License

MIT

Acknowledgments

  • Surfline for providing accessible surf forecast data
  • Cloudflare for the MCP and OAuth libraries
  • The surf community for documenting the API endpoints
