# Surfline MCP Server
A Model Context Protocol (MCP) server that provides comprehensive surf forecasts from Surfline's API. Access detailed surf conditions, swell analysis, forecaster insights, tides, and more directly through Claude or any MCP-compatible client.
## Features

### 🌊 Comprehensive Surf Data

- Current conditions for 11 Santa Cruz spots (easily extensible to other regions)
- Detailed swell breakdown (height, period, direction, and power for each swell component)
- 8-hour hourly forecasts showing how conditions evolve
- Expert forecaster observations with AM/PM-specific timing advice
- Wind conditions (speed, direction, offshore/onshore classification; see the sketch after this list)
- Quality ratings (1-5 stars)
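The offshore/onshore label comes down to comparing the wind's compass bearing with the direction the spot faces. The snippet below is a minimal sketch of one way to make that call; the function name, facing convention, and 45°/135° thresholds are illustrative assumptions, not the server's actual logic:

```typescript
// Sketch: classify wind relative to the direction a spot faces.
// `windFrom` and `facing` are compass bearings in degrees; wind
// direction is reported as where the wind blows FROM.
function classifyWind(windFrom: number, facing: number): "offshore" | "onshore" | "cross-shore" {
  // Offshore wind blows from land out to sea, i.e. it comes FROM
  // roughly the opposite bearing to the one the spot faces.
  const offshoreFrom = (facing + 180) % 360;
  let delta = Math.abs(windFrom - offshoreFrom);
  if (delta > 180) delta = 360 - delta; // take the shorter way around the compass
  if (delta <= 45) return "offshore";
  if (delta >= 135) return "onshore";
  return "cross-shore";
}

// e.g. a south-facing break (facing 180) with wind from the north (0):
// classifyWind(0, 180) === "offshore"
```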
### 🌅 Timing Information
- Sunrise, sunset, dawn, and dusk times
- Tide schedule with high/low times and heights
- All times properly converted to Pacific timezone
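Surfline timestamps arrive as Unix epochs, so the Pacific conversion can lean on the platform's built-in `Intl` API rather than a timezone library. A minimal sketch, assuming epoch seconds as input:

```typescript
// Sketch: format a Unix timestamp (seconds) as a Pacific-time clock
// reading. Intl picks PST vs. PDT automatically from the IANA zone name.
function toPacificTime(epochSeconds: number): string {
  return new Intl.DateTimeFormat("en-US", {
    timeZone: "America/Los_Angeles",
    hour: "numeric",
    minute: "2-digit",
  }).format(new Date(epochSeconds * 1000));
}

// toPacificTime(1700000000) === "2:13 PM" (Nov 14, 2023, PST)
```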
### 🔐 Secure Authentication
- Google OAuth integration for secure access
- Works seamlessly with claude.ai web and mobile
- No Surfline API keys required (uses public endpoints)
## Quick Start

### Prerequisites
- Node.js 18+
- A Cloudflare account (free tier works)
- A Google Cloud project for OAuth (free)
### Installation

1. Clone the repository and install dependencies:

   ```bash
   cd surfline-mcp-server
   npm install
   ```

2. Set up Google OAuth:
   - Go to the Google Cloud Console
   - Create a new OAuth 2.0 Client ID (Web application type)
   - Add authorized redirect URIs:
     - `https://your-worker-name.your-subdomain.workers.dev/callback`
     - `https://claude.ai/api/mcp/auth_callback`
   - Note your Client ID and Client Secret

3. Create a KV namespace:

   ```bash
   npx wrangler kv namespace create OAUTH_KV
   ```

   Update `wrangler.jsonc` with the returned KV ID (see the example binding after these steps).

4. Set secrets:

   ```bash
   echo 'YOUR_GOOGLE_CLIENT_ID' | npx wrangler secret put GOOGLE_CLIENT_ID
   echo 'YOUR_GOOGLE_CLIENT_SECRET' | npx wrangler secret put GOOGLE_CLIENT_SECRET
   openssl rand -hex 32 | npx wrangler secret put COOKIE_ENCRYPTION_KEY
   ```

5. Deploy:

   ```bash
   npm run deploy
   ```
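For reference, a KV binding in `wrangler.jsonc` typically looks like the snippet below; the binding name must match what the code expects (`OAUTH_KV`, the namespace created in step 3), and the id is whatever `wrangler` printed:

```jsonc
{
  // ...rest of your existing configuration...
  "kv_namespaces": [
    {
      "binding": "OAUTH_KV",
      "id": "<id printed by the create command>"
    }
  ]
}
```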
### Connect to Claude

1. Go to claude.ai
2. Navigate to Settings → Integrations
3. Add your deployed worker URL: `https://your-worker-name.your-subdomain.workers.dev/mcp`
4. Authenticate with Google
5. Ask Claude: "How's the surf in Santa Cruz?"
## Available Tools

### `get_complete_surf_report`

The primary tool. It returns everything in one call:
- Forecaster notes with expert observations
- Sunrise/sunset times
- Tide schedule
- Current conditions for all spots
- Swell breakdown
- 8-hour forecasts
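For orientation, this is roughly how a tool of this shape is registered with the MCP TypeScript SDK. The sketch below is generic, not the server's actual implementation; the parameter name and response assembly are assumptions:

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

const server = new McpServer({ name: "surfline", version: "1.0.0" });

// Sketch: the real tool assembles forecaster notes, sun times, tides,
// swell breakdowns, and 8-hour forecasts into one text response.
server.tool(
  "get_complete_surf_report",
  { spot: z.string().optional().describe("Optional spot name to filter on") },
  async ({ spot }) => {
    const report = `Surf report${spot ? ` for ${spot}` : ""}: ...`; // fetch + format here
    return { content: [{ type: "text", text: report }] };
  }
);
```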
### Secondary Tools

Individual data fetchers are available if you only need a specific slice of the data:
- `get_surf_forecast` - Basic spot conditions only
- `get_forecaster_notes` - Human observations only
- `get_tides` - Tide information only
- `get_best_spot` - Ranked recommendations
## Spots Covered

- **North County:** Davenport, Waddell Creek, Four Mile, Three Mile
- **Central:** Steamer Lane, Cowells, 26th Ave
- **East Side:** Pleasure Point, Jack's, The Hook
- **South:** Capitola
## Data Source

This server uses Surfline's undocumented public API endpoints, the same ones their website uses. No API keys or authentication are required for basic forecast data. The endpoints have been stable for years and are widely used by the surf community.

**Important:** Webcams and premium features are not available through these endpoints.
## Extending to Other Regions

To add more spots, edit `src/index.ts` and add entries to the `SANTA_CRUZ_SPOTS` object:

```typescript
const SANTA_CRUZ_SPOTS: Record<string, string> = {
  "Your Spot Name": "spotIdFromSurfline",
  // ...
};
```
Find spot IDs by inspecting network requests on surfline.com.
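Once you have a candidate ID, you can sanity-check it against the public endpoint before wiring it in. The URL below is the wave endpoint commonly documented by the community; treat the exact path and parameters as assumptions, since the API is unofficial:

```bash
# Replace SPOT_ID with the ID captured from the network inspector.
# A valid ID returns JSON wave data; an invalid one returns an error.
curl "https://services.surfline.com/kbyg/spots/forecasts/wave?spotId=SPOT_ID&days=1&intervalHours=1"
```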
## Architecture

- **Cloudflare Workers:** serverless hosting (free tier: 100k requests/day)
- **Durable Objects:** OAuth state management
- **KV Storage:** token persistence
- **Google OAuth:** secure authentication
- **MCP Protocol:** standard tool interface for AI assistants
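How the pieces connect, as a simplified sketch: a single Worker fetch handler fronts everything, with the OAuth routes and the MCP route dispatched to the respective libraries. The route names and binding below mirror this README's setup but are illustrative of shape only, not the code in `src/index.ts`:

```typescript
// Simplified request flow; the real server delegates these routes to
// Cloudflare's OAuth provider and MCP libraries rather than handling
// them inline.
export default {
  async fetch(request: Request, env: { OAUTH_KV: KVNamespace }): Promise<Response> {
    const { pathname } = new URL(request.url);
    switch (pathname) {
      case "/authorize": // begin the Google OAuth flow
      case "/callback":  // exchange the auth code; persist tokens in env.OAUTH_KV
      case "/mcp":       // authenticated MCP traffic from Claude
        return new Response(`(${pathname} handled by the OAuth/MCP libraries)`);
      default:
        return new Response("Not found", { status: 404 });
    }
  },
};
```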
## Development

Run locally:

```bash
npm run dev
```

The server will be available at `http://localhost:8788`.

Test with the MCP Inspector:

```bash
npx @modelcontextprotocol/inspector
```
## License
MIT
## Acknowledgments
- Surfline for providing accessible surf forecast data
- Cloudflare for the MCP and OAuth libraries
- The surf community for documenting the API endpoints