NTFY MCP Server
Enables bidirectional communication between AI agents and users through ntfy.sh push notifications, allowing agents to send messages and wait for user responses in asynchronous chat workflows.
An MCP (Model Context Protocol) server that enables AI agents to send and receive messages through ntfy.sh with real-time subscriptions. Perfect for building AI agents that can communicate via push notifications and enabling bidirectional chat workflows.
Intended Use Case
This server was designed to enable bidirectional communication between you and AI agents through a shared ntfy topic. The typical workflow is:
- You send a message to the ntfy topic (via the ntfy.sh web interface, mobile app, or API)
- The AI agent receives it through the wait-and-read-inbox tool or the ntfy://inbox resource
- The AI agent responds using the send-ntfy tool
- You receive the response as a push notification on your device
- The cycle continues - you can reply, and the AI will wait for your next message
This creates an asynchronous chat interface where you can communicate with AI agents at your own pace, receiving push notifications when they respond, and they can wait for your replies even if you take hours or days to respond.
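The first step - sending a message to the shared topic - is a plain HTTP POST to the ntfy server, so it can be scripted as well as done from the app. The sketch below builds such a request; "my-agent-chat" is a placeholder topic name, and the actual `fetch` call is left commented out so nothing is sent.

```javascript
// Build (but don't send) an ntfy publish request. ntfy reads message
// metadata from HTTP headers and the message body from the request body.
function buildPublishRequest(topic, message, { title, priority, tags } = {}) {
  const headers = {};
  if (title) headers["Title"] = title;
  if (priority) headers["Priority"] = String(priority); // 1 (min) .. 5 (max)
  if (tags && tags.length) headers["Tags"] = tags.join(",");
  return {
    url: `https://ntfy.sh/${topic}`,
    options: { method: "POST", headers, body: message },
  };
}

const { url, options } = buildPublishRequest("my-agent-chat", "Hello agent!", {
  title: "From the user",
  priority: 3,
});
// To actually publish: await fetch(url, options);
console.log(url);
```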
Important for Chat Workflows: The MCP protocol has a client-side timeout of approximately 60 seconds. When using wait-and-read-inbox for chat, you may need to:
- Tell the AI to retry the wait-and-read-inbox call if it times out without receiving a message
- Configure the AI to automatically retry after timeouts when waiting for your response
- Use a prompt that instructs the AI to keep waiting until it gets a response, retrying as needed
The AI agent can keep retrying wait-and-read-inbox indefinitely until it receives your message, making this suitable for long-running conversations where responses may take hours or days.
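The retry pattern described above can be sketched as a loop. Note that `callTool` here is a hypothetical helper standing in for however your agent invokes MCP tools, and the `TIMEOUT` error code is an assumption - adapt the error check to your client.

```javascript
// Keep calling wait-and-read-inbox until a message arrives, treating the
// MCP client's ~60s timeout as "no message yet" rather than a failure.
async function waitForUserMessage(callTool, maxAttempts = Infinity) {
  for (let attempt = 1; attempt <= maxAttempts; attempt += 1) {
    try {
      // First attempt: only messages sent from now on. Retries use
      // sinceNow: false so a message landing between attempts isn't missed.
      return await callTool("wait-and-read-inbox", { sinceNow: attempt === 1 });
    } catch (err) {
      if (err && err.code === "TIMEOUT") continue; // no message yet: try again
      throw err; // a real error: surface it
    }
  }
  return null; // gave up after maxAttempts
}
```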
Features
- Send Messages: Publish messages to ntfy topics with optional title, priority, tags, and attachments
- Real-time Subscriptions: Maintains persistent connections to receive messages instantly
- Topic Management: Change topics on the fly without restarting
- Message Caching: Keeps recent messages in memory and on disk
- Authentication: Supports bearer tokens and basic auth for protected topics
- Zero Configuration: Works out of the box with public ntfy.sh
Installation
Via npm (for MCP clients)
npm install -g nfty-mcp-server
Via npx (no installation needed)
npx nfty-mcp-server
Quick Start
1. Get a topic
Visit ntfy.sh and create a topic (or use an existing one). Topics are public by default, so choose a unique name.
2. Configure your MCP client
For Cursor/VS Code
Add to your MCP settings (typically ~/.cursor/mcp.json or C:\Users\<user>\.cursor\mcp.json):
{
"mcpServers": {
"nfty": {
"command": "npx",
"args": ["-y", "--yes", "nfty-mcp-server"],
"env": {
"NTFY_TOPIC": "your-topic-name",
"NTFY_BASE_URL": "https://ntfy.sh"
}
}
}
}
Note: NTFY_TOPIC is required. Set it in the env section of mcp.json.
Important: The --yes flag ensures npx installs the package in its cache directory instead of your project directory, preventing dependencies from being installed in your project's node_modules.
For Claude Desktop
Add to claude_desktop_config.json:
{
"mcpServers": {
"nfty": {
"command": "npx",
"args": ["-y", "--yes", "nfty-mcp-server"],
"env": {
"NTFY_TOPIC": "your-topic-name",
"NTFY_BASE_URL": "https://ntfy.sh"
}
}
}
}
Note: NTFY_TOPIC is required. Set it in the env section of claude_desktop_config.json.
Important: The --yes flag ensures npx installs the package in its cache directory instead of your project directory, preventing dependencies from being installed in your project's node_modules.
3. Restart your MCP client
Restart Cursor, VS Code, or Claude Desktop to load the MCP server.
Usage
Available Tools
send-ntfy
Publish a message to the configured ntfy topic (set in mcp.json).
Parameters:
- message (required): The message body
- title (optional): Message title
- priority (optional): Priority level 1-5 (1=min, 3=default, 5=max)
- tags (optional): Array of tags/emojis
- attachUrl (optional): URL to attach
Example:
{
"message": "Hello from AI agent!",
"title": "AI Notification",
"priority": 4,
"tags": ["robot", "ai"]
}
set-ntfy-topic
Change the ntfy topic for this session (no restart needed).
Parameters:
- topic (required): New topic name
- baseUrl (optional): New base URL
wait-and-read-inbox
Wait for new messages on the configured topic (set in mcp.json) and return when a new message arrives. Does not return until at least one new message is received. Uses the existing subscription.
Note: The MCP protocol has a ~60s client-side timeout that cannot be controlled from the server, but this tool will wait as long as possible within that limit.
Parameters:
- since (optional): Cursor to filter messages after this point
- sinceTime (optional): Unix timestamp - filter messages with time >= sinceTime
- sinceNow (optional, default: true): If true (default), only returns messages sent after this call starts. If false, returns all messages since the cursor.
Timeout Behavior for Chat Workflows:
When using this for bidirectional chat, the tool may timeout after ~60 seconds if no message arrives. To handle this:
- Configure the AI to retry: Tell the AI agent to automatically retry wait-and-read-inbox when it times out while waiting for your response
- Use a prompt: Create a prompt that instructs the AI to "keep waiting for a response, retrying wait-and-read-inbox if it times out"
- Manual retry: You can manually ask the AI to try again if it times out
The AI can keep retrying indefinitely until it receives your message, making this suitable for long-running conversations.
Example:
{
"sinceNow": true
}
Available Resources
ntfy://inbox
Read recent messages for the configured topic. Returns JSON:
{
"topic": "your-topic",
"baseUrl": "https://ntfy.sh",
"messages": [
{
"id": "message-id",
"time": 1234567890,
"title": "Message Title",
"message": "Message body",
"priority": 3,
"tags": ["tag1"],
"topic": "your-topic"
}
]
}
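A client reading this resource typically wants the most recent message. The sketch below pulls the newest entry out of a payload shaped like the JSON above; `sample` is a stand-in for the text an MCP client would read from ntfy://inbox.

```javascript
// Return the newest message from an ntfy://inbox payload, or null if empty.
function latestMessage(inboxJson) {
  const inbox = JSON.parse(inboxJson);
  if (!inbox.messages || inbox.messages.length === 0) return null;
  // Messages carry a Unix `time` field; sort newest-first and take the head.
  return [...inbox.messages].sort((a, b) => b.time - a.time)[0];
}

const sample = JSON.stringify({
  topic: "your-topic",
  baseUrl: "https://ntfy.sh",
  messages: [
    { id: "a", time: 100, message: "first" },
    { id: "b", time: 200, message: "second" },
  ],
});
console.log(latestMessage(sample).message); // → "second"
```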
Configuration
Data Storage
The server stores its data files (logs, message cache, lock files) in a dedicated directory to avoid cluttering your project:
- Default location: ~/.nfty-mcp-server/ (or C:\Users\<user>\.nfty-mcp-server\ on Windows)
- Custom location: Set the NTFY_DATA_DIR environment variable to use a different directory
Files stored:
- nfty-messages.json - Cached messages
- nfty-debug.log - Debug logs
- nfty-process.log - Process management logs
- nfty.lock - Lock file to prevent multiple instances
Note: The server will automatically create this directory if it doesn't exist. Files are never created in your project root or in node_modules.
Environment Variables
| Variable | Description | Default |
|---|---|---|
| NTFY_TOPIC | Topic to send/receive messages (required) | (required) |
| NTFY_BASE_URL | ntfy server URL | https://ntfy.sh |
| NTFY_AUTH_TOKEN | Bearer token for protected topics | (optional) |
| NTFY_USERNAME | Username for basic auth | (optional) |
| NTFY_PASSWORD | Password for basic auth | (optional) |
| NTFY_SINCE | Initial backlog cursor | 1h |
| NTFY_FETCH_TIMEOUT_MS | Fetch timeout in milliseconds | 10000 |
| NTFY_CLEAN_ON_STARTUP | Clear logs/cache on startup | true |
| NTFY_KILL_EXISTING | Kill existing server instances | true |
| NTFY_DATA_DIR | Directory for data files (logs, cache, lock) | ~/.nfty-mcp-server/ |
| NTFY_CACHE_FILE | Custom path for message cache file | {NTFY_DATA_DIR}/nfty-messages.json |
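For a protected topic on a self-hosted ntfy server, these variables go in the same env block as the basic config. The values below (server URL, token, data path) are placeholders:

```json
{
  "mcpServers": {
    "nfty": {
      "command": "npx",
      "args": ["-y", "--yes", "nfty-mcp-server"],
      "env": {
        "NTFY_TOPIC": "your-topic-name",
        "NTFY_BASE_URL": "https://ntfy.example.com",
        "NTFY_AUTH_TOKEN": "tk_your_token_here",
        "NTFY_DATA_DIR": "/path/to/data"
      }
    }
  }
}
```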
CLI Arguments
You can also pass configuration via CLI arguments:
npx nfty-mcp-server --topic my-topic --base-url https://ntfy.sh --auth-token your-token
Available arguments:
- --topic: Topic name
- --base-url or --server: Base URL
- --auth-token: Bearer token
- --username: Username for basic auth
- --password: Password for basic auth
- --since: Initial backlog cursor
- --log-incoming: Log all incoming messages
How It Works
- Subscription: When the server starts, it automatically creates a persistent HTTP connection to the ntfy topic
- Real-time Delivery: Messages arrive in real-time through the open connection
- Message Caching: Recent messages (up to 50) are kept in memory and persisted to disk
- No Polling: The connection stays open indefinitely - no need to poll for messages
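For context on the subscription step: ntfy's JSON stream endpoint (GET https://ntfy.sh/&lt;topic&gt;/json) emits one JSON object per line, where only "message" events carry user content and "open"/"keepalive" are control events. The sketch below parses such lines offline; it illustrates the mechanics rather than this package's exact implementation.

```javascript
// Parse one line of an ntfy JSON subscription stream, keeping only
// "message" events and dropping open/keepalive control events.
function parseStreamLine(line) {
  const trimmed = line.trim();
  if (!trimmed) return null; // blank keepalive line
  const event = JSON.parse(trimmed);
  return event.event === "message" ? event : null;
}

// Example lines as they appear on the wire:
const lines = [
  '{"event":"open","topic":"your-topic"}',
  '{"event":"keepalive"}',
  '{"id":"abc","time":1700000000,"event":"message","topic":"your-topic","message":"hi"}',
];
const messages = lines.map(parseStreamLine).filter(Boolean);
console.log(messages.length); // → 1
```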
Use Cases
- š¬ Bidirectional Chat: Chat with AI agents asynchronously via push notifications - send messages when convenient, receive responses as notifications
- š¤ AI Agent Communication: Enable AI agents to send and receive notifications
- š± Push Notifications: Send push notifications from AI workflows
- š Alert Systems: Create alert systems that AI agents can interact with
- š¬ Message Queues: Use as a simple message queue for AI agent coordination
- š Long-running Workflows: Enable workflows where the AI waits for human input that may take hours or days
Examples
Example: AI Agent Sends Notification
// AI agent uses the send-ntfy tool (topic is configured in mcp.json)
{
"message": "Task completed successfully!",
"title": "Task Status",
"priority": 4
}
Example: Chat Workflow
// AI agent sends a question (topic is configured in mcp.json)
send-ntfy({
message: "What is 2+2?",
title: "Math Question"
})
// Then waits for response (uses topic from mcp.json)
// Note: If this times out (~60s), the AI should retry until it gets a response
wait-and-read-inbox({
sinceNow: true
})
For reliable chat workflows, configure your AI with a prompt like:
"When waiting for a user response via wait-and-read-inbox, if the call times out without receiving a message, automatically retry the wait-and-read-inbox call. Keep retrying until you receive a response from the user."
This ensures the AI will continue waiting for your reply even if individual calls timeout.
Development
Local Development
# Clone the repository
git clone https://github.com/harshwasan/NFTY-MCP.git
cd NFTY-MCP
# Install dependencies
npm install
# Run tests
npm test
# Run in development mode
npm run dev
Project Structure
NFTY-MCP/
├── src/
│   └── server.js        # Main MCP server implementation
├── tests/
│   └── server.test.js   # Test suite
├── package.json
└── README.md
Troubleshooting
Messages not arriving
- Check that NTFY_TOPIC is set correctly
- Verify the topic exists on ntfy.sh
- Check the debug log at ~/.nfty-mcp-server/nfty-debug.log (or under your NTFY_DATA_DIR)
- Ensure the subscription is running (check logs)
Connection issues
- Verify network connectivity to ntfy.sh
- If using a custom NTFY_BASE_URL, check that it is accessible
- Review authentication settings if using protected topics
Rate limiting
- The server automatically handles rate limiting with backoff
- Check NTFY_HYDRATE_BACKOFF_MS if you need to adjust backoff timing
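To illustrate what adjusting the backoff base means: a typical exponential-backoff schedule doubles the delay per attempt up to a cap. The doubling and 60s cap below are assumptions for illustration, not this package's exact policy; only the base delay corresponds to NTFY_HYDRATE_BACKOFF_MS.

```javascript
// Illustrative exponential backoff: delay doubles each attempt, capped.
// baseMs plays the role of NTFY_HYDRATE_BACKOFF_MS (assumed semantics).
function backoffDelay(attempt, baseMs = 1000, maxMs = 60000) {
  return Math.min(baseMs * 2 ** attempt, maxMs);
}

console.log(backoffDelay(0)); // 1000
console.log(backoffDelay(3)); // 8000
console.log(backoffDelay(10)); // capped at 60000
```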
License
MIT
Contributing
Contributions welcome! Please open an issue or submit a pull request.
Note
This is a hobby personal project I made using AI and vibe coding - built organically through experimentation and iteration with AI assistance.
Why Runkit Shows "Unavailable"
Runkit may show this package as unavailable because:
- CLI Tool: This is primarily a CLI tool designed to run as an MCP server, not a library with exportable functions
- No Default Export: The package doesn't export functions that can be easily imported and used in Runkit's sandbox environment
- MCP Protocol: It's designed to communicate via the MCP protocol with MCP clients (like Cursor, Claude Desktop), not to be executed directly in a browser-like environment
This is expected behavior - the package is intended to be used as an MCP server, not as a runnable script in Runkit. Use it via npx or install it globally as described in the Installation section.