SendPulse Chatbots MCP Server

Enables interaction with SendPulse Chatbots API to manage chatbot accounts, retrieve bot information, and access dialog conversations. Supports flexible authentication through API credentials or OAuth tokens for comprehensive chatbot management through natural language.


MCP Server for SendPulse Chatbots

This project is an implementation of a Model Context Protocol (MCP) server designed to work with the SendPulse Chatbots API. It allows Large Language Models (LLMs) like those from OpenAI to interact with the SendPulse API through a standardized set of tools.

This server is built with TypeScript and runs on Node.js using the Express framework.

Features

The server exposes two groups of tools to the LLM: global tools and universal, channel-specific tools.

Global Tools

These tools provide general, account-wide information.

  • get_account_info: Returns information about the current SendPulse account, including pricing plan, message counts, bots, contacts, etc.
  • get_bots_list: Returns a list of all connected chatbots with details for each.
  • get_dialogs: Returns a list of dialogs from all channels, with support for pagination and sorting.
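As an illustration, a JSON-RPC 2.0 tools/call request for get_dialogs might look like the sketch below. The pagination and sorting argument names (size, skip, order) are assumptions for illustration only; check the server's tool schema for the actual names.

```typescript
// Hypothetical JSON-RPC 2.0 payload for calling the get_dialogs tool.
// The argument names (size, skip, order) are assumed, not confirmed.
const getDialogsRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "get_dialogs",
    arguments: {
      size: 20,      // page size (assumed parameter name)
      skip: 0,       // offset for pagination (assumed parameter name)
      order: "desc", // sort direction (assumed parameter name)
    },
  },
};
```

An LLM client would POST this body to the server's /mcp endpoint.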

Universal Tools

These tools perform actions on specific channels. They require a channel parameter to be specified.

  • send_message: Sends a text message to a contact.
    • channel: The channel to use. Supported values: whatsapp, telegram, instagram, messenger, livechat, viber.
    • contact_id: The ID of the recipient.
    • text: The message content.
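Using the documented parameters (channel, contact_id, text), a tools/call payload for send_message could look like this sketch; the contact ID is a placeholder:

```typescript
// JSON-RPC 2.0 payload for the send_message tool, built from the
// documented channel, contact_id, and text parameters.
const sendMessageRequest = {
  jsonrpc: "2.0",
  id: 2,
  method: "tools/call",
  params: {
    name: "send_message",
    arguments: {
      channel: "telegram",      // one of the supported channel values
      contact_id: "CONTACT_ID", // placeholder for a real recipient ID
      text: "Hello from MCP!",
    },
  },
};
```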

Authentication

The server supports two flexible methods for authenticating requests to the SendPulse API, which are handled on a per-session basis.

Method 1: API ID & Secret (Recommended)

The client can provide SendPulse API credentials by sending two custom HTTP headers:

  • x-sp-id: Your SendPulse API ID.
  • x-sp-secret: Your SendPulse API Secret.

Upon receiving these headers, the MCP server will automatically perform the OAuth 2.0 client_credentials flow to obtain a temporary access token from SendPulse. These tokens are cached in memory to improve performance for subsequent requests from the same user (same API ID).
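A minimal sketch of what this exchange-and-cache logic could look like is shown below. The response field names (access_token, expires_in) follow standard OAuth 2.0 conventions, and the injectable fetcher is an assumption for illustration; the server's actual implementation may differ.

```typescript
// Sketch of a client_credentials exchange with an in-memory token cache,
// keyed by API ID. Response field names follow OAuth 2.0 conventions and
// are assumptions, not taken from the server's actual source.
type CachedToken = { token: string; expiresAt: number };

const tokenCache = new Map<string, CachedToken>();

async function getAccessToken(
  apiId: string,
  apiSecret: string,
  // Injectable fetcher so the caching logic can be shown without network I/O.
  fetchToken: (id: string, secret: string) => Promise<{ access_token: string; expires_in: number }>,
): Promise<string> {
  const cached = tokenCache.get(apiId);
  if (cached && cached.expiresAt > Date.now()) {
    return cached.token; // reuse the cached token for this API ID
  }
  const res = await fetchToken(apiId, apiSecret);
  tokenCache.set(apiId, {
    token: res.access_token,
    // Treat the token as expired one minute early to avoid edge-of-expiry use.
    expiresAt: Date.now() + (res.expires_in - 60) * 1000,
  });
  return res.access_token;
}
```

In the real server, the fetcher would POST a grant_type=client_credentials request to SendPulse's OAuth endpoint.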

Method 2: Direct OAuth Token

The client can provide a pre-existing, valid SendPulse OAuth 2.0 token directly. This is supported in two ways:

  1. Via Authorization Header (Standard):
    • Authorization: Bearer <your_oauth_token>
  2. Via MCP initialize Request Body (Legacy/Compatibility):
    • As part of the MCP JSON configuration.
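For the standard header variant, the request headers could be assembled as in this sketch (the token value is a placeholder):

```typescript
// Example request headers for Method 2: passing a pre-existing SendPulse
// OAuth token via the standard Authorization header.
const oauthToken = "YOUR_SENDPULSE_OAUTH_TOKEN"; // placeholder, not a real token
const authHeaders = {
  "Content-Type": "application/json",
  Authorization: `Bearer ${oauthToken}`,
};
```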

Getting Started

Prerequisites

  • Node.js (v18 or later recommended)
  • npm (usually comes with Node.js)

Installation

  1. Clone the repository (if applicable).
  2. Install the project dependencies:
    npm install
    

Build

To build the project, run the following command:

npm run build

Running the Server

Once the project is built, you can start the server:

npm start

You should see a confirmation message in your console: SendPulse MCP HTTP Server running on http://localhost:3000/mcp
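To smoke-test the running server, you could POST a minimal MCP initialize request to http://localhost:3000/mcp. The payload below is a sketch; the protocolVersion value and clientInfo fields are assumptions based on the MCP specification, not taken from this server's code.

```typescript
// Minimal MCP initialize request for a smoke test against the local server.
// protocolVersion and clientInfo values are illustrative assumptions.
const initializeRequest = {
  jsonrpc: "2.0",
  id: 0,
  method: "initialize",
  params: {
    protocolVersion: "2024-11-05",
    capabilities: {},
    clientInfo: { name: "smoke-test", version: "0.0.1" },
  },
};
```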

Exposing the Local Server with ngrok (for Testing)

To make your local server accessible to services like the OpenAI sandbox, you need to expose it to the internet. You can use ngrok for this purpose. Open a new terminal window and run:

ngrok http 3000

Ngrok will provide you with a public https:// URL (e.g., https://random-string.ngrok-free.app). Use this URL (https://random-string.ngrok-free.app/mcp) when configuring the MCP tool in your LLM client.

Note: To bypass the ngrok browser warning page, you may need to configure your LLM client to send an additional header with every request, for example: ngrok-skip-browser-warning: "true".
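If your client lets you set custom headers, the extra header could be added alongside the usual ones, as in this sketch:

```typescript
// Request headers including the extra header that bypasses the
// ngrok browser interstitial page.
const ngrokHeaders = {
  "Content-Type": "application/json",
  "ngrok-skip-browser-warning": "true", // any value works; presence is what matters
};
```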
