
Taboola API MCP Server
A flexible MCP (Model Context Protocol) server that enables users to fetch recommendations from the Taboola API using publisher credentials. Supports both local (STDIO) and remote (HTTP) deployment modes.
Setup
- Install dependencies:
pip install -r requirements.txt
- Activate virtual environment (if using one):
source .venv/bin/activate
Deployment Options
Local Mode (STDIO Transport)
Perfect for local development and testing with MCP Inspector:
# Default mode - runs locally with STDIO transport
python server.py
# Explicitly specify local mode
python server.py --mode local
Remote Mode (HTTP Server)
Deploy as a remote HTTP server accessible over the network:
# Run as HTTP server on default port 8000
python server.py --mode remote
# Specify custom host and port
python server.py --mode remote --host 0.0.0.0 --port 3000
# Using environment variables
export MCP_MODE=remote
export MCP_HOST=0.0.0.0
export MCP_PORT=8000
python server.py
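Under the hood, remote mode typically just switches the FastMCP transport. The following is a minimal sketch assuming the FastMCP class from the official MCP Python SDK; the actual server.py may wire this up differently (for example, using the SSE transport instead):
# transport_sketch.py - illustrative only, not the project's actual entry point
from mcp.server.fastmcp import FastMCP

def run_server(mode: str = "local", host: str = "0.0.0.0", port: int = 8000) -> None:
    # host and port only matter in remote mode
    mcp = FastMCP("Taboola API MCP Server", host=host, port=port)

    if mode == "remote":
        # HTTP transport; swap in "sse" if your deployment uses SSE instead
        mcp.run(transport="streamable-http")
    else:
        # Default: STDIO transport for local development and MCP Inspector
        mcp.run()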
Configuration Options
Command Line Arguments
- --mode: Server mode (local or remote) - default: local
- --host: Host to bind to in remote mode - default: 0.0.0.0
- --port: Port to bind to in remote mode - default: 8000
Environment Variables
- MCP_MODE: Server mode (local or remote)
- MCP_HOST: Host to bind to in remote mode
- MCP_PORT: Port to bind to in remote mode
Environment variables override command line arguments.
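In practice that precedence can be implemented roughly as follows (a minimal sketch assuming argparse and the option names documented above; the real server.py may differ in detail):
# config_sketch.py - illustrative only
import argparse
import os

def resolve_config() -> tuple[str, str, int]:
    parser = argparse.ArgumentParser(description="Taboola API MCP Server")
    parser.add_argument("--mode", choices=["local", "remote"], default="local")
    parser.add_argument("--host", default="0.0.0.0")
    parser.add_argument("--port", type=int, default=8000)
    args = parser.parse_args()

    # Environment variables win over command line arguments, as documented above.
    mode = os.environ.get("MCP_MODE", args.mode)
    host = os.environ.get("MCP_HOST", args.host)
    port = int(os.environ.get("MCP_PORT", args.port))
    return mode, host, port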
Functions
fetchRecommendations
Fetches recommendations for a given publisher from the Taboola API, authenticating with the publisher's API key.
Parameters:
- publisher_name (str): The name of the publisher
- api_key (str): The API key for authentication
Returns:
- str: JSON recommendations data from the Taboola API
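For reference, this kind of tool is typically registered with a FastMCP decorator. The sketch below is illustrative only, not the project's actual implementation; the Taboola endpoint and query parameters are assumptions modeled on Taboola's public recommendations API and should be verified against the Taboola docs for your account:
# tool_sketch.py - hedged illustration of the fetchRecommendations tool
import requests
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Taboola API MCP Server")

@mcp.tool()
def fetchRecommendations(publisher_name: str, api_key: str) -> str:
    """Fetch recommendations for a publisher from the Taboola API."""
    # Assumed endpoint and parameters - confirm against Taboola's API documentation.
    url = f"https://api.taboola.com/1.2/json/{publisher_name}/recommendations.get"
    params = {
        "app.type": "desktop",
        "app.apikey": api_key,
        "source.type": "text",
        "source.id": "demo-source",
        "source.url": "https://example.com",
    }
    response = requests.get(url, params=params, timeout=30)
    response.raise_for_status()
    return response.text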
Usage Examples
Local Development with MCP Inspector
# Start server locally
python server.py
# In another terminal, run MCP Inspector
npx @modelcontextprotocol/inspector python server.py
Remote Deployment
# Deploy as remote server
python server.py --mode remote --port 8000
# Server will be available at: http://your-server-ip:8000
# Connect using HTTP transport with MCP clients
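One way to do that from Python (a hedged sketch assuming the official MCP Python SDK and that remote mode serves the streamable HTTP transport at /mcp; if your deployment uses SSE, switch to the SSE client and URL accordingly):
# client_sketch.py - illustrative remote client
import asyncio
from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def main() -> None:
    # Replace with your server's address; /mcp is FastMCP's default HTTP path.
    async with streamablehttp_client("http://your-server-ip:8000/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "fetchRecommendations",
                {"publisher_name": "your-publisher", "api_key": "your-api-key"},
            )
            print(result.content)

asyncio.run(main())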
Production Deployment
For production, consider using environment variables:
export MCP_MODE=remote
export MCP_HOST=0.0.0.0
export MCP_PORT=8000
python server.py
Or with a process manager like PM2:
pm2 start server.py --name "taboola-mcp" -- --mode remote --port 8000
Testing
Use the provided test script to verify functionality:
# Edit test_function.py with your credentials
python test_function.py
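For a quick smoke test that needs no credentials (a hypothetical snippet, separate from test_function.py), you can spawn the server over STDIO with the MCP Python SDK and list its tools:
# stdio_check.py - hypothetical smoke test, not part of the repository
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    params = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())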
Cloud Deployment
Render Deployment
Deploy easily on the Render cloud platform:
Option 1: Using render.yaml (Recommended)
- Push your code to GitHub/GitLab
- Connect to Render:
  - Go to the Render Dashboard
  - Click "New" > "Blueprint"
  - Connect your repository
  - The render.yaml file will be automatically detected
- Deploy:
  - Render will automatically build and deploy your MCP server
  - Your server will be available at: https://your-app-name.onrender.com
Option 2: Manual Render Setup
- Create a new Web Service on Render
- Connect your repository
- Configure the service:
  - Build Command: pip install -r requirements.txt
  - Start Command: python server.py --mode remote --host 0.0.0.0 --port $PORT
  - Environment Variables:
    MCP_MODE=remote
    MCP_HOST=0.0.0.0
    PYTHON_VERSION=3.13.0
- Deploy and get your URL
Docker Deployment
For any Docker-compatible platform:
# Build and run locally
docker build -t taboola-mcp-server .
docker run -p 8000:8000 taboola-mcp-server
# Or use docker-compose
docker-compose up -d
Other Cloud Platforms
The server is compatible with:
- Heroku: Use a Procfile with web: python server.py --mode remote --port $PORT
- Railway: Deploy directly from GitHub with automatic detection
- DigitalOcean App Platform: Use the provided docker-compose.yml
- AWS/GCP/Azure: Deploy using Docker or direct Python deployment
Security Notes
- In remote mode, the server binds to 0.0.0.0 by default (all interfaces)
- Consider using a reverse proxy (nginx, Apache) for production deployments
- Ensure proper firewall rules are in place for remote access
- API keys are passed as parameters - ensure secure transmission (HTTPS recommended)
- Cloud platforms like Render automatically provide HTTPS endpoints