MemeStack MCP
Hosted Model Context Protocol server for MemeStack — a searchable gallery of AI-tagged memes, infographics, charts, screenshots, and visual explainers, ranked by Lightning zaps. Free, public, no auth, no signup.
https://mcp.memestack.ai/mcp
Every list response returns citation blocks (markdown / HTML / plain) ready to paste with attribution. Image bytes are served from https://api.memestack.ai/v1/images/{id}/{thumbnail|canonical|social-card} — directly embeddable.
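The image URL pattern above is easy to wrap in a small helper. A minimal Python sketch, where the image ID is a placeholder and the variant names are taken from the URL pattern in this README:

```python
# Build a direct image URL from an image ID and a variant name.
# Variant names come from the documented URL pattern; "abc123" below is
# a placeholder ID, not a real image.
VARIANTS = {"thumbnail", "canonical", "social-card"}

def image_url(image_id: str, variant: str = "canonical") -> str:
    if variant not in VARIANTS:
        raise ValueError(f"variant must be one of {sorted(VARIANTS)}")
    return f"https://api.memestack.ai/v1/images/{image_id}/{variant}"

# image_url("abc123", "thumbnail")
# -> "https://api.memestack.ai/v1/images/abc123/thumbnail"
```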
Install
This is a hosted server. No clone, no build, no local environment. Point any MCP-aware client at the endpoint.
Claude Code
claude mcp add --transport http memestack https://mcp.memestack.ai/mcp
Claude Desktop
Edit claude_desktop_config.json (%APPDATA%\Claude\ on Windows, ~/Library/Application Support/Claude/ on macOS):
{
  "mcpServers": {
    "memestack": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://mcp.memestack.ai/mcp"]
    }
  }
}
Recent Claude Desktop versions support direct HTTP MCP servers — if yours does, you can drop the mcp-remote wrapper:
{
  "mcpServers": {
    "memestack": { "url": "https://mcp.memestack.ai/mcp" }
  }
}
Cursor
Edit ~/.cursor/mcp.json (or per-project .cursor/mcp.json):
{
  "mcpServers": {
    "memestack": { "url": "https://mcp.memestack.ai/mcp" }
  }
}
Continue.dev (VS Code / JetBrains)
In ~/.continue/config.json, add under mcpServers:
"memestack": { "url": "https://mcp.memestack.ai/mcp" }
Raw JSON-RPC (any HTTP client)
curl -X POST https://mcp.memestack.ai/mcp \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/list"}'
Same endpoint for initialize, tools/call, prompts/list, prompts/get, resources/list, resources/read. Protocol version: 2025-06-18.
What's exposed
- 18 free tools + 1 enterprise stub — full catalog: docs/tools.md
- 6 prompts — pre-baked workflows (topic search, top zapped, cite a meme, find a meme for a vibe, research meme evolution, trending): docs/prompts.md
- 3 resources — attribution guide, tag taxonomy, recent uploads feed: docs/resources.md
Tools at a glance
Discovery and search:
- search_images — semantic + keyword merged
- search_text_in_image — OCR-only search (find screenshots of specific quotes/text)
- find_meme_for_text — vibe-to-meme matcher for writing & social
- reverse_image_search — phash-based "find this image" (accepts HTTPS or data: URLs)
- find_similar / find_related — neighbors of a known image (visual phash / semantic embedding)
- browse_images, browse_by_tag, browse_by_category, list_categories
- popular_tags, tag_autocomplete, get_tag_profile
- get_image, get_user_profile, get_leaderboard, get_mutation_group
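To pass a local file to reverse_image_search, the image has to be encoded as a data: URL. A minimal encoder sketch; the default MIME type is an assumption, so pass the real type of your file:

```python
import base64

def to_data_url(image_bytes: bytes, mime: str = "image/png") -> str:
    """Encode raw image bytes as a data: URL for reverse_image_search."""
    b64 = base64.b64encode(image_bytes).decode("ascii")
    return f"data:{mime};base64,{b64}"

# with open("meme.png", "rb") as f:
#     url = to_data_url(f.read())
```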
Attribution:
cite_image— canonical markdown/HTML/plain attribution blocks for one or many image IDs
Enterprise stub:
submit_image— reserved for a future agent-tier monetization spec; currently returns a polite redirect
Citations are baked in
Every list response includes a citations_combined block in three formats (markdown / HTML / plain) for the full set, plus a per-image citation on every individual image. Per-source rules differ — OWID images carry CC-BY 4.0 attribution to Our World in Data; Imgflip templates carry Imgflip attribution; direct uploads carry the uploader's display name and the MemeStack page URL.
Read memestack://attribution-guide once per session for the license model and per-source rules. See docs/resources.md.
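Fetching that resource over raw JSON-RPC looks like the sketch below; the `resources/read` method name and the `{"uri": ...}` params shape come from the MCP specification, not MemeStack-specific documentation:

```python
import json

# JSON-RPC body to read the attribution guide once per session.
read_attribution = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "resources/read",
    "params": {"uri": "memestack://attribution-guide"},
}
body = json.dumps(read_attribution)
# POST `body` to https://mcp.memestack.ai/mcp with Content-Type: application/json
```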
Tier model
Currently everything except submit_image is tier: free with no per-call cost. IP-level rate limits apply for abuse protection. reverse_image_search is rate-limited to 10/min/IP since each call hashes the input image.
submit_image is tier: enterprise and reserved for a future agent-tier monetization flow — calling it returns a discoverable error pointing at memestack.ai/mcp/agent-tier.
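Callers that loop over reverse_image_search can stay under the published 10/min limit with a client-side guard. A minimal sliding-window sketch, illustrative rather than part of any MemeStack SDK:

```python
import time
from collections import deque

class SlidingWindowLimiter:
    """Client-side guard for a calls-per-window rate limit."""

    def __init__(self, max_calls: int = 10, window_s: float = 60.0):
        self.max_calls = max_calls
        self.window_s = window_s
        self.calls: deque[float] = deque()  # monotonic timestamps of recent calls

    def acquire(self) -> None:
        """Block until a call is allowed, then record it."""
        now = time.monotonic()
        # Drop timestamps that have aged out of the window.
        while self.calls and now - self.calls[0] >= self.window_s:
            self.calls.popleft()
        if len(self.calls) >= self.max_calls:
            # Sleep until the oldest call leaves the window, then retry.
            time.sleep(self.window_s - (now - self.calls[0]))
            return self.acquire()
        self.calls.append(time.monotonic())

# limiter = SlidingWindowLimiter(max_calls=10, window_s=60.0)
# limiter.acquire()  # call before each reverse_image_search request
```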
Discovery surface
How MemeStack exposes itself to AI agents, beyond MCP:
| Surface | URL | Purpose |
|---|---|---|
| llms.txt | memestack.ai/llms.txt | Concise human-readable summary + endpoint links |
| ai-plugin.json | memestack.ai/.well-known/ai-plugin.json | ChatGPT plugin manifest |
| OpenAPI 3.1 | api.memestack.ai/openapi.json | REST API the MCP wraps |
| Apex MCP mirror | memestack.ai/mcp | Same MCP endpoint at apex (for naive scanners) |
| oEmbed | api.memestack.ai/v1/oembed | Photo-type oEmbed for gallery URLs |
| Sitemap | memestack.ai/sitemap.xml | Sitemap index with image, page, and user sub-sitemaps |
Source
The MCP server runs on Cloudflare Workers and proxies the public MemeStack REST API (api.memestack.ai). The server's source is part of the larger MemeStack codebase, which is not publicly mirrored at this time. This repository hosts the public-facing docs, install snippets, and tool catalog so directory submissions and integrators can link to a stable, browsable surface.
If you want to fork the protocol behavior and self-host a similar gallery, the REST API is documented and provides a sufficient backend for any MCP-style wrapper.
License
MIT — covers the snippets and docs in this repo. The hosted MemeStack service is governed separately by memestack.ai/terms. Image content carries per-source licenses, surfaced in each image's citation block.