
DevBrain
Quickly search indexed indie developer blog posts and articles. It's like chatting with your favorite newsletters (coding, tech, founder): a web search specifically tuned for high-quality, developer-curated content. Use case: implement features faster.
Tools
retrieve_knowledge
Queries DevBrain (aka the `developer's brain` system) and returns relevant information. Args: query: The question to query for knowledge. tags: Optional comma-separated list of tags (keywords) to filter or ground the search (e.g. `ios`, `ios,SwiftUI`, `react-native`, `web`, `web,react`, `fullstack,react-native,flutter`). Do not provide more than 3 tags. Returns: str: Helpful knowledge and context information from DevBrain (articles include a title, a short description, and a URL to the full article to read it later).
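The `tags` argument is a plain comma-separated string capped at three keywords. A small helper can normalize an arbitrary keyword list into that shape (a sketch only; `make_tags` is not part of DevBrain, just an illustration of the expected format):

```python
def make_tags(*keywords: str, limit: int = 3) -> str:
    """Build a DevBrain-style tags string: lowercase, deduplicated,
    comma-separated, and capped at `limit` keywords."""
    seen: list[str] = []
    for kw in keywords:
        kw = kw.strip().lower()
        if kw and kw not in seen:
            seen.append(kw)
    return ",".join(seen[:limit])
```

For example, `make_tags("iOS", "SwiftUI", "ios", "react-native")` collapses the duplicate and yields `"ios,swiftui,react-native"`.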
read_full_article
Returns the full content of an article identified by its URL. Args: url: The URL of the article to read. Returns: str: The full content of the article or an error message.
get_token
Retrieves the stored token. Returns: str: The stored token if available, otherwise "Token not set".
set_token
Sets the token. Args: token (str): The token string to store. Returns: str: A confirmation message.
README
DevBrain MCP Server
Chat with your favorite newsletters (coding, tech, founder).
<a href="https://glama.ai/mcp/servers/@mimeCam/mcp-devbrain-stdio"> <img width="380" height="200" src="https://glama.ai/mcp/servers/@mimeCam/mcp-devbrain-stdio/badge" alt="DevBrain MCP server" /> </a>
DevBrain pulls up relevant code snippets, indie developer articles, and blog posts, all based on what you're looking for.
It's kind of like a web search, but specifically tuned for high-quality, developer-curated content. You can easily plug in your favorite newsletters to expand its knowledge base even further.
For example, when you are implementing feature "A", DevBrain can pull related articles that would serve as a solid reference and a foundation for your implementation.
| <img width="400" alt="usage-claude" src="https://github.com/user-attachments/assets/f87b80ee-7829-43e8-9223-a02a38b4fd12" /> | |
|---|---|
| Claude app | Goose app (tap on an image to open YouTube) |
DevBrain returns articles as a short description + URL; you can then:
- instruct an LLM agent such as Claude or Goose to fetch the full contents of the articles using the provided URLs
- instruct the LLM to implement a feature based on all or selected articles
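As a sketch of that workflow, the snippet below parses a DevBrain-style result list into (title, description, URL) records so an agent can decide which URLs to pass to `read_full_article`. The exact response layout is an assumption here (a title line, one or more description lines, then a URL line, with blank lines between articles); treat it as illustrative only:

```python
import re
from typing import NamedTuple

class Article(NamedTuple):
    title: str
    description: str
    url: str

URL_RE = re.compile(r"https?://\S+")

def parse_articles(text: str) -> list[Article]:
    """Parse blocks of 'title / description / URL' lines separated by
    blank lines. The input format is a guess at DevBrain's output,
    for illustration only."""
    articles = []
    for block in text.strip().split("\n\n"):
        lines = [ln.strip() for ln in block.splitlines() if ln.strip()]
        if len(lines) < 3:
            continue
        m = URL_RE.search(lines[-1])
        if m:
            articles.append(Article(lines[0], " ".join(lines[1:-1]), m.group()))
    return articles
```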
Installation and Usage
Via `uv` or `uvx`. Install `uv` and `uvx` (if not installed):
curl -LsSf https://astral.sh/uv/install.sh | sh
Example command to run the MCP server in `stdio` mode:
uvx --from devbrain devbrain-stdio-server
Use in Claude
To add `devbrain` to Claude's config, edit the file:
~/Library/Application Support/Claude/claude_desktop_config.json
and insert `devbrain` into the existing `mcpServers` block like so:
{
"mcpServers": {
"devbrain": {
"command": "uvx",
"args": [
"--from",
"devbrain",
"devbrain-stdio-server"
]
}
}
}
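If you prefer to script that edit, a short Python snippet (illustrative, not part of DevBrain) can merge the entry into an existing `claude_desktop_config.json` without clobbering other configured servers:

```python
import json
from pathlib import Path

DEVBRAIN_ENTRY = {
    "command": "uvx",
    "args": ["--from", "devbrain", "devbrain-stdio-server"],
}

def add_devbrain(config_path: Path) -> None:
    """Insert the devbrain server into mcpServers, preserving existing entries."""
    config = json.loads(config_path.read_text()) if config_path.exists() else {}
    config.setdefault("mcpServers", {})["devbrain"] = DEVBRAIN_ENTRY
    config_path.write_text(json.dumps(config, indent=2))
```

Point it at `~/Library/Application Support/Claude/claude_desktop_config.json` and restart Claude afterwards.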
Claude is known to fail when working with `uv` and `uvx` binaries. See related: https://gist.github.com/gregelin/b90edaef851f86252c88ecc066c93719. If you encounter this error, run these commands in a Terminal:
sudo mkdir -p /usr/local/bin
sudo ln -s ~/.local/bin/uvx /usr/local/bin/uvx
sudo ln -s ~/.local/bin/uv /usr/local/bin/uv
and restart Claude.
Integration for Cline and other AI agents
Command to start DevBrain MCP in `stdio` mode:
uvx --from devbrain devbrain-stdio-server
and add this command to the config file of the AI agent (Cline or other).
Note that DevBrain requires Python 3.10+. Most systems have it installed; however, VS Code (which Cline depends on) ships with Python 3.9. Use the correct Python version when running DevBrain MCP. A corrected launch command looks like this:
uvx --python 3.10 --from devbrain devbrain-stdio-server
where the Python version may be 3.10, 3.12, 3.13, or any other version installed and available on the system.
Docker integration
You can run this MCP as a Docker container in `stdio` mode. First build an image with `build.sh`, then add a config to Claude like so:
{
"mcpServers": {
"devbrain": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"mcp-devbrain-stdio:my"
]
}
}
}
Test command to verify that the Docker container works correctly:
docker run -i --rm mcp-devbrain-stdio:my
License
This project is released under the MIT License and is developed by mimeCam as an open-source initiative.