Fabric MCP Server (Docker)

A Model Context Protocol (MCP) server that exposes Daniel Miessler's Fabric patterns and strategies to MCP-compliant clients. It automatically syncs with the upstream Fabric repository, allowing you to use its powerful prompts directly within your AI workflow.

Features

  • Dynamic Sync: Clones or updates the Fabric repository every time the server starts.
  • Pattern Prompts: Automatically creates an MCP prompt for every folder in patterns/.
  • Strategy Support: Every prompt includes an optional strategy argument to prepend context from strategies/.
  • User Input: Every prompt requires an input argument for user-provided content (text, URL, etc.); see the example request after this list.
  • List Strategies Tool: Exposes a list_strategies tool to discover available strategies and their descriptions.
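
For example, once a client is connected, a pattern is invoked with a standard MCP prompts/get request. The pattern name, strategy, and input values below are only illustrative:

{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "prompts/get",
  "params": {
    "name": "extract_wisdom",
    "arguments": {
      "strategy": "cot",
      "input": "Text or URL to analyze"
    }
  }
}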

Prerequisites

  • Docker installed and running.

Build

Build the Docker image locally:

docker build -t fabric-mcp-server .
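
If the build succeeds, the image will appear in your local image list:

docker image ls fabric-mcp-server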

Running the Server

1. Manual Test (Stdio)

To test if the server starts and syncs the repository correctly:

docker run -i fabric-mcp-server

Note: The server communicates via JSON-RPC over stdin/stdout. You will see logs on stderr and can interact via the MCP Inspector.
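
As a quick smoke test, you can also pipe a minimal MCP handshake into the container and check that JSON-RPC responses (including the list of pattern prompts) come back on stdout. The protocol version and client info below are illustrative:

printf '%s\n' \
  '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.1"}}}' \
  '{"jsonrpc":"2.0","method":"notifications/initialized"}' \
  '{"jsonrpc":"2.0","id":2,"method":"prompts/list"}' \
  | docker run -i --rm fabric-mcp-server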

2. Access via MCP Gateway (Claude Desktop, etc.)

To use this server with an MCP client such as Claude Desktop, add the following to your claude_desktop_config.json (on macOS, usually located at ~/Library/Application Support/Claude/claude_desktop_config.json):

{
  "mcpServers": {
    "fabric": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "fabric-mcp-server"
      ]
    }
  }
}

3. Access via Gemini CLI

To use this server with the Gemini CLI, add the following to your ~/.gemini/settings.json:

{
  "mcpServers": {
    "fabric": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "fabric-mcp-server"
      ]
    }
  }
}

4. Usage with MCP Gateway "Run" Command

If you are using a gateway that supports running servers via a direct command string, use:

docker run -i --rm fabric-mcp-server

5. Usage via Docker MCP Gateway

If you are using the Docker MCP Gateway CLI, you can run this server directly using:

docker mcp gateway run fabric-mcp-server

6. Global Registration (Docker MCP Catalog)

To make this server visible to all Docker MCP clients (like the docker mcp CLI) using the configuration files in ~/.docker/mcp/:

  1. Build the Image:

    docker build -t fabric-mcp-server .
    
  2. Create a Local Catalog: Create a file named local.yaml in your catalogs directory (usually ~/.docker/mcp/catalogs/local.yaml):

    version: 3
    name: local-catalog
    displayName: Local Catalog
    registry:
      fabric:
        title: Fabric
        description: Fabric patterns and strategies
        type: server
        image: fabric-mcp-server:latest
        tools:
          - list_strategies
        prompts: []  # One prompt per pattern in the Fabric repository
        resources: {}
        metadata:
          category: productivity
          tags:
            - fabric
            - ai
            - prompts
          owner: local
    
  3. Enable the Server: Edit your registry file (usually ~/.docker/mcp/registry.yaml) and add the fabric entry under the registry: key:

    registry:
      # ... other servers ...
      fabric:
        ref: ""
    
  4. Verify: Run docker mcp list (if available) or simply try running it:

    docker mcp gateway run fabric
    

How it Works

  1. Startup: The server clones https://github.com/danielmiessler/fabric into the container.
  2. Prompts: It scans folders like extract_wisdom, summarize, etc. in patterns/.
  3. Execution:
    • When you select a prompt (e.g., extract_wisdom), it reads the system.md file.
    • If you provide a strategy (e.g., cot), it fetches the strategy JSON, extracts the prompt content, and prepends it to the system message.
    • The input argument content is appended to the end of the prompt (see the sketch after this list).
  4. Tools: Use list_strategies to discover available strategies and their descriptions.
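
The assembly described above can be summarized in a rough Python sketch. This is an illustration rather than the server's actual code: the clone path and the assumption that each strategy JSON stores its text under a prompt key are inferred from the description above.

import json
from pathlib import Path

# Assumed clone location inside the container (illustrative).
FABRIC = Path("/app/fabric")

def build_prompt(pattern: str, user_input: str, strategy: str | None = None) -> str:
    """Assemble the final prompt text for one pattern, as outlined above."""
    parts = []
    if strategy:
        # Assumption: each strategy JSON stores its text under a "prompt" key.
        data = json.loads((FABRIC / "strategies" / f"{strategy}.json").read_text())
        parts.append(data["prompt"])
    # Each pattern folder provides its system prompt in system.md.
    parts.append((FABRIC / "patterns" / pattern / "system.md").read_text())
    # The user-provided input goes at the end.
    parts.append(user_input)
    return "\n\n".join(parts)

# Example: extract_wisdom with the "cot" strategy prepended.
print(build_prompt("extract_wisdom", "Text or URL contents to analyze", strategy="cot"))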
