ContextBuilder (ctx)
Context-as-a-Service MCP server for Hengam's multi-agent system. Maintains app-isolated, structured, provenance-backed context for Shopify apps by extracting and summarizing data from public web sources and help centers, and delivers "just enough context" to other agents via push and pull mechanisms.
Features
- App-scoped context: Isolated context for 4 Shopify apps (Notify Me!, Subi, Discounty, Convi)
- Agentic Graph Memory: Graph-based retrieval with multi-hop traversal, not just vector search
- Hybrid delivery: Push starter context bundles + Pull targeted context slices
- Observation Masking: Budget-aware compression with full transparency on what was included/excluded
- Provenance tracking: Every statement traceable to source URL + snapshot timestamp + content hash
- Configurable LLM: Provider-agnostic (OpenAI, Anthropic, Gemini) with editable prompt templates
- Schema-validated: All data objects validated with Zod at boundaries
Quick Start
```bash
# Install dependencies
pnpm install

# Set up an LLM API key (at least one is required for refresh)
export OPENAI_API_KEY=sk-...
# or
export ANTHROPIC_API_KEY=sk-ant-...

# Run the MCP server
pnpm dev

# Run tests
pnpm test
```
MCP Tools
| Tool | Description |
|---|---|
| ctx.refresh.app_sources | Refresh and rebuild context for an app |
| ctx.push.starter_context | Push a compact starter context bundle |
| ctx.pull.context_slice | Pull a targeted context slice by intent |
| ctx.get.app_state_summary | Get an app state summary and refresh status |
| ctx.get.provenance | Get provenance for a bundle/slice |
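A pull interaction can be sketched as below. The request/response shapes and the greedy masking strategy are assumptions for illustration, not the server's actual wire contract, but they show how observation masking keeps a slice within budget while staying transparent about exclusions:

```typescript
// Illustrative shapes for ctx.pull.context_slice; field names are
// hypothetical, not the server's actual contract.
interface SliceRequest {
  app: string;          // e.g. "notify-me"
  intent: string;       // what the caller needs context for
  tokenBudget: number;  // hard cap on returned context size
}

interface Observation {
  text: string;
  score: number;        // relevance score from the extraction stage
  tokens: number;       // rough token cost of including this observation
}

interface SliceResponse {
  included: Observation[];
  maskedCount: number;  // transparency: how many observations were excluded
}

// Budget-aware masking: greedily keep the highest-scoring observations
// that fit, and report how many were masked out.
function maskToBudget(obs: Observation[], budget: number): SliceResponse {
  const ranked = [...obs].sort((a, b) => b.score - a.score);
  const included: Observation[] = [];
  let spent = 0;
  for (const o of ranked) {
    if (spent + o.tokens <= budget) {
      included.push(o);
      spent += o.tokens;
    }
  }
  return { included, maskedCount: obs.length - included.length };
}
```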
Architecture
```
Ingestion  →  Extraction  →  Graph     →  Delivery
fetch         summarize      build        push/pull
parse         extract        traverse     mask
snapshot      score          validate     provenance
```
Pipeline Flow
- Ingestion: Fetch public web sources (listing, website, help center), parse HTML, create snapshots with content hashes
- Extraction: LLM-powered structuring — summarize pages, extract concepts/procedures, score observations, detect conflicts
- Graph: Build context graph with nodes (features, procedures, constraints, FAQs, entities) and typed edges (explains, depends_on, resolves, etc.)
- Delivery: Serve context via push (starter bundles) or pull (targeted slices) with observation masking and provenance
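The ingestion step's "snapshot with content hash" idea can be sketched as follows (names are illustrative, not the repository's actual implementation): stamping fetched content with a SHA-256 hash and timestamp lets later provenance point at an exact, verifiable version of the source.

```typescript
import { createHash } from "node:crypto";

interface Snapshot {
  url: string;
  fetchedAt: string;   // ISO timestamp of the fetch
  contentHash: string; // hex SHA-256 of the raw body
  body: string;
}

function makeSnapshot(url: string, body: string): Snapshot {
  return {
    url,
    fetchedAt: new Date().toISOString(),
    contentHash: createHash("sha256").update(body).digest("hex"),
    body,
  };
}
```

Because the same body always yields the same hash, a re-crawl can detect whether a source changed without diffing full pages.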
Configuration
All configuration is in config/:
- apps.yaml — App source URLs and crawl settings
- model-profiles.yaml — LLM provider configs (model, temperature, rate limits)
- settings.yaml — Task bindings, budgets, masking thresholds, graph settings
- prompt-templates/*.hbs — Handlebars templates for all 8 LLM tasks
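To illustrate the shape of an apps.yaml entry (the keys below are hypothetical; check the files in config/ for the real schema):

```yaml
# Hypothetical apps.yaml entry; actual keys may differ.
apps:
  notify-me:
    name: "Notify Me!"
    sources:
      listing: https://apps.shopify.com/...
      website: https://notify-me.io
      help_center: https://help.notify-me.io
    crawl:
      max_pages: 200
      refresh_interval_hours: 24
```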
Supported Apps (MVP)
| App | Listing | Website | Help Center |
|---|---|---|---|
| Notify Me! | apps.shopify.com | notify-me.io | help.notify-me.io |
| Subi | apps.shopify.com | subi.co | help.subi.co |
| Discounty | apps.shopify.com | discounty.ai | help.discounty.ai |
| Convi | apps.shopify.com | conviapp.com | help.conviapp.com |
Development
```bash
pnpm build          # Compile TypeScript
pnpm dev            # Run with tsx (dev mode)
pnpm test           # Run all tests
pnpm test:unit      # Run unit tests only
pnpm test:contract  # Run MCP contract tests
pnpm lint           # Type check
```
Requirements Coverage
Implements REQ-CTX-1 through REQ-CTX-38 from the ContextBuilder agent repository spec. See CLAUDE.md for architecture details.