Semantic Context MCP
Reference implementation of Semantic Intent as Single Source of Truth patterns
A Model Context Protocol (MCP) server demonstrating semantic anchoring, intent preservation, and observable property patterns for AI-assisted development.
📚 Table of Contents
- What Makes This Different
- Quick Start
- Architecture
- Features
- Testing
- Database Setup
- Contributing
- Security
- License
🎯 What Makes This Different
This isn't just another MCP server—it's a reference implementation of proven semantic intent patterns:
- ✅ Semantic Anchoring: Decisions based on meaning, not technical characteristics
- ✅ Intent Preservation: Semantic contracts maintained through all transformations
- ✅ Observable Properties: Behavior anchored to directly observable semantic markers
- ✅ Domain Boundaries: Clear semantic ownership across layers
Built on research from Semantic Intent as Single Source of Truth, this implementation demonstrates how to build maintainable, AI-friendly codebases that preserve intent.
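As a quick illustration of semantic anchoring versus structural anchoring, here is a hedged sketch (hypothetical names, not code from this repo):

```typescript
// Hypothetical example; illustrates the pattern, not this repo's exact types.
interface ContextLike {
  content: string;
  summary?: string;
}

// Structural anchoring: behavior hangs off an incidental technical detail.
// The intent ("summarize long-ish things?") hides behind a magic number.
function needsSummarizationStructural(ctx: ContextLike): boolean {
  return ctx.content.length > 500;
}

// Semantic anchoring: behavior hangs off an observable semantic property.
// "No summary exists yet" states the intent directly and survives refactoring.
function needsSummarization(ctx: ContextLike): boolean {
  return ctx.summary === undefined;
}
```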
🚀 Quick Start
Prerequisites
- Node.js 20.x or higher
- Cloudflare account (free tier works)
- Wrangler CLI:
npm install -g wrangler
Installation
1. Clone the repository
   git clone https://github.com/semanticintent/semantic-context-mcp.git
   cd semantic-context-mcp
2. Install dependencies
   npm install
3. Configure Wrangler
   Copy the example configuration:
   cp wrangler.jsonc.example wrangler.jsonc
   Create a D1 database:
   wrangler d1 create mcp-context
   Update wrangler.jsonc with your database ID:
   {
     "d1_databases": [{
       "database_id": "your-database-id-from-above-command"
     }]
   }
4. Run database migrations
   # Local development
   wrangler d1 execute mcp-context --local --file=./migrations/0001_initial_schema.sql
   # Production
   wrangler d1 execute mcp-context --file=./migrations/0001_initial_schema.sql
5. Start development server
   npm run dev
Deploy to Production
npm run deploy
Your MCP server will be available at: semantic-context-mcp.<your-account>.workers.dev
📚 Learning from This Implementation
This codebase demonstrates semantic intent patterns throughout:
Architecture Files:
- src/index.ts - Dependency injection composition root (74 lines)
- src/domain/ - Business logic layer (ContextSnapshot, ContextService)
- src/application/ - Orchestration layer (handlers and protocol)
- src/infrastructure/ - Technical adapters (D1, AI, CORS)
- src/presentation/ - HTTP routing layer (MCPRouter)
Documentation & Patterns:
- migrations/0001_initial_schema.sql - Schema with semantic intent documentation
- src/types.ts - Type-safe semantic contracts
- SEMANTIC_ANCHORING_GOVERNANCE.md - Governance rules and patterns
- REFACTORING_PLAN.md - Complete refactoring documentation
Each file includes comprehensive comments explaining WHY decisions preserve semantic intent, not just WHAT the code does.
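As a hedged sketch of that documentation style (hypothetical wording and field names, not copied from the repo), a WHY-oriented comment might read:

```typescript
/**
 * WHY: Snapshots are treated as immutable records, not working buffers.
 * Loaded context must reflect exactly what the assistant originally saw,
 * so edits create new snapshots instead of mutating history.
 */
export interface ContextSnapshotProps {
  readonly project: string;
  readonly content: string;
  readonly summary: string;
  readonly tags: readonly string[];
  readonly createdAt: string; // ISO timestamp: an observable ordering property
}
```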
Connect to Cloudflare AI Playground
You can connect to your MCP server from the Cloudflare AI Playground, which is a remote MCP client:
- Go to https://playground.ai.cloudflare.com/
- Enter your deployed MCP server URL (semantic-context-mcp.<your-account>.workers.dev/sse)
- You can now use your MCP tools directly from the playground!
Connect Claude Desktop to your MCP server
You can also connect to your remote MCP server from local MCP clients by using the mcp-remote proxy.
To connect to your MCP server from Claude Desktop, follow Anthropic's Quickstart and within Claude Desktop go to Settings > Developer > Edit Config.
Update with this configuration:
{
"mcpServers": {
"semantic-context": {
"command": "npx",
"args": [
"mcp-remote",
"http://localhost:8787/sse" // or semantic-context-mcp.your-account.workers.dev/sse
]
}
}
}
Restart Claude and you should see the tools become available.
🏗️ Architecture
This project demonstrates Domain-Driven Hexagonal Architecture with clean separation of concerns:
┌─────────────────────────────────────────────────────────┐
│ Presentation Layer │
│ (MCPRouter - HTTP routing) │
└────────────────────┬────────────────────────────────────┘
│
┌────────────────────▼────────────────────────────────────┐
│ Application Layer │
│ (ToolExecutionHandler, MCPProtocolHandler) │
│ MCP Protocol & Orchestration │
└────────────────────┬────────────────────────────────────┘
│
┌────────────────────▼────────────────────────────────────┐
│ Domain Layer │
│ (ContextService, ContextSnapshot) │
│ Business Logic │
└────────────────────┬────────────────────────────────────┘
│
┌────────────────────▼────────────────────────────────────┐
│ Infrastructure Layer │
│ (D1ContextRepository, CloudflareAIProvider) │
│ Technical Adapters (Ports & Adapters) │
└─────────────────────────────────────────────────────────┘
Layer Responsibilities:
Domain Layer (src/domain/):
- Pure business logic independent of infrastructure
- ContextSnapshot: Entity with validation rules
- ContextService: Core business operations
Application Layer (src/application/):
- Orchestrates domain operations
- ToolExecutionHandler: Translates MCP tools to domain operations
- MCPProtocolHandler: Manages JSON-RPC protocol
Infrastructure Layer (src/infrastructure/):
- Technical adapters implementing ports (interfaces)
- D1ContextRepository: Cloudflare D1 persistence
- CloudflareAIProvider: Workers AI integration
- CORSMiddleware: Cross-cutting concerns
Presentation Layer (src/presentation/):
- HTTP routing and request handling
- MCPRouter: Routes requests to appropriate handlers
Composition Root (src/index.ts):
- Dependency injection
- Wires all layers together
- 74 lines (down from 483, an ~85% reduction)
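A minimal sketch of how such a composition root can wire ports to adapters (interface, table, and method names here are illustrative, not the repo's exact API):

```typescript
// Illustrative only; approximates the layering described above.
// D1Database comes from @cloudflare/workers-types.
interface ContextSnapshot {
  project: string;
  content: string;
  summary: string;
}

// Domain port: what the domain needs, stated in domain terms
interface ContextRepository {
  save(snapshot: ContextSnapshot): Promise<void>;
  loadByProject(project: string): Promise<ContextSnapshot[]>;
}

// Infrastructure adapter: Cloudflare D1 implementation of the port
class D1ContextRepository implements ContextRepository {
  constructor(private readonly db: D1Database) {}

  async save(s: ContextSnapshot): Promise<void> {
    await this.db
      .prepare('INSERT INTO context_snapshots (project, content, summary) VALUES (?, ?, ?)')
      .bind(s.project, s.content, s.summary)
      .run();
  }

  async loadByProject(project: string): Promise<ContextSnapshot[]> {
    const { results } = await this.db
      .prepare('SELECT project, content, summary FROM context_snapshots WHERE project = ?')
      .bind(project)
      .all<ContextSnapshot>();
    return results;
  }
}

// Domain service depends only on the port, never on D1 directly,
// so the adapter can be swapped (e.g. for Postgres) without touching it.
class ContextService {
  constructor(private readonly repo: ContextRepository) {}
  saveContext(snapshot: ContextSnapshot): Promise<void> {
    return this.repo.save(snapshot);
  }
}

// Composition root: the only place that knows which concrete adapter is used
interface Env {
  DB: D1Database;
}

export function composeContextService(env: Env): ContextService {
  return new ContextService(new D1ContextRepository(env.DB));
}
```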
Benefits:
- ✅ Testability: Each layer independently testable
- ✅ Maintainability: Clear responsibilities per layer
- ✅ Flexibility: Swap infrastructure (D1 → Postgres) without touching domain
- ✅ Semantic Intent: Comprehensive documentation of WHY
- ✅ Type Safety: Strong TypeScript contracts throughout
Features
- save_context: Save conversation context with AI-powered summarization and auto-tagging
- load_context: Retrieve relevant context for a project
- search_context: Search contexts using keyword matching
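A hedged example of what a save_context call looks like on the wire (an MCP tools/call request over JSON-RPC; the argument names are illustrative, see src/types.ts for the actual contracts):

```typescript
// Hypothetical request shape for the save_context tool.
const saveContextRequest = {
  jsonrpc: '2.0' as const,
  id: 1,
  method: 'tools/call',
  params: {
    name: 'save_context',
    arguments: {
      project: 'my-app',
      content: 'Chose D1 over KV because context queries need relational filters.',
      // summary and tags are assumed to be generated server-side by Workers AI
    },
  },
};
```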
🧪 Testing
This project includes a comprehensive test suite: 70 unit tests covering all architectural layers.
Run Tests
# Run all tests
npm test
# Run tests in watch mode
npm run test:watch
# Run tests with UI
npm run test:ui
# Run tests with coverage report
npm run test:coverage
Test Coverage
- ✅ Domain Layer: 15 tests (ContextSnapshot validation, ContextService orchestration)
- ✅ Application Layer: 10 tests (ToolExecutionHandler, MCP tool dispatch)
- ✅ Infrastructure Layer: 20 tests (D1Repository, CloudflareAIProvider with fallbacks)
- ✅ Presentation Layer: 12 tests (MCPRouter, CORS, error handling)
- ✅ Integration: 13 tests (End-to-end service flows)
Test Structure
Tests are co-located with source files using the .test.ts suffix:
src/
├── domain/
│ ├── models/
│ │ ├── ContextSnapshot.ts
│ │ └── ContextSnapshot.test.ts
│ └── services/
│ ├── ContextService.ts
│ └── ContextService.test.ts
├── application/
│ └── handlers/
│ ├── ToolExecutionHandler.ts
│ └── ToolExecutionHandler.test.ts
└── ...
All tests use Vitest with mocking for external dependencies (D1, AI services).
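As a hedged sketch of that style (hypothetical names and assertions; the repo's actual tests may differ):

```typescript
import { describe, expect, it, vi } from 'vitest';

// Hypothetical port and service shapes; see src/domain/ for the real ones.
interface ContextRepository {
  save(snapshot: { project: string; content: string }): Promise<void>;
}

class ContextService {
  constructor(private readonly repo: ContextRepository) {}

  async saveContext(project: string, content: string): Promise<void> {
    if (!project.trim()) throw new Error('project is required');
    await this.repo.save({ project, content });
  }
}

describe('ContextService', () => {
  it('persists valid context through the repository port', async () => {
    const save = vi.fn().mockResolvedValue(undefined);
    const service = new ContextService({ save });

    await service.saveContext('my-app', 'decision: use D1');

    expect(save).toHaveBeenCalledWith({ project: 'my-app', content: 'decision: use D1' });
  });

  it('rejects an empty project name', async () => {
    const service = new ContextService({ save: vi.fn() });

    await expect(service.saveContext('   ', 'irrelevant')).rejects.toThrow('project is required');
  });
});
```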
Continuous Integration
This project uses GitHub Actions for automated testing and quality checks.
Automated Checks on Every Push/PR:
- ✅ TypeScript compilation (npm run type-check)
- ✅ Unit tests (npm test)
- ✅ Test coverage reports
- ✅ Code formatting (Biome)
- ✅ Linting (Biome)
Status Badges:
- CI status displayed at top of README
- Automatically updates on each commit
- Shows passing/failing state
Workflow Configuration: .github/workflows/ci.yml
The CI pipeline runs on Node.js 20.x and ensures code quality before merging.
Database Setup
This project uses Cloudflare D1 for persistent context storage.
Initial Setup
1. Create D1 Database:
   wrangler d1 create mcp-context
2. Update wrangler.jsonc with your database ID:
   {
     "d1_databases": [
       {
         "binding": "DB",
         "database_name": "mcp-context",
         "database_id": "your-database-id-here"
       }
     ]
   }
3. Run Initial Migration:
   wrangler d1 execute mcp-context --file=./migrations/0001_initial_schema.sql
Local Development
For local testing, initialize the local D1 database:
wrangler d1 execute mcp-context --local --file=./migrations/0001_initial_schema.sql
Verify Schema
Check that tables were created successfully:
# Production
wrangler d1 execute mcp-context --command="SELECT name FROM sqlite_master WHERE type='table'"
# Local
wrangler d1 execute mcp-context --local --command="SELECT name FROM sqlite_master WHERE type='table'"
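The same check can also be run from TypeScript inside the Worker; a minimal sketch, assuming the DB binding configured in wrangler.jsonc:

```typescript
// Lists the tables created by the migration; mirrors the wrangler commands above.
interface Env {
  DB: D1Database; // type provided by @cloudflare/workers-types
}

export default {
  async fetch(_request: Request, env: Env): Promise<Response> {
    const { results } = await env.DB
      .prepare("SELECT name FROM sqlite_master WHERE type = 'table'")
      .all();
    return Response.json(results);
  },
};
```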
Database Migrations
All database schema changes are managed through versioned migration files in migrations/:
- 0001_initial_schema.sql - Initial context snapshots table with semantic indexes
See migrations/README.md for detailed migration management guide.
License
This project is licensed under the MIT License - see the LICENSE file for details.
🔬 Research Foundation
This implementation is based on the research paper "Semantic Intent as Single Source of Truth: Immutable Governance for AI-Assisted Development".
Core Principles Applied:
- Semantic Over Structural - Use meaning, not technical characteristics
- Intent Preservation - Maintain semantic contracts through transformations
- Observable Anchoring - Base behavior on directly observable properties
- Immutable Governance - Protect semantic integrity at runtime
Related Resources:
- Research Paper (coming soon)
- Semantic Anchoring Governance
- semanticintent.dev (coming soon)
🤝 Contributing
We welcome contributions! This is a reference implementation, so contributions should maintain semantic intent principles.
How to Contribute
- Read the guidelines: CONTRIBUTING.md
- Check existing issues: Avoid duplicates
- Follow the architecture: Maintain layer boundaries
- Add tests: All changes need test coverage
- Document intent: Explain WHY, not just WHAT
Contribution Standards
- ✅ Follow semantic intent patterns
- ✅ Maintain hexagonal architecture
- ✅ Add comprehensive tests
- ✅ Include semantic documentation
- ✅ Pass all CI checks
Quick Links:
- Contributing Guide - Detailed guidelines
- Code of Conduct - Community standards
- Architecture Guide - Design principles
- Security Policy - Report vulnerabilities
Community
- 💬 Discussions - Ask questions
- 🐛 Issues - Report bugs
- 🔒 Security - Report vulnerabilities privately
🔒 Security
Security is a top priority. Please review our Security Policy for:
- Secrets management best practices
- What to commit / what to exclude
- Reporting security vulnerabilities
- Security checklist for deployment
Found a vulnerability? Email: security@semanticintent.dev