L.O.G. (Latent Orchestration Gateway)

A privacy-first memory layer that pseudonymizes sensitive data locally before sharing a 'Working-Fiction' version with external AI agents. It enables secure agentic workflows by ensuring personally identifiable information never leaves the user's sovereign hardware.

<!-- Badges --> Python 3.10+ · MIT License · Tests · Deployed

🔒 LOG-mcp: Your PII never touches an AI server.

Privacy middleware that strips every trace of personal data from your messages before they reach any AI API. No trust required.

$ log dehydrate "Patient John Smith (DOB 1985-03-12) called from 555-123-4567"

  Dehydrated: "Patient ENTITY_1 (DOB [DOB]) called from PHONE_1"
  Rehydrate key: session_abc123

  → Send "Patient ENTITY_1 (DOB [DOB]) called from PHONE_1" to any AI.
  → The AI never sees John Smith, his birthday, or his phone number.

👉 Try it live: hit the deployed Cloudflare Worker right now.


Why this matters

| Scenario | Risk without LOG-mcp |
|----------|----------------------|
| **Healthcare**: sending patient notes to an LLM for summarization | HIPAA violation. Real names, SSNs, and diagnoses leak to OpenAI/Anthropic servers. |
| **Legal**: running attorney-client memos through AI for research | Attorney-client privilege destroyed. Case details stored in third-party training data. |
| **Finance**: automating fraud analysis on transaction logs | PCI-DSS breach. Credit card numbers and account holders exposed to AI providers. |
| **Multi-agent**: agents passing user context to sub-agents | Each hop is a potential PII leak. Every endpoint is an attack surface. |

LOG-mcp catches all of it at the gateway, before data leaves your infrastructure.


Architecture

 ┌───────────┐      ┌──────────────┐      ┌───────────┐
 │  Your     │      │  LOG-mcp     │      │  AI API   │
 │  App /    │─────▶│  Gateway     │─────▶│           │
 │  Agent    │      │              │      │  Claude   │
 └───────────┘      │  dehydrate() │      │  GPT      │
                    │  → strip PII │      │  Gemini   │
 ┌───────────┐      │  → store map │      │  Llama    │
 │  Local    │◀─────│  rehydrate() │◀─────│           │
 │  Vault    │      │  → restore   │      └───────────┘
 │  (SQLite) │      │              │
 └───────────┘      └──────────────┘

Your data flows: App → Gateway → AI (anonymized). AI → Gateway → App (rehydrated). The AI only ever sees tokens like ENTITY_1 and PHONE_3.
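The round trip can be sketched in a few lines of Python. This is illustrative only: the real gateway uses richer detection (names, SSNs, addresses, and more) and a persistent SQLite vault rather than an in-memory dict, and the patterns here are simplified.

```python
import re

# Minimal sketch of the dehydrate/rehydrate round trip.
# Only two entity types, for illustration; LOG-mcp covers many more.
PATTERNS = {
    "PHONE": re.compile(r"\+?\d{3}[-.\s]?\d{3}[-.\s]?\d{4}"),
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
}

def dehydrate(text: str) -> tuple[str, dict[str, str]]:
    """Replace each PII match with a numbered token; return text + mapping."""
    mapping: dict[str, str] = {}
    for label, pattern in PATTERNS.items():
        for i, match in enumerate(pattern.findall(text), start=1):
            token = f"{label}_{i}"
            mapping[token] = match
            text = text.replace(match, token, 1)
    return text, mapping

def rehydrate(text: str, mapping: dict[str, str]) -> str:
    """Restore original values from the vault mapping."""
    for token, original in mapping.items():
        text = text.replace(token, original)
    return text

anon, vault = dehydrate("Call jane@example.com or 212-555-0147")
assert "jane@example.com" not in anon and "212-555-0147" not in anon
assert rehydrate(anon, vault) == "Call jane@example.com or 212-555-0147"
```

The key property is that the mapping never leaves the machine: only the tokenized text crosses the wire, and rehydration happens locally.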


Quick Install

git clone https://github.com/CedarBeach2019/LOG-mcp.git
cd LOG-mcp
pip install -e .

That's it. You're ready.

$ log dehydrate "Call Jane Doe at jane@example.com or 212-555-0147"
Dehydrated: "Call ENTITY_1 at EMAIL_1 or PHONE_1"
Session: sess_7f3a2c

Deployment Modes

| Mode | Best for | Cost | Latency |
|------|----------|------|---------|
| Local | Development, privacy-critical workloads | Free | Lowest |
| Cloudflare Workers | Production, serverless, global edge | Free tier | ~50ms |
| Docker | Self-hosted, air-gapped, on-prem | Infrastructure only | Network-dependent |

Local

pip install -e ".[full]"
log init              # create vault at ~/.log/vault/
log dehydrate "Your text here"

Cloudflare Workers

Free tier includes 100k requests/day, D1 database, and KV cache.

cd cloudflare
npm install
npx wrangler login
npx wrangler deploy

Endpoints: /dehydrate, /rehydrate, /stats, /health

Live demo: https://log-mcp-vault.magnus-digennaro.workers.dev/

Docker

docker build -t log-mcp .
docker run -p 8000:8000 -v log-vault:/data log-mcp

PII Detection

LOG-mcp identifies and replaces these entity types:

| Entity | Example input | Anonymized output |
|--------|---------------|-------------------|
| Emails | user@example.com | EMAIL_1 |
| Phone numbers | +1 (555) 123-4567 | PHONE_1 |
| SSNs | 123-45-6789 | SSN_1 |
| Credit cards | 4532-1234-5678-9010 | CC_1 |
| Names (English) | Jane Marie Smith | ENTITY_1 |
| Addresses | 123 Main St, Springfield IL | ADDR_1 |
| Dates of birth | 1985-03-12 | [DOB] |
| Passport numbers | US12345678 | PASSPORT_1 |
| API keys | sk-proj-abc123... | KEY_1 |
| Non-ASCII PII | Cyrillic/CJK names & data | Redacted |
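Detecting some of these types takes more than pattern matching. Credit-card candidates, for example, are commonly confirmed with a Luhn checksum to cut false positives on random digit runs. The sketch below shows that general technique, not LOG-mcp's actual validator, and uses 4111-1111-1111-1111, a standard Luhn-valid test number.

```python
import re

# Candidate pattern: 13-16 digits, optionally separated by spaces or hyphens.
CC_RE = re.compile(r"\b(?:\d[ -]?){12,15}\d\b")

def luhn_ok(candidate: str) -> bool:
    """Standard Luhn checksum: double every second digit from the right."""
    digits = [int(d) for d in re.sub(r"\D", "", candidate)][::-1]
    total = 0
    for i, d in enumerate(digits):
        if i % 2 == 1:          # every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

text = "Charge 4111-1111-1111-1111 and ignore order 1234-5678"
hits = [m.group() for m in CC_RE.finditer(text) if luhn_ok(m.group())]
# hits == ["4111-1111-1111-1111"]; the short order number is never a candidate
```

Pairing a loose pattern with a checksum keeps recall high while filtering out invoice numbers, order IDs, and similar near-misses.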

CLI Reference

| Command | Description |
|---------|-------------|
| `log dehydrate "<text>"` | Strip PII, return anonymized text + session key |
| `log rehydrate <session-id>` | Restore original text from vault |
| `log init` | Initialize local vault (`~/.log/vault/`) |
| `log stats` | Show vault statistics (sessions, entities, storage) |
| `log scout <provider> "<text>"` | Dehydrate → send to AI → rehydrate response |
| `log archive <session-id>` | Archive a session to long-term storage |
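The `scout` command composes the other pieces: the prompt is dehydrated, only tokens reach the provider, and the reply is rehydrated locally. A toy sketch of that sequencing, with a stubbed model call and a hand-built vault mapping (the real CLI builds the mapping during dehydration):

```python
# Toy vault mapping; in practice this is produced by `log dehydrate`.
vault = {"ENTITY_1": "Jane Doe", "PHONE_1": "212-555-0147"}

def rehydrate(text: str, mapping: dict[str, str]) -> str:
    # Swap tokens back for the original values, locally.
    for token, original in mapping.items():
        text = text.replace(token, original)
    return text

def fake_model(prompt: str) -> str:
    # Stand-in for a provider API call: it only ever sees tokens.
    return "Reply: call ENTITY_1 back at PHONE_1."

anonymized = "Call ENTITY_1 at PHONE_1"
reply = rehydrate(fake_model(anonymized), vault)
# reply == "Reply: call Jane Doe back at 212-555-0147."
```

The provider's response can safely reference the tokens; they only become real names and numbers once the reply is back on your infrastructure.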

MCP Integration

Use LOG-mcp as an MCP server:

{
  "mcpServers": {
    "log-vault": {
      "command": "python",
      "args": ["-m", "mcp.server"],
      "cwd": "/path/to/LOG-mcp"
    }
  }
}

Tools exposed: dehydrate, rehydrate, stats, list_sessions.


Testing

# Unit tests (52 tests)
pytest tests/ -v

# End-to-end scenario suite (46 checks: HIPAA, legal, financial, multi-agent)
pytest tests/demo_e2e.py -v

# With coverage
pytest --cov=vault --cov=mcp --cov=scouts

Project Links

- 🚀 Quickstart Guide: get running in 5 minutes
- 🗺️ Roadmap: what's coming next
- 🤝 Contributing: join the project
- 📄 License: MIT

License

MIT. Use it however you want, and star the repo if it saves you from a compliance headache.
