Congo River Compositional Intelligence MCP Server
Status: Phase 1 in Progress (Foundation Complete!)
A production-grade MCP (Model Context Protocol) server that embodies compositional intelligence principles, providing tools for semantic decomposition, proof search, knowledge graphs, and neuro-symbolic reasoning.
The Congo River Philosophy
This project implements "Congo River Compositional Intelligence" - the idea that powerful understanding emerges from thousands of tributaries (simple reasoning operations) composing into one massive flow (deep intelligence). Key principles:
- Compositional Structure: Complex reasoning built from simple, composable operations
- Polyglot Architecture: Each component implemented in its optimal language
- Semantic Foundations: Grounded in RDF triples, lambda calculus, and proof theory
- Neuro-Symbolic Integration: Bridges neural (LLMs) and symbolic (knowledge graphs) AI
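As a toy illustration of the composition principle above (purely illustrative, not code from this repository), small operations can be composed into a larger reasoning flow:
// Toy illustration of composition: small "tributaries" piped into one flow.
type Op<A, B> = (input: A) => B;

const compose = <A, B, C>(f: Op<A, B>, g: Op<B, C>): Op<A, C> =>
  (x) => g(f(x));

// Two simple operations...
const tokenize: Op<string, string[]> = (s) => s.split(/\s+/);
const countContentWords: Op<string[], number> =
  (ws) => ws.filter((w) => w.length > 3).length;

// ...composed into one larger one.
const complexity = compose(tokenize, countContentWords);
console.log(complexity("Consciousness is awareness of internal and external stimuli")); // 5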
Quick Start
Prerequisites
- Node.js 18+
- Python 3.10+
- Supabase account (or local PostgreSQL with pgvector)
- Anthropic and/or OpenAI API keys
Installation
# Clone or navigate to directory
cd /home/mdz-axolotl/ClaudeCode/congo-river-mcp
# Install Node dependencies
npm install
# Install Python dependencies
pip install -r requirements.txt
python -m spacy download en_core_web_sm
# Configure environment
cp .env.example .env
# Edit .env with your Supabase URL and API keys
# Build TypeScript
npm run build
# Initialize database
npm start -- --setup
# Start server
npm start
Configuration
Edit .env with your settings:
# Use Supabase
DB_TYPE=cloud
CLOUD_DB_URL=postgresql://postgres:[PASSWORD]@[PROJECT-REF].supabase.co:5432/postgres
# Add your API keys
ANTHROPIC_API_KEY=sk-ant-...
OPENAI_API_KEY=sk-...
Add to Claude Code
Add to your .mcp.json:
{
  "mcpServers": {
    "congo-river": {
      "command": "node",
      "args": ["dist/server.js"],
      "cwd": "/home/mdz-axolotl/ClaudeCode/congo-river-mcp",
      "type": "stdio",
      "env": {
        "TRANSPORT": "stdio",
        "DB_TYPE": "cloud",
        "CLOUD_DB_URL": "postgresql://...",
        "ANTHROPIC_API_KEY": "sk-ant-...",
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
Available Tools
Core Reasoning Tools
1. triple_decomposition
- Decomposes concepts into RDF subject-predicate-object triples
- Implements Stanley Fish's 3-word sentence principle
- Stores in knowledge graph for later querying
2. lambda_abstraction
- Converts processes/code into lambda calculus
- Shows compositional structure with type signatures
- Applies beta reduction for simplification
3. proof_search
- Searches for proofs given goals and premises
- Multiple strategies: forward/backward chaining, resolution
- Returns proof trees (Curry-Howard correspondence)
4. graph_query
- Queries knowledge graph with SPARQL-like patterns
- Natural language or structured queries
- Returns matching triples and relationships
5. neuro_symbolic_query (Showcase Feature)
- Hybrid reasoning: LLM + knowledge graph
- Parses natural language → logical form
- Queries graph symbolically
- Synthesizes grounded answers with proof traces
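To make the data flow concrete, here is a minimal TypeScript sketch of the shapes these tools might pass around. The interfaces, field names, and sample values are illustrative assumptions, not the server's actual types.
// Hypothetical shapes for illustration; the real server's types may differ.
interface Triple {
  subject: string;
  predicate: string;
  object: string;
}

// A possible triple_decomposition result for
// "Consciousness is awareness of internal and external stimuli":
const decomposition: Triple[] = [
  { subject: "consciousness", predicate: "is_a", object: "awareness" },
  { subject: "awareness", predicate: "has_target", object: "internal_stimuli" },
  { subject: "awareness", predicate: "has_target", object: "external_stimuli" },
];

// A possible lambda_abstraction result: a typed term plus its beta-reduced form.
interface LambdaResult {
  term: string;
  type: string;
  betaReduced: string;
}

const lambdaExample: LambdaResult = {
  term: "(λx: Stimulus. aware(self, x)) internal",
  type: "Stimulus -> Prop",
  betaReduced: "aware(self, internal)",
};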
Meta Tools
6. recommend_language
- Analyzes requirements and recommends optimal programming language
- Shows scoring rationale and trade-offs
- Demonstrates meta-level compositional intelligence
7. configure_database
- Database management: status, health, migrations, stats
- Switches between local/cloud configurations
8. export_knowledge
- Exports knowledge graph to RDF or JSON
- Backup and portability
9. import_knowledge
- Imports triples into knowledge graph
- Bulk loading from external sources
10. system_status
- Comprehensive system health check
- Database stats, service status, tool inventory
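For a sense of what export_knowledge and import_knowledge exchange, here is a hedged sketch of a JSON-serialized knowledge export; the exact wire format shown here is an assumption for illustration, not the documented schema.
// Hypothetical JSON export shape (an assumption, not the documented format).
const exportedKnowledge = {
  format: "json",
  exported_at: "2025-01-01T00:00:00Z",
  triples: [
    { subject: "consciousness", predicate: "is_a", object: "awareness" },
    { subject: "qualia", predicate: "part_of", object: "consciousness" },
  ],
};

// A matching import could then feed the same triples back into the graph, e.g.:
// import_knowledge({ format: "json", data: exportedKnowledge })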
Architecture
Claude Code (User)
        |
        |  MCP Protocol (STDIO/SSE)
        v
Congo River MCP Server (TypeScript)
  - Language Selection Scoring (Meta-Layer)
  - Core Tools | Advanced Tools | Meta Tools (DB, Language)
        |
        +--> Python Services
        +--> TypeScript Services
        +--> Database Manager
                   |
                   v
Supabase PostgreSQL + pgvector
  - RDF Triples    - Proofs
  - Embeddings     - Patterns
Database Schema
The PostgreSQL schema includes:
- triples - RDF knowledge graph storage
- proofs - Proof trees and inference traces
- reasoning_sessions - Tool invocation history
- embeddings - Vector embeddings (pgvector)
- patterns - Learned compositional patterns
- lambda_abstractions - Lambda calculus representations
- concept_nodes & concept_edges - Meta-level concept graph
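As a rough guide to how these tables might surface in application code, here are hypothetical TypeScript row types; the column names and types are assumptions inferred from the summary above, not the actual migrations.
// Hypothetical row types mirroring the schema summary above (assumptions only).
interface TripleRow {
  id: string;                 // UUID primary key
  subject: string;
  predicate: string;
  object: string;
  created_at: string;         // ISO timestamp
}

interface EmbeddingRow {
  id: string;
  triple_id: string;          // foreign key into triples
  embedding: number[];        // pgvector column surfaced as a float array
}

interface ProofRow {
  id: string;
  goal: string;                                      // the proposition proved
  proof_tree: unknown;                               // JSON proof tree (Curry-Howard style)
  strategy: "forward" | "backward" | "resolution";   // search strategy used
}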
Language Selection System
The server includes an automatic language recommendation engine that scores programming languages based on task requirements:
// Example: What language for semantic web operations?
recommend_language({
task_profile: "graphQuery"
})
// Result: Python (92.3/100)
// Strong fit for: semantic web, graph operations
// Excellent rdflib ecosystem
Supported Languages: TypeScript, Python, Prolog, Rust, Go
Scoring Dimensions:
- Logic programming capabilities
- Graph/RDF operations
- Type system strength
- Performance characteristics
- ML/AI ecosystem
- Semantic web support
- Concurrency model
- Web integration
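To make the scoring idea concrete, here is a minimal sketch of a weighted-dimension recommender. The dimension names, per-language scores, weights, and profile below are invented for illustration and are not the server's actual scoring tables.
// Minimal weighted-scoring sketch (illustrative values, not the real tables).
type Dimension =
  | "logic" | "graphRdf" | "typeSystem" | "performance"
  | "mlEcosystem" | "semanticWeb" | "concurrency" | "webIntegration";

type Scores = Record<Dimension, number>;            // 0-10 per dimension
type Weights = Partial<Record<Dimension, number>>;  // emphasis per task profile

const languageScores: Record<string, Scores> = {
  Python: { logic: 6, graphRdf: 9, typeSystem: 5, performance: 5,
            mlEcosystem: 10, semanticWeb: 9, concurrency: 5, webIntegration: 7 },
  Prolog: { logic: 10, graphRdf: 7, typeSystem: 4, performance: 5,
            mlEcosystem: 2, semanticWeb: 6, concurrency: 3, webIntegration: 2 },
};

function recommend(weights: Weights): { language: string; score: number }[] {
  const totalWeight = Object.values(weights).reduce((a, b) => a + (b ?? 0), 0);
  return Object.entries(languageScores)
    .map(([language, scores]) => {
      const weighted = (Object.entries(weights) as [Dimension, number][])
        .reduce((sum, [dim, w]) => sum + scores[dim] * w, 0);
      // Normalize the 0-10 weighted average onto a 0-100 scale.
      return { language, score: (weighted / totalWeight) * 10 };
    })
    .sort((a, b) => b.score - a.score);
}

// Example: a "graphQuery"-style profile emphasizing RDF and semantic web support.
console.log(recommend({ graphRdf: 3, semanticWeb: 3, logic: 2, mlEcosystem: 1 }));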
Conceptual Foundation
This system is grounded in deep theoretical connections:
- J.D. Atlas - Semantic generality and presupposition
- Richard Montague - Compositional semantics and type theory
- Curry-Howard - Proofs as programs isomorphism
- Tim Berners-Lee - RDF and semantic web
- Modern LLMs - Neural learning of compositional structure
See: /home/mdz-axolotl/Documents/congo-river-compositional-intelligence.md for the complete theoretical framework.
Roadmap
Phase 1 (Current)
- [x] Project structure and configuration
- [x] Database schema (PostgreSQL + pgvector)
- [x] Database manager (local/cloud support)
- [x] Language selection scoring system
- [x] Main MCP server with 10 tools
- [ ] Python services implementation
- [ ] TypeScript lambda service
- [ ] Neuro-symbolic integration
- [ ] End-to-end testing
Phase 2: Enhanced Reasoning
- Tree of Thoughts orchestrator
- Chain of Thought tracer
Phase 3: Meta-Cognitive Layer
- Compositional analyzer (multi-lens analysis)
- Loop discovery engine
Phase 4-7: Learning, Production, Knowledge Management, Advanced Neuro-Symbolic
(See full roadmap in /home/mdz-axolotl/.claude/plans/serialized-meandering-starlight.md)
Development
# Run in watch mode
npm run dev
# Run tests
npm test
# Lint
npm run lint
# Format
npm run format
# Start with SSE transport (remote access)
npm run start:sse
Example Usage
// In Claude Code, you can call:
// Decompose a concept
triple_decomposition({
concept: "Consciousness is awareness of internal and external stimuli",
store_in_db: true
})
// Get language recommendation
recommend_language({
task_profile: "neuroSymbolic",
show_all: true
})
// Query knowledge graph
graph_query({
query: "Find all properties of consciousness"
})
// Neuro-symbolic reasoning
neuro_symbolic_query({
query: "What is the relationship between consciousness and qualia?",
include_proof: true
})
// System health
system_status({ detailed: true })
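For orientation, here is the rough kind of result a neuro_symbolic_query call might return; the field names below are illustrative assumptions rather than the server's documented response schema.
// Hypothetical response shape for neuro_symbolic_query (assumed, not documented).
const exampleResponse = {
  answer:
    "Qualia are constituent properties of consciousness: the graph links them " +
    "via a part_of relation.",
  supporting_triples: [
    { subject: "qualia", predicate: "part_of", object: "consciousness" },
  ],
  proof_trace: {
    strategy: "backward",
    steps: [
      { goal: "related(qualia, consciousness)", rule: "part_of implies related" },
      { goal: "part_of(qualia, consciousness)", rule: "fact in knowledge graph" },
    ],
  },
};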
Contributing
This is a research/educational project exploring compositional intelligence. Contributions welcome!
License
MIT
**The Congo River flows with unstoppable force from thousands of tributaries composing into one.**