# @git-fabric/chat
Chat fabric app — AI conversation sessions, semantic history search, and context threading as a composable MCP layer.
Part of the git-fabric ecosystem.
## What it is

A fabric app for AI conversation management — create and manage chat sessions with Claude, persist conversation history to GitHub with semantic vectors in Qdrant Cloud (for search over past conversations), support multi-turn threading, and inject context from external memory sources (e.g. Aiana).
This is the "conversation plane" of the fabric. Consumers (cortex agents, Claude Desktop, Claude Code via git-steer) use these tools to interact with Claude across sessions.
## Tools

| Tool | Description |
|---|---|
| `chat_session_create` | Create a new chat session with optional system prompt, project, model, and title |
| `chat_session_list` | List recent sessions, filtered by project and state |
| `chat_session_get` | Get a full session with message history |
| `chat_session_archive` | Mark a session as archived |
| `chat_session_delete` | Permanently delete a session and all its messages |
| `chat_message_send` | Send a message and get a Claude response (full multi-turn context) |
| `chat_message_list` | List messages in a session with pagination |
| `chat_search` | Semantic search over all stored conversation content |
| `chat_context_inject` | Inject external context (e.g. Aiana memory recall) into a session |
| `chat_status` | Aggregate stats: total sessions, messages, tokens today |
| `chat_health` | Ping Anthropic and Qdrant; returns latency for each |
| `chat_thread_fork` | Fork a session at a message point to explore an alternative branch |
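A typical flow chains these tools: create a session, send a message, then search or fork. The payloads below are a sketch only — the argument names (`title`, `project`, `systemPrompt`, `sessionId`, `content`, `query`, `limit`) are assumptions derived from the tool descriptions above, not the published schema:

```typescript
// Hypothetical tool-call arguments for a session lifecycle.
// All field names here are assumptions, not the actual input schema.
const createArgs = {
  title: "Refactor planning",
  project: "git-steer",
  model: "claude-sonnet-4-6", // default model per the Models section
  systemPrompt: "You are a senior TypeScript reviewer.",
};

const sendArgs = {
  sessionId: "sess_abc123", // id returned by chat_session_create
  content: "Summarize the open refactor tasks.",
};

const searchArgs = {
  query: "refactor tasks",
  limit: 5,
};

console.log(createArgs.model, sendArgs.sessionId, searchArgs.limit);
```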
## Architecture

Follows the git-fabric layered pattern:

- Detection / Query → `layers/sessions.ts`, `layers/search.ts` (reads)
- Action → `layers/messages.ts`, `layers/sessions.ts` (effectful)
- Adapter → `adapters/env.ts` (Anthropic + OpenAI + Qdrant + GitHub)
- Surface → `app.ts` (FabricApp factory)
## State storage

- Sessions + messages → GitHub repo `ry-ops/git-steer-state` (same state repo as git-steer)
  - Session metadata: `chat/sessions/{sessionId}.json`
  - Message history: `chat/sessions/{sessionId}/messages.jsonl` (JSONL, one message per line)
  - Fast listing index: `chat/index.json`
- Semantic vectors → Qdrant Cloud collection `chat_fabric__messages__v1` (1536-dim, text-embedding-3-small)
- Completions → Anthropic API (claude-sonnet-4-6 default, configurable per session)
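Given the JSONL layout, message history can be recovered by parsing one JSON object per line. A minimal sketch, assuming each record carries `role`, `content`, and `ts` fields (the actual message schema is not documented here):

```typescript
// Parse a messages.jsonl payload: one JSON message object per line.
// Field names (role, content, ts) are assumptions for illustration.
type ChatMessage = { role: "user" | "assistant"; content: string; ts: string };

function parseMessagesJsonl(raw: string): ChatMessage[] {
  return raw
    .split("\n")
    .filter((line) => line.trim().length > 0) // tolerate a trailing newline
    .map((line) => JSON.parse(line) as ChatMessage);
}

const raw =
  '{"role":"user","content":"hello","ts":"2025-01-01T00:00:00Z"}\n' +
  '{"role":"assistant","content":"hi there","ts":"2025-01-01T00:00:01Z"}\n';

const messages = parseMessagesJsonl(raw);
console.log(messages.length, messages[1].role); // 2 assistant
```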
## Usage

### Via gateway (recommended)

```yaml
# gateway.yaml
apps:
  - name: "@git-fabric/chat"
    enabled: true
```

### Standalone MCP server

```shell
ANTHROPIC_API_KEY=sk-ant-... \
OPENAI_API_KEY=sk-... \
QDRANT_URL=https://your-cluster.qdrant.io \
QDRANT_API_KEY=... \
GITHUB_TOKEN=ghp_... \
npx @git-fabric/chat
```

### Programmatic

```typescript
import { createApp } from "@git-fabric/chat";

const app = createApp();
// app.tools, app.health(), etc.
```
## Environment Variables

| Variable | Required | Description |
|---|---|---|
| `ANTHROPIC_API_KEY` | Yes | Anthropic API key for Claude completions |
| `OPENAI_API_KEY` | Yes | OpenAI API key for text-embedding-3-small |
| `QDRANT_URL` | Yes | Qdrant Cloud cluster URL |
| `QDRANT_API_KEY` | Yes | Qdrant Cloud API key |
| `GITHUB_TOKEN` | Yes | GitHub PAT for state repo read/write |
| `GITHUB_STATE_REPO` | No | State repo (default: `ry-ops/git-steer-state`) |
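A start-up check for these variables might look like the sketch below; the package's actual validation in `adapters/env.ts` is not shown here, so the helper names are assumptions:

```typescript
// Fail fast when required environment variables are missing (sketch).
const REQUIRED = [
  "ANTHROPIC_API_KEY",
  "OPENAI_API_KEY",
  "QDRANT_URL",
  "QDRANT_API_KEY",
  "GITHUB_TOKEN",
] as const;

function missingVars(env: Record<string, string | undefined>): string[] {
  return REQUIRED.filter((name) => !env[name]);
}

// GITHUB_STATE_REPO is optional and falls back to the documented default.
function stateRepo(env: Record<string, string | undefined>): string {
  return env.GITHUB_STATE_REPO ?? "ry-ops/git-steer-state";
}

const fakeEnv = { ANTHROPIC_API_KEY: "sk-ant-test", QDRANT_URL: "https://x" };
console.log(missingVars(fakeEnv)); // ["OPENAI_API_KEY", "QDRANT_API_KEY", "GITHUB_TOKEN"]
console.log(stateRepo(fakeEnv)); // "ry-ops/git-steer-state"
```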
## Models

| Model | ID |
|---|---|
| Claude Opus 4.6 | `claude-opus-4-6` |
| Claude Sonnet 4.6 (default) | `claude-sonnet-4-6` |
| Claude Haiku 4.5 | `claude-haiku-4-5-20251001` |
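Sessions default to Sonnet 4.6 but may override the model at creation time. A small sketch of that fallback, using the IDs from the table (the `resolveModel` helper is illustrative, not part of the package):

```typescript
// Model IDs from the table above.
const MODELS = {
  opus: "claude-opus-4-6",
  sonnet: "claude-sonnet-4-6",
  haiku: "claude-haiku-4-5-20251001",
} as const;

// A session uses the requested model if given, else the Sonnet default.
function resolveModel(requested?: string): string {
  return requested ?? MODELS.sonnet;
}

console.log(resolveModel()); // "claude-sonnet-4-6"
console.log(resolveModel(MODELS.haiku)); // "claude-haiku-4-5-20251001"
```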
## License
MIT