KTME - Knowledge Transfer Me

Automated documentation generation from Git changes using AI

Rust | License: MIT | Crates.io

KTME is a CLI tool and MCP server that automatically generates and maintains documentation from Git changes. It integrates with GitHub, GitLab, and Confluence, using AI to create meaningful documentation from code commits and pull requests.

Features

  • Smart Documentation Generation - AI-powered documentation from Git diffs, commits, and PRs
  • Multiple Integrations - GitHub, GitLab, and Confluence support
  • Template System - Customizable Markdown templates with variable substitution
  • MCP Server - Model Context Protocol server for AI agent integration
  • Dual Storage - TOML and SQLite backends for flexibility
  • Service Mapping - Organize documentation by service/project

Quick Start

Installation

# Install via npm (recommended)
npm install -g ktme-cli

# Install from crates.io
cargo install ktme

# Or build from source
cargo build --release
cargo install --path .

Basic Usage

# Generate docs from staged changes
ktme generate --service my-service --staged

# Extract GitHub PR and generate docs
ktme extract --pr 123 --provider github
ktme generate --service my-service --commit HEAD

# Update existing documentation
ktme update --service my-service --staged --section "API Changes"

# Map service to documentation location
ktme mapping add my-service --file docs/api.md

Configuration

Create ~/.config/ktme/config.toml:

[git]
github_token = "ghp_xxxxx"
gitlab_token = "glpat_xxxxx"

[confluence]
base_url = "https://your-company.atlassian.net/wiki"
api_token = "your-api-token"
space_key = "DOCS"

[ai]
provider = "openai"
api_key = "sk-xxxxx"
model = "gpt-4"

Documentation

Core Capabilities

1. Git Integration

Extract changes from various sources (see the combined example after this list):

  • Staged changes (--staged)
  • Specific commits (--commit abc123)
  • Commit ranges (--range main..feature)
  • Pull/Merge requests (--pr 123)
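
These flags combine with the generate and extract commands from Quick Start. A hedged sketch (the GitLab provider value is an assumption based on the GitLab support listed above):

# Generate docs directly from a commit range
ktme generate --service my-service --range main..feature

# Extract a GitLab merge request (provider value assumed)
ktme extract --pr 123 --provider gitlab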

2. Documentation Generation

Generate documentation with templates:

# Use custom template
ktme generate --service api --template api-docs

# Generate changelog
ktme generate --service api --type changelog

# Output to specific file
ktme generate --service api --output docs/changelog.md
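
KTME's template variable names aren't documented in this README; conceptually, a template is a Markdown file whose placeholders are filled from the extracted changes. A purely hypothetical sketch (placeholder names invented for illustration):

<!-- hypothetical template: variable names are illustrative only -->
{{service}} - changes for {{date}}
Summary: {{summary}}
Details: {{changes}}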

3. Smart Updates

Update existing documentation intelligently:

# Update specific section
ktme update --service api --section "Breaking Changes"

# Smart merge with existing content
ktme update --service api --staged

4. MCP Server

Run as MCP server for AI agents:

# Start server
ktme mcp start

# Available tools:
# - ktme_generate_documentation
# - ktme_update_documentation
# - ktme_list_services
# - ktme_search_features
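
To call these tools from an MCP client, register KTME in the client's configuration. A minimal sketch, assuming the client uses the common mcpServers JSON format (as Claude Desktop and similar clients do):

{
  "mcpServers": {
    "ktme": {
      "command": "ktme",
      "args": ["mcp", "start"]
    }
  }
}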

Development

Quick Commands

# Test changes (fast - only new modules)
make test-changes

# Run all checks (format, lint, tests)
make pre-release

# Development cycle
make dev

Publishing

# Automated release workflow
make release

See docs/RELEASE.md for complete release documentation.

Architecture

┌─────────────┐     ┌──────────────┐     ┌───────────────┐
│   Git CLI   │────▶│  Extractors  │────▶│  Generators   │
│  GitHub API │     │ (Diff/PR/MR) │     │  (Templates)  │
│  GitLab API │     └──────────────┘     └───────────────┘
└─────────────┘              │                     │
                             ▼                     ▼
                     ┌──────────────┐     ┌───────────────┐
                     │   Storage    │     │    Writers    │
                     │ (TOML/SQLite)│     │ (MD/Confluence)│
                     └──────────────┘     └───────────────┘
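
As a rough illustration of this flow only (the names and signatures below are hypothetical, not KTME's actual internal API), the pipeline reads as three stages wired together, with service-to-document mappings kept in the TOML/SQLite storage layer:

// Hypothetical Rust sketch of the extract -> generate -> write pipeline above.
struct Change { summary: String, diff: String }

trait Extractor { fn extract(&self) -> Vec<Change>; }                 // Git diff / PR / MR sources
trait Generator { fn generate(&self, changes: &[Change]) -> String; } // template-driven rendering
trait Writer { fn write(&self, doc: &str); }                          // Markdown file or Confluence page

fn run(extractor: &dyn Extractor, generator: &dyn Generator, writer: &dyn Writer) {
    let changes = extractor.extract();      // pull changes from Git / GitHub / GitLab
    let doc = generator.generate(&changes); // render the changes through a template
    writer.write(&doc);                     // persist to Markdown or Confluence
}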

Recent Updates (v0.1.0)

New Features

  • ✅ Template engine with variable substitution
  • ✅ Smart documentation merging by section
  • ✅ GitHub PR extraction and integration
  • ✅ GitLab MR extraction and integration
  • ✅ Confluence writer with Markdown conversion
  • ✅ Enhanced Markdown writer with section updates

Implementation Details

  • 34 new tests (all passing)
  • 10 files modified with new functionality
  • Zero compilation errors after strict linting

See CHANGELOG.md for complete version history.

Contributing

We welcome contributions! Please see our Development Guide.

# Setup development environment
git clone https://github.com/FreePeak/ktme.git
cd ktme
make setup

# Run tests
make test-changes

# Submit PR
make pre-release
git push

License

MIT License - see LICENSE for details.

Support


Built with ❤️ using Rust
