Refactory

Hybrid code decomposition. AI plans the boundaries. A deterministic engine handles the routine extractions. Minimize tokens, maximize syntax validity.

Refactory splits monolith source files into clean modules. It uses an LLM for one thing — deciding which functions group together. Everything else is mechanical: function boundary detection, import resolution, module assembly, syntax validation, scoring.

JavaScript and Python extraction is mostly mechanical. The deterministic engine handles the straightforward moves — the routine 80% that's a waste of AI time and tokens. The LLM still handles complex edge cases where judgment matters. Other languages use LLM extraction with adaptive compression.

Works with Claude Code, Cursor, Windsurf, VS Code Copilot — any MCP client. Or use the CLI directly.

Results

Tested against 15 production monoliths:

| Metric | Value |
| --- | --- |
| Lines decomposed | 32,736 |
| Functions extracted | 1,017 |
| Pipeline score | 0.89 |
| Mechanical extraction ratio | ~80% |
| API cost (extraction) | Near zero |

Quick Start

MCP (recommended)

Add to your .mcp.json:

{
  "mcpServers": {
    "refactory": {
      "command": "npx",
      "args": ["@refactory/mcp"],
      "env": {
        "GROQ_API_KEY": "your-key-here"
      }
    }
  }
}

Then tell your AI tool: "Analyze and decompose src/big-file.js into modules"

One free API key (e.g., Groq or Gemini) is needed for the PLAN step only. Extraction is mechanical — no key required for JS/Python.

CLI

git clone https://github.com/codedrop-codes/refactory.git
cd refactory && npm install
node src/cli.js decompose src/big-file.js

Other commands:

refactory analyze src/big-file.js        # Health check + function map
refactory plan src/big-file.js           # Generate module boundaries (needs LLM key)
refactory verify lib/modules/            # Check extracted modules
refactory languages                      # Show supported languages
refactory providers                      # Show configured LLM providers
refactory test submit broken.js          # Submit a file that breaks extraction
refactory test run                       # Validate preprocessors against test corpus

How It Works

  1. ANALYZE         Scan functions, dependencies, health — mechanical
       |
  2. CHARACTERIZE    Snapshot exports before touching anything — mechanical
       |
  3. PLAN            LLM decides module boundaries — the only AI step
       |
  4. EXTRACT         Copy functions by line range, resolve imports — mechanical
       |               (LLM fallback for unsupported languages)
  5. FIX-IMPORTS     Rewrite require()/import paths — mechanical
       |
  6. VERIFY          Syntax check, load check, export comparison — mechanical
       |
  7. METRICS         Refactory Score + HTML report — mechanical

Six of the seven steps are deterministic. The LLM decides only where to split; it never touches your code.
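The pipeline shape above can be sketched in a few lines. This is illustrative only, not Refactory's actual code: each step is assumed to be a function that takes and returns a shared state object, with stubs standing in for the seven real stages.

```javascript
// Run a sequence of steps over shared state (a failing step throws and halts the run).
function runPipeline(state, steps) {
  for (const step of steps) {
    state = step(state);
  }
  return state;
}

// Stub steps standing in for the seven stages in the diagram above.
const trace = [];
const stub = name => state => { trace.push(name); return state; };
const steps = ['analyze', 'characterize', 'plan', 'extract', 'fix-imports', 'verify', 'metrics'].map(stub);

runPipeline({ file: 'src/big-file.js' }, steps);
console.log(trace.join(' -> '));
// analyze -> characterize -> plan -> extract -> fix-imports -> verify -> metrics
```

In Refactory's design, only the `plan` stage would call out to an LLM; every other stage is a deterministic transform.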

Language Support

| Language | Extraction | Status |
| --- | --- | --- |
| JavaScript / TypeScript | Mechanical | Built in |
| Python | Mechanical | Built in |
| Go, Rust, Java, C#, Kotlin, Swift | Mechanical | Pro |
| Everything else | LLM with compression | Automatic fallback |

Mechanical extraction handles the routine cases: the preprocessor finds function boundaries by parsing, copies them by line range, and resolves imports deterministically. Complex patterns (dynamic exports, deeply interleaved logic) still go through the LLM.
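The "copy by line range" idea can be shown with a minimal sketch. This is an assumed shape for the plan data (module name plus a 1-indexed, inclusive line range), not Refactory's actual data model:

```javascript
// Copy each planned function's lines verbatim into its target module.
// plan entries: { module, start, end } with 1-indexed, inclusive line ranges.
function extractByLineRange(sourceLines, plan) {
  const modules = {};
  for (const { module, start, end } of plan) {
    const chunk = sourceLines.slice(start - 1, end).join('\n');
    modules[module] = modules[module] ? modules[module] + '\n\n' + chunk : chunk;
  }
  return modules;
}

const src = [
  'function add(a, b) {',
  '  return a + b;',
  '}',
  'function mul(a, b) {',
  '  return a * b;',
  '}',
];
const out = extractByLineRange(src, [
  { module: 'math.js', start: 1, end: 3 },
  { module: 'math.js', start: 4, end: 6 },
]);
console.log(out['math.js'].includes('return a * b')); // true
```

Because the source lines are copied verbatim, the extracted code can never be paraphrased or corrupted by a model — only import resolution rewrites anything.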

Contribute a preprocessor for your language.

Refactory Score

A single number (0.0 to 1.0) that measures decomposition quality.

Score = clean_rate × size_reduction
  • clean_rate — modules that load without errors / total modules
  • size_reduction — 1 − (largest module / original file)

A score near 1.0 means every module loads cleanly and the largest module is only a small fraction of the original file.
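The formula above translates directly into code. A small illustrative version, assuming each module reports its line count and whether it loaded cleanly:

```javascript
// Score = clean_rate × size_reduction, per the formula above.
function refactoryScore(modules, originalLines) {
  const cleanRate = modules.filter(m => m.loadsCleanly).length / modules.length;
  const largest = Math.max(...modules.map(m => m.lines));
  return cleanRate * (1 - largest / originalLines);
}

// A 1,000-line file split into three modules, all loading cleanly;
// the largest module is 500 lines, so size_reduction = 0.5.
const score = refactoryScore(
  [
    { lines: 500, loadsCleanly: true },
    { lines: 300, loadsCleanly: true },
    { lines: 200, loadsCleanly: true },
  ],
  1000
);
console.log(score); // 0.5
```

Note that the size term penalizes lopsided splits: one clean 900-line module out of a 1,000-line file scores worse than three balanced ones.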

Provider Routing

You only need one free key for the PLAN step. Extraction is mechanical for supported languages.

| Provider | Model | Output | Context | Free? |
| --- | --- | --- | --- | --- |
| Groq | Llama 3.3 70B | 32k | 128k | Yes |
| Gemini | 2.5 Flash | 16k | 1M | Yes |
| OpenRouter | Qwen 3.6+ | 16k | 1M | Yes |
| SambaNova | MiniMax | 16k | 163k | Yes |

Set at least one: GROQ_API_KEY, GOOGLE_API_KEY, OPENROUTER_API_KEY, or SAMBANOVA_API_KEY.
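A minimal sketch of what "set at least one key" implies: pick the first provider whose environment variable is present. The variable names come from the README; the routing priority shown here is an assumption, not Refactory's documented order.

```javascript
// Hypothetical provider routing: first configured key wins.
const PROVIDERS = [
  { name: 'groq', envVar: 'GROQ_API_KEY' },
  { name: 'gemini', envVar: 'GOOGLE_API_KEY' },
  { name: 'openrouter', envVar: 'OPENROUTER_API_KEY' },
  { name: 'sambanova', envVar: 'SAMBANOVA_API_KEY' },
];

function pickProvider(env = process.env) {
  const hit = PROVIDERS.find(p => env[p.envVar]);
  if (!hit) {
    throw new Error(
      'No LLM key set. The PLAN step needs one of: ' +
      PROVIDERS.map(p => p.envVar).join(', ')
    );
  }
  return hit.name;
}

console.log(pickProvider({ GOOGLE_API_KEY: 'demo' })); // "gemini"
```

Only the PLAN step ever reaches this code path; mechanical extraction for JS/Python runs with no key at all.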

Test Corpus

Found a file that breaks extraction? Submit it:

refactory test submit broken-file.js -d "what went wrong"

Secrets are stripped automatically. Every submission becomes a permanent test case. The extractor gets stronger with every report.

Report via GitHub if you prefer.

License

AGPL-3.0 — see LICENSE.

Premium language packs available under commercial license. See refactory.codedrop.codes.
