ffmpeg-render-pro
MCP server for parallel video rendering with 6 tools: detect_gpu, system_info, render_video, color_grade, merge_audio, concat_videos. Live dashboard, GPU auto-detection, YouTube-optimized output.
╔══════════════════════════════════════════════════════╗
║ ║
║ ████████ ████████ ██ ██ ██████ ████████ ████ ║
║ ██ ██ ███ ███ ██ ██ ██ ██ ║
║ ██████ ██████ ██ ██ ██ ██████ ██████ ██ ██ ║
║ ██ ██ ██ ██ ██ ██ ██ ██ ║
║ ██ ██ ██ ██ ██ ████████ ████ ║
║ ║
║ ██████ ████████ ██ ██ ██████ ████████ ██████ ║
║ ██ ██ ██ ███ ██ ██ ██ ██ ██ ██ ║
║ ██████ ██████ ██ ██ ██ ██ ██ ██████ ██████ ║
║ ██ ██ ██ ██ ████ ██ ██ ██ ██ ██ ║
║ ██ ██ ████████ ██ ██ ██████ ████████ ██ ██ ║
║ ║
║ ██████ ██████ ████ ║
║ ██ ██ ██ ██ ██ ██ ║
║ ██████ ██████ ██ ██ ║
║ ██ ██ ██ ██ ██ ║
║ ██ ██ ██ ████ ║
║ ║
║ ▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓░░░░░░░░░░ 8 WRKRS ║
║ GPU: AUTO DASHBOARD: LIVE CONCAT: INSTANT ║
╚══════════════════════════════════════════════════════╝
ffmpeg-render-pro
Parallel video rendering with live dashboard, GPU auto-detection, checkpoint system, and stream-copy concat. The most powerful free ffmpeg rendering toolkit.
Built by Beeswax Pat with Claude Code · Free and open source forever
Features
- Parallel rendering — Split frames across N worker threads, concat with zero re-encoding
- GPU auto-detection — Probes NVENC, VideoToolbox, AMF, VA-API, QSV with 1-frame validation
- Live dashboard — Auto-opens in your browser with per-worker progress, FPS chart, ETA
- Checkpoint system — 93% reduction in fast-forward overhead for long renders
- Color grading — 5 built-in presets (noir, warm, cool, cinematic, vintage) + custom filters
- Audio merge — Combine video + audio with loudness normalization, no video re-encode
- Deterministic output — Seeded RNG ensures parallel workers produce identical results to sequential
- MCP server — Model Context Protocol server with 6 tools, works with Claude Code, Claude Desktop, and any MCP client
- Cross-platform — Windows, macOS, Linux. Any GPU or CPU-only. Requires Node.js >= 18 + ffmpeg.
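The parallel-rendering idea above can be sketched in a few lines: divide the total frame count into contiguous ranges, one per worker, so each worker encodes an independent segment that concat can later join without re-encoding. (The function name `splitFrames` is illustrative, not part of the library's API.)

```javascript
// Split `totalFrames` into `workers` contiguous [startFrame, endFrame) ranges.
// Each segment is encoded independently, so it begins with its own keyframe
// and the segments can be stream-copy concatenated afterwards.
function splitFrames(totalFrames, workers) {
  const base = Math.floor(totalFrames / workers);
  const extra = totalFrames % workers; // first `extra` workers get one more frame
  const ranges = [];
  let start = 0;
  for (let i = 0; i < workers; i++) {
    const count = base + (i < extra ? 1 : 0);
    ranges.push({ workerId: i, startFrame: start, endFrame: start + count });
    start += count;
  }
  return ranges;
}

// 3600 frames (60 s at 60 fps) across 8 workers: each worker gets 450 frames.
console.log(splitFrames(3600, 8));
```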
Requirements
- Node.js >= 18
- ffmpeg installed and on PATH
Quick Start
# Clone or install
git clone https://github.com/beeswaxpat/ffmpeg-render-pro.git
cd ffmpeg-render-pro
# Run the benchmark (5s test render, dashboard auto-opens)
node examples/render-test.js
# Run a longer test
node examples/render-test.js --duration=30
# YouTube Shorts format (vertical 1080x1920)
node examples/render-test.js --width=1080 --height=1920 --fps=30 --duration=60
# Check your GPU
node bin/ffmpeg-render-pro.js detect-gpu
# System info (workers, RAM, CPU)
node bin/ffmpeg-render-pro.js info
CLI
ffmpeg-render-pro detect-gpu # Probe hardware encoders
ffmpeg-render-pro info # Show system config
ffmpeg-render-pro render <worker.js> # Render with your worker script
ffmpeg-render-pro benchmark # Quick 5s test render
API
const {
renderParallel, // Core: parallel rendering engine
createEncoder, // Pipe raw frames to ffmpeg
detectGPU, // Cross-platform GPU detection
getConfig, // Auto-tune workers, codec selection
concatSegments, // Stream-copy segment joining
colorGrade, // Apply color grades (presets or custom)
mergeAudio, // Combine video + audio
startDashboard, // Live progress dashboard
saveCheckpoint, // Checkpoint serialization
loadCheckpoint, // Checkpoint restoration
} = require('ffmpeg-render-pro');
renderParallel(options)
The main entry point. Splits a render across workers, shows a live dashboard, and produces a final MP4.
await renderParallel({
workerScript: './my-worker.js', // Your frame generator
outputPath: './output.mp4',
width: 1920,
height: 1080,
fps: 60,
duration: 60, // seconds
title: 'My Render',
autoOpen: true, // auto-open dashboard in browser
});
Writing a Worker
Workers receive frame ranges via workerData and pipe raw BGRA frames to ffmpeg:
const { workerData, parentPort } = require('worker_threads');
const { spawn } = require('child_process');
const { width, height, fps, startFrame, endFrame, segmentPath, workerId } = workerData;
// Spawn ffmpeg encoder
const ffmpeg = spawn('ffmpeg', [
  '-y', '-f', 'rawvideo', '-pixel_format', 'bgra',
  '-video_size', `${width}x${height}`, '-framerate', String(fps),
  '-i', 'pipe:0',
  '-c:v', 'libx264', '-preset', 'fast', '-crf', '20',
  '-pix_fmt', 'yuv420p', '-movflags', '+faststart',
  segmentPath,
], { stdio: ['pipe', 'pipe', 'pipe'] });

ffmpeg.on('close', () => parentPort.postMessage({ type: 'done', workerId }));

// `await` needs an async context in a CommonJS worker, so the render loop
// runs inside an async IIFE.
(async () => {
  const buffer = Buffer.alloc(width * height * 4); // one BGRA frame
  const total = endFrame - startFrame;
  const started = Date.now();
  for (let f = startFrame; f < endFrame; f++) {
    // Fill buffer with your frame data (BGRA format)
    renderMyFrame(f, buffer);
    // Write with backpressure
    const ok = ffmpeg.stdin.write(buffer);
    if (!ok) await new Promise(r => ffmpeg.stdin.once('drain', r));
    // Report progress
    const done = f - startFrame + 1;
    const elapsed = (Date.now() - started) / 1000 || 1e-3;
    parentPort.postMessage({
      type: 'progress', workerId, frame: f,
      pct: (done / total) * 100,
      fps: done / elapsed,
      eta: (total - done) / (done / elapsed),
    });
  }
  ffmpeg.stdin.end();
})();
See examples/basic-worker.js for a complete working example.
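The deterministic-output guarantee works because each frame is rendered as a pure function of its frame index: any randomness must come from a generator seeded per frame, never from `Math.random()`. A minimal sketch using mulberry32 (the specific generator and the `frameRng` helper are assumptions for illustration; the library's actual seeding scheme is not documented here):

```javascript
// mulberry32: tiny seedable PRNG — same seed, same sequence,
// no matter which worker evaluates it.
function mulberry32(seed) {
  let a = seed >>> 0;
  return function () {
    let t = (a += 0x6D2B79F5);
    t = Math.imul(t ^ (t >>> 15), t | 1);
    t ^= t + Math.imul(t ^ (t >>> 7), t | 61);
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

// Seed per frame, so worker 0 rendering frame 100 and worker 3 rendering
// frame 100 produce byte-identical pixels.
function frameRng(globalSeed, frameIndex) {
  return mulberry32(globalSeed ^ frameIndex);
}

const a = frameRng(42, 100)();
const b = frameRng(42, 100)();
// a === b — deterministic regardless of which worker calls it
```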
Modules
| Module | Purpose |
|---|---|
| parallel-renderer | N-worker thread pool with progress tracking |
| encoder | Raw frame pipe to ffmpeg with backpressure |
| gpu-detect | Cross-platform hardware encoder discovery + validation |
| config | Auto-tune workers based on resolution, RAM, CPU |
| concat | Stream-copy segment joining (instant) |
| color-grade | ffmpeg video filter presets + custom chains |
| audio-merge | Video + audio merge with loudnorm support |
| dashboard-server | Zero-dep HTTP server with auto-open browser |
| progress | Per-worker terminal + JSON progress tracking |
| checkpoint | State serialization for long renders |
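Stream-copy concat is why joining segments is effectively instant: ffmpeg's concat demuxer re-muxes packets without decoding or re-encoding. A sketch of the list file and arguments such a join typically uses (the concat module's real implementation may differ; `buildConcatList` and `concatArgs` are illustrative names):

```javascript
// Build a concat demuxer list file: one `file` directive per segment.
// Single quotes inside paths must be escaped for ffmpeg's list parser.
function buildConcatList(segmentPaths) {
  return segmentPaths
    .map((p) => `file '${p.replace(/'/g, "'\\''")}'`)
    .join('\n') + '\n';
}

// Arguments for a stream-copy join: `-c copy` re-muxes packets without
// re-encoding, so even a multi-GB join finishes in seconds.
function concatArgs(listPath, outputPath) {
  return ['-y', '-f', 'concat', '-safe', '0', '-i', listPath, '-c', 'copy', outputPath];
}

const list = buildConcatList(['seg-0.mp4', 'seg-1.mp4']);
// list === "file 'seg-0.mp4'\nfile 'seg-1.mp4'\n"
```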
Benchmarks
Run your own:
node examples/render-test.js --duration=5
node examples/render-test.js --duration=30
node examples/render-test.js --duration=60 --width=1080 --height=1920
MCP Server
ffmpeg-render-pro includes a Model Context Protocol (MCP) server with 6 tools. Works with Claude Code, Claude Desktop, and any MCP client.
Add to Claude Code
claude mcp add --transport stdio ffmpeg-render-pro -- npx -y ffmpeg-render-pro-mcp
# Or if installed locally:
claude mcp add --transport stdio ffmpeg-render-pro -- node /path/to/src/mcp-server.mjs
Add to Claude Desktop
Add to your claude_desktop_config.json:
{
"mcpServers": {
"ffmpeg-render-pro": {
"command": "node",
"args": ["/path/to/ffmpeg-render-pro/src/mcp-server.mjs"]
}
}
}
MCP Tools
| Tool | Description |
|---|---|
| detect_gpu | Probe hardware encoders (NVENC, VideoToolbox, AMF, VA-API, QSV) |
| system_info | Show CPU cores, RAM, recommended workers, ffmpeg version |
| render_video | Parallel render with live dashboard |
| color_grade | Apply presets (noir, warm, cool, cinematic, vintage) or custom filters |
| merge_audio | Combine video + audio with loudness normalization |
| concat_videos | Stream-copy join multiple videos (instant, no re-encode) |
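As a rough idea of what color_grade and merge_audio do at the ffmpeg level, here is how presets like these commonly map to filter chains. The specific filter strings below are illustrative assumptions, not the library's actual preset definitions:

```javascript
// Hypothetical preset -> ffmpeg -vf filter-chain mapping.
const PRESETS = {
  noir: 'hue=s=0,eq=contrast=1.2',                          // desaturate, punchier contrast
  warm: 'colorbalance=rm=0.1:bm=-0.1',                      // push reds, pull blues
  cool: 'colorbalance=rm=-0.1:bm=0.1',                      // the reverse
  cinematic: 'eq=contrast=1.1:saturation=0.9,curves=preset=darker',
  vintage: 'curves=preset=vintage,noise=alls=6:allf=t',     // faded tones + film grain
};

// Grade: re-encode video through the filter, copy audio untouched.
function colorGradeArgs(input, output, preset) {
  return ['-y', '-i', input, '-vf', PRESETS[preset], '-c:a', 'copy', output];
}

// Merge: copy the video stream, normalize loudness on the new audio.
function mergeAudioArgs(video, audio, output) {
  return ['-y', '-i', video, '-i', audio,
          '-map', '0:v', '-map', '1:a',
          '-c:v', 'copy', '-af', 'loudnorm', '-shortest', output];
}
```

Copying the video stream (`-c:v copy`) in the audio merge is what makes "no video re-encode" possible: only the audio passes through a filter.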
Claude Code Skill
This repo includes a ready-to-use Claude Code skill. To install it, copy the skill folder into your Claude skills directory:
# macOS / Linux
cp -r .claude/skills/ffmpeg-render-pipeline ~/.claude/skills/
# Windows
xcopy .claude\skills\ffmpeg-render-pipeline %USERPROFILE%\.claude\skills\ffmpeg-render-pipeline\ /E /I
Once installed, Claude Code will automatically use the skill when you ask it to render video or audio with ffmpeg.
License
MIT
Author
Beeswax Pat