Open-source, AI-powered video production engine built on Remotion.
Define a timeline in TypeScript, point it at your footage and transcript, render a polished MP4.
Built for product demos, LinkedIn content, YouTube videos, and social media clips.
OpenCut is a programmatic video production engine that turns raw footage, screen recordings, and transcripts into professional MP4 videos using code.
Unlike traditional video editors that lock you into GUIs and proprietary formats, OpenCut treats video as code — fully version-controlled, reproducible, and automatable.
Get a rendered MP4 in under 5 minutes:
```sh
# 1. Clone and install
git clone https://github.com/floomhq/opencut.git
cd opencut
npm install

# 2. Scaffold a new video project
npx opencut-init my-video

# 3. Drop your facecam.mp4 into public/

# 4. Edit the timeline
#    src/examples/my-video/timeline.ts

# 5. Validate assets and timings
npx opencut-validate src/examples/my-video/timeline.ts

# 6. Render your video
npx opencut-render my-video
```
Find your finished video at out/my-video.mp4.
For a detailed walkthrough, see QUICKSTART.md.
| Problem | Traditional Editors | OpenCut |
|---|---|---|
| Batch rendering | Manual export per video | `npx opencut-render project-name` |
| Subtitles | Manual typing or expensive services | Whisper AI → auto-generated word-level captions |
| Version control | Binary project files | Plain-text TypeScript — diffable, reviewable |
| Automation | No API | Full CLI + programmable timeline |
| Custom effects | Limited presets | React components + plugin system |
| Reproducibility | "It works on my machine" | Deterministic rendering from code |
```sh
git clone https://github.com/floomhq/opencut.git
cd opencut
npm install
```

To use the CLI anywhere, install it globally:

```sh
npm install -g opencut
```

Or install locally in your project:

```sh
npm install opencut
```
```
Record ──► Transcribe ──► Configure ──► Validate ──► Render
   │            │              │            │           │
   │            │              │            │           ▼
facecam     Whisper CLI    timeline.ts  validate.ts  out/*.mp4
+ screen    --word_        + config.ts  checks assets
            timestamps
```
Transcription runs the Whisper CLI (with `--word_timestamps True`) for word-level captions.

| Command | Description |
|---|---|
| `npx opencut-init <name>` | Scaffold a new project with config, timeline, subtitles, `Root.tsx`, and `index.ts` |
| `npx opencut-transcribe <video>` | Run Whisper AI transcription or generate a subtitle template |
| `npx opencut-validate <timeline.ts>` | Check asset paths, segment durations, and timeline contiguity |
| `npx opencut-render <name>` | Render the full video for a project |
| `npx opencut-render <name> --preview` | Open the Remotion Studio preview UI |
| `npx opencut-render <name> --watch` | Watch files and auto-re-render on changes |
| `npx opencut-render <name> --frames 0-149` | Render a specific frame range |
| `npm run typecheck` | Run TypeScript type checking across the entire codebase |
| `npm run test` | Run the full test suite (125 tests) |
| `npm run test:coverage` | Run tests with code coverage reporting |
| `npm run build` | Compile TypeScript to JavaScript with declaration files |
| `npx remotion studio src/examples/<name>/index.ts` | Open the Remotion preview UI in the browser |
`opencut-render --watch` auto-re-renders when you save `timeline.ts`, `config.ts`, or `Root.tsx`.

Music sync is built in: `beatsToFrames`, `barsToFrames`, and synced volume curves.

| Component | Description |
|---|---|
| `VideoComposition` | Top-level component. Takes timeline, config, and subtitles; sequences everything into the final video. |
| `Segment` | Renders one timeline segment: background (facecam/screenshot/video/slides) + overlays. |
| `FaceBubble` | Circular face-cam picture-in-picture. Configurable position, size, and filters. |
| `SubtitleOverlay` | Groups words into 3–7 word phrases and highlights the active word. AI-transcribed captions from Whisper. |
| `KeywordOverlay` | Large uppercase text with scale-in animation, positioned at the top. |
| `TitleCard` | Full-screen title overlay with configurable text and colors. |
| `EndCard` | Full-screen end card with CTA button and URL pill. |
| `NotificationBanner` | macOS-style slide-in notifications. Presets: WhatsApp (green), iMessage (blue), generic (gray). |
| `BackgroundEffects` | Generative SVG backgrounds: orbs, particles, grid, waves, dots, vignette. |
| `TypingText` | Typewriter animation with optional blinking cursor, speed control, and accent glow. |
| `CrossfadeScene` | Wraps a scene in `<AbsoluteFill>` with asymmetric fade-in / fade-out opacity. |
| `AnimatedDiagram` | Step-by-step animated diagrams with reveal timing. |
| `ComparisonTable` | Side-by-side feature comparison with animated rows. |
| `AppMockup` | Device frame mockups for app demos. |
| `AudioWaveform` | Real-time audio waveform visualization. |
| `VideoBackground` | Full-motion video backgrounds. |
All style props (`SubtitleStyle`, `KeywordStyle`, `CardStyle`, `EndCardStyle`) are optional overrides on `VideoComposition`.
Import pure helpers from `src/engine/animation`:

- `generateParticles(count, seed)` — seeded, deterministic particle array
- `updateParticlePosition(particle, frame, bounds)` — frame-based position with wrapping
- `waveMotion(frame, frequency, amplitude, phase)` — sine-wave motion
- `breathe(frame, minScale, maxScale, speed)` — pulsing scale between two values
- `glitchOffset(frame, intensity)` — randomized offset for glitch effects
- `lerpColor(color1, color2, t)` — linear interpolation between hex colors
- `stagger(index, totalItems, totalDurationSeconds, frame, fps)` — 0–1 reveal progress for list items
- `typewriter(text, frame, fps, charsPerSecond)` — substring for frame-accurate typing
- `animatedCounter(from, to, progress, decimals)` — number interpolation with easeOutExpo
- `getKenBurnsTransform(preset, progress)` — cinematic scale/translate for background footage
- `simulateAudioReactivity(frame, intensity)` — frame-pulsed 0–1 value for bass-hit simulation
- `springPresets` — pre-configured spring physics presets for natural motion
- `easing` — curated easing functions: `easeInOutCubic`, `easeOutExpo`, `easeInBack`, etc.

Import beat-aware helpers from `src/engine/MusicSync`:
- `getGridBPM(fps, framesPerBeat)` — derive target BPM from grid settings
- `getBeatSyncPlaybackRate(sourceBPM, targetBPM)` — playback rate to lock music to tempo
- `beatsToFrames(beats, bpm, fps)` — exact frame count for N beats
- `barsToFrames(bars, bpm, fps)` — exact frame count for N bars (4/4)
- `buildSceneStarts(durations, overlapFrames)` — compute crossfade start frames
- `getSyncedMusicVolume(frame, totalFrames, baseVolume, fadeInFrames, fadeOutFrames)` — volume curve that fades in at the start and out at the end

Add a `backgroundEffect` field to any `TimelineSegment`:
```ts
backgroundEffect: {
  type: "orbs", // or "particles" | "grid" | "waves" | "dots" | "vignette"
  accentColor: "#3b82f6",
  intensity: 0.5,
  orbCount: 2,       // only for "orbs"
  particleCount: 40, // only for "particles"
}
```
Effects are rendered as SVG or CSS layers behind the segment content and animate automatically based on the current frame.
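To build intuition for the `MusicSync` helpers listed above, here is a minimal standalone sketch of the beat-to-frame arithmetic. This is a hypothetical reimplementation for illustration, not the library's actual source (which lives in `src/engine/MusicSync`); the rounding strategy is an assumption.

```typescript
// Hypothetical sketch of beat-to-frame math (not the actual MusicSync source).
// One beat lasts 60 / bpm seconds; multiply by fps to get frames.
// Rounding to whole frames is an assumption of this sketch.
function beatsToFrames(beats: number, bpm: number, fps: number): number {
  return Math.round(beats * (60 / bpm) * fps);
}

// A 4/4 bar is four beats.
function barsToFrames(bars: number, bpm: number, fps: number): number {
  return beatsToFrames(bars * 4, bpm, fps);
}

// At 120 BPM and 30 fps, one beat is 0.5 s = 15 frames.
console.log(beatsToFrames(1, 120, 30)); // 15
console.log(barsToFrames(2, 120, 30)); // 120
```

Deriving durations from beats like this is what lets crossfades and cuts land exactly on the music grid instead of drifting by fractional frames.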
OpenCut supports plugins for custom segment types, background effects, and timeline transformations.
```ts
import { registerPlugin } from "./engine";

registerPlugin({
  name: "my-plugin",
  segmentRenderers: {
    "custom-scene": CustomSceneComponent,
  },
  backgroundEffectRenderers: {
    "stars": StarsEffectComponent,
  },
  transformTimeline: (timeline) => {
    // Modify or augment timeline before rendering
    return timeline;
  },
});
```
| Hook | Description |
|---|---|
| `segmentRenderers` | Map segment type strings to React components, rendered instead of built-in segments. |
| `backgroundEffectRenderers` | Map `backgroundEffect.type` strings to React components. |
| `transformTimeline` | Receives the full timeline array and returns a modified timeline. Applied in registration order. |
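Since transforms are applied in registration order, `applyTimelineTransforms` behaves like a left-to-right reduce over the registered plugins. A simplified standalone sketch of that idea (hypothetical illustration, not the actual registry code; the inline `Segment` type is invented for the example):

```typescript
// Hypothetical sketch of ordered timeline transforms (not the real plugin registry).
type Segment = { id: string; durationSec: number };
type TimelineTransform = (timeline: Segment[]) => Segment[];

const transforms: TimelineTransform[] = [];

function registerTransform(t: TimelineTransform): void {
  transforms.push(t);
}

// Apply every registered transform, in registration order.
function applyTimelineTransforms(timeline: Segment[]): Segment[] {
  return transforms.reduce((acc, transform) => transform(acc), timeline);
}

// Example: one "plugin" pads every segment by 1 s,
// a second drops any segment that is still 1 s or shorter.
registerTransform((t) => t.map((s) => ({ ...s, durationSec: s.durationSec + 1 })));
registerTransform((t) => t.filter((s) => s.durationSec > 1));

const result = applyTimelineTransforms([
  { id: "intro", durationSec: 0 },
  { id: "demo", durationSec: 20 },
]);
console.log(result); // [{ id: "demo", durationSec: 21 }]
```

Because order matters, a plugin that filters segments sees the output of every plugin registered before it — register normalizing transforms first and pruning transforms last.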
- `registerPlugin(plugin)` — Register a plugin globally.
- `unregisterPlugin(name)` — Remove a plugin by name.
- `clearPlugins()` — Remove all plugins.
- `getSegmentRenderer(type)` — Look up a custom segment renderer.
- `getBackgroundEffectRenderer(type)` — Look up a custom background effect.
- `applyTimelineTransforms(timeline)` — Apply all registered timeline transforms.

Generate and browse TypeDoc API docs for the engine:
```sh
# Generate docs (outputs to docs/api/)
npm run docs:generate

# Serve docs locally on http://localhost:3000
npm run docs:serve
```
The generated documentation covers all engine components, TypeScript types, interfaces, and utilities exported from `src/engine/index.ts` — including `VideoComposition`, `Segment`, `FaceBubble`, `SubtitleOverlay`, `KeywordOverlay`, `TitleCard`, `EndCard`, `NotificationBanner`, `BackgroundEffects`, `TypingText`, `CrossfadeScene`, animation helpers, music sync utilities, and their associated configuration types.
```
src/
  engine/                    # Reusable video components and utilities
    types.ts                 # All TypeScript interfaces
    Composition.tsx          # Sequences timeline segments + audio
    Segment.tsx              # Renders one segment (background + overlays)
    FaceBubble.tsx           # Circular face-cam PiP
    SubtitleOverlay.tsx      # Word-level captions with active word highlight
    KeywordOverlay.tsx       # Large keyword text overlays
    TitleCard.tsx            # Full-screen title card
    EndCard.tsx              # Full-screen end card with CTA
    NotificationBanner.tsx   # Slide-in notification popups
    BackgroundEffects.tsx    # Generative background effects
    CrossfadeScene.tsx       # Reusable fade-in/fade-out scene wrapper
    TypingText.tsx           # Typewriter text animation with cursor
    animations.ts            # Pure animation utilities
    MusicSync.ts             # Beat-to-frame math and synced volume curves
    plugin.ts                # Plugin registry for custom renderers
    index.ts                 # Public API exports
  cli/
    init.ts                  # Scaffold a new video project
    transcribe.ts            # Whisper wrapper / subtitle template generator
    validate.ts              # Timeline and asset validator
    render.ts                # Render helper with --watch, --preview, --frames
  workflow/
    loader.ts                # JSON/YAML project config loader
    validator.ts             # Project config validation
    generator.ts             # Remotion file generator from project config
    types.ts                 # Workflow types
  examples/
    quickstart/              # Minimal 2-segment example
    openslides/              # Product demo video
    format-demo/             # Background effects and multi-format demo
    ai-engineer-basics/      # Long-form educational video
    floom-launch/            # SaaS product launch with beat-synced crossfades
    hyperniche-launch/       # Competitive comparison + animated diagram
    opendraft-research/      # Educational research with kinetic typography
```
Create a folder under `src/examples/your-project/` with three files — `config.ts`, `timeline.ts`, and `index.ts` — plus a `subtitles` file (generate one with `npx opencut-transcribe`):
```ts
// config.ts
import type { VideoConfig } from "../../engine";

export const MY_CONFIG: VideoConfig = {
  playbackRate: 1.2,
  fps: 30,
  width: 1920,
  height: 1080,
  crossfadeFrames: 10,
  facecamAsset: "my-facecam.mov",
  bgMusicAsset: "music.mp3",
  bgMusicVolume: 0.06,
};
```
```ts
// timeline.ts
import type { TimelineSegment } from "../../engine";

export const TIMELINE: TimelineSegment[] = [
  {
    id: "intro",
    type: "facecam-full",
    facecamStartSec: 0,
    durationSec: 10,
    faceBubble: "hidden",
    showSubtitles: true,
    showTitleCard: true,
    backgroundEffect: {
      type: "orbs",
      accentColor: "#3b82f6",
      intensity: 0.5,
      orbCount: 2,
    },
  },
  {
    id: "demo",
    type: "screen-static",
    facecamStartSec: 10,
    durationSec: 20,
    faceBubble: "bottom-left",
    showSubtitles: true,
    screenImage: "my-screenshot.png",
    keywords: [
      { text: "Key Feature", startSec: 12, endSec: 16 },
    ],
    backgroundEffect: {
      type: "grid",
      accentColor: "#a78bfa",
      intensity: 0.4,
    },
  },
];
```
```tsx
// index.ts
import React from "react";
import { Composition } from "remotion";
import { VideoComposition, computeTotalFrames } from "../../engine";
import { MY_CONFIG } from "./config";
import { TIMELINE } from "./timeline";
import { SUBTITLE_SEGMENTS } from "./subtitles";

const totalFrames = computeTotalFrames(TIMELINE, MY_CONFIG);

const MyVideo: React.FC = () => (
  <VideoComposition
    timeline={TIMELINE}
    videoConfig={MY_CONFIG}
    subtitleSegments={SUBTITLE_SEGMENTS}
    titleCardTitle="My Product"
    titleCardSubtitle="The tagline"
    endCardTitle="My Product"
    endCardCtaText="Try it now"
    endCardUrl="myproduct.com"
  />
);

export const RemotionRoot: React.FC = () => (
  <Composition
    id="MyVideo"
    component={MyVideo}
    durationInFrames={totalFrames}
    fps={MY_CONFIG.fps}
    width={MY_CONFIG.width}
    height={MY_CONFIG.height}
  />
);
```
Place assets in `public/` (create the directory if it doesn't exist — it is gitignored), then render:

```sh
npx remotion render src/examples/your-project/index.ts MyVideo out/my-video.mp4
```

On CI or server environments, wrap the render in `timeout` to prevent runaway renders:

```sh
timeout 10m npx remotion render src/examples/openslides/index.ts OpenSlidesDemo out/video.mp4
```
| Example | Description | Render Command |
|---|---|---|
| quickstart | Minimal title + end card | `npm run test-render` |
| openslides | Product demo with facecam + slides | `npm run render` |
| format-demo | Background effects + typing text + multi-format | `npx remotion render src/examples/format-demo/index.ts FormatDemo out/format-demo.mp4` |
| ai-engineer-basics | Long-form educational video with inline panels | `npm run ai-basics:render` |
| floom-launch | SaaS launch with beat-synced crossfades | `npm run floom:render` |
| hyperniche-launch | Competitive comparison + animated diagram | `npm run hyperniche:render` |
| opendraft-research | Research video with kinetic typography | `npm run opendraft:render` |
We welcome contributions! Please see CONTRIBUTING.md for guidelines on reporting issues, submitting pull requests, and coding standards.
See CHANGELOG.md for a detailed history of releases and changes.
OpenCut is released under the MIT License.
Built with ❤️ by Federico De Ponte