HyperFrames: Claude Code Can Now Write and Render Videos
AI coding agents are good at writing code. HTML is code. Video compositions are HTML. Therefore — AI coding agents should be able to write videos.
That’s the insight behind HyperFrames, an open-source video rendering framework from HeyGen that ships with Claude Code, Cursor, and Gemini CLI skills pre-installed.
The workflow: describe the video you want → agent writes the HTML composition → HyperFrames renders it to MP4 on your machine.
npx skills add heygen-com/hyperframes
Then in Claude Code:
Using /hyperframes, create a 10-second product intro with a fade-in title,
a background video, and background music.
The agent handles scaffolding, animation, and composition. You iterate in plain English like you would with a video editor.
Why HTML as the Video Format
Most video generation tools use proprietary formats — JSON timelines, visual node editors, DSLs that only the tool itself understands. Agents can’t write these reliably because they weren’t trained on them.
Agents have been trained on billions of lines of HTML. HyperFrames exploits this by making HTML the composition format:
<div id="stage"
     data-composition-id="my-video"
     data-width="1920"
     data-height="1080">
  <video id="clip-1"
         data-start="0"
         data-duration="5"
         data-track-index="0"
         src="intro.mp4" muted playsinline>
  </video>
  <img id="overlay"
       class="clip"
       data-start="2"
       data-duration="3"
       data-track-index="1"
       src="logo.png" />
  <audio id="bg-music"
         data-start="0"
         data-duration="9"
         data-volume="0.5"
         src="music.wav">
  </audio>
</div>
That’s a 9-second video composition: a background clip for the first 5 seconds, a logo overlay that appears at 2 seconds and holds for 3, and background music at 50% volume running the full 9 seconds. An agent can write this correctly on the first try because it already knows HTML.
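The timing model these data-* attributes imply is simple: each element is active for the interval [data-start, data-start + data-duration), and the composition runs until the last element ends. A quick sketch of that arithmetic in plain JavaScript (illustrative only, not HyperFrames internals):

```javascript
// Composition length = latest end time across all elements.
// Values mirror the example above; this is just the arithmetic
// the data-* attributes imply, not HyperFrames' renderer.
const clips = [
  { id: "clip-1",   start: 0, duration: 5 }, // background video
  { id: "overlay",  start: 2, duration: 3 }, // logo, visible 2s-5s
  { id: "bg-music", start: 0, duration: 9 }, // music at half volume
];
const compositionEnd = Math.max(...clips.map(c => c.start + c.duration));
console.log(compositionEnd); // 9 — matching the 9-second composition above
```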
Animations use GSAP (or Lottie, CSS, Three.js via the Frame Adapter pattern). The /gsap slash command gives the agent the context it needs to write smooth, correct animations without hallucinating the API.
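As an illustration of what the agent writes, here is a fade-in title using standard GSAP calls. Two assumptions are mine, not from the HyperFrames docs: that an element with id="title" exists in the composition markup (a text clip carrying the same data-* attributes as the img overlay above), and that the renderer steps the timeline per frame via the Frame Adapter rather than wall-clock playback:

```javascript
// Standard GSAP API; assumes a #title element exists in the composition.
// paused: true so the renderer, not wall-clock time, drives playback.
const tl = gsap.timeline({ paused: true });
tl.fromTo("#title",
  { opacity: 0, y: 40 },                                  // from: hidden, shifted down
  { opacity: 1, y: 0, duration: 1, ease: "power2.out" }); // to: visible, in place
// A deterministic renderer can then sample any frame time t:
tl.seek(1.5); // puts #title in its exact state 1.5s into the animation
```

The paused timeline is what makes this render-friendly: seeking to a time puts every animated property into its exact state for that frame, with no dependence on real-time playback.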
The CLI Workflow
# Start a new project
npx hyperframes init my-video
cd my-video
# Preview in browser with live reload
npx hyperframes preview
# Render to MP4
npx hyperframes render
# Add pre-built components
npx hyperframes add flash-through-white # shader transition
npx hyperframes add instagram-follow # social overlay
npx hyperframes add data-chart # animated bar chart
The catalog offers 50+ pre-built blocks: social overlays, shader transitions, data visualizations, and cinematic effects. Add one to a project and the agent can incorporate it.
Requirements: Node.js >= 22, FFmpeg. Rendering is local — no cloud, no API calls, no usage limits.
Agent Prompt Patterns That Work
Cold start — describe what you want:
Using /hyperframes, create a 10-second product intro with a fade-in title,
a background video, and background music.
Turn existing content into a video:
Summarize the attached PDF into a 45-second pitch video using /hyperframes.
Turn this CSV into an animated bar chart race using /hyperframes.
Format-specific:
Make a 9:16 TikTok-style hook video about [topic] using /hyperframes,
with bouncy captions synced to a TTS narration.
Iterate conversationally:
Make the title 2x bigger, swap to dark mode, and add a fade-out at the end.
Add a lower third at 0:03 with my name and title.
The agent edits the HTML. Preview updates live in the browser. Render when satisfied.
Why This Is Significant
Video generation has been a tool-use problem for AI. Text-to-video models (Sora, Veo, Seedance) generate pixels directly but give you limited compositional control — you can’t say “add a lower third at 3 seconds with this exact text in this exact font.” They’re probabilistic, not deterministic.
HyperFrames takes the opposite approach: the agent writes the composition as code, and a deterministic renderer converts it to video. Same input = identical output, every time. That makes it suitable for:
- Automated content pipelines — generate product videos from a database of SKUs
- Report-to-video — convert analytics dashboards or CSVs to animated presentations
- CI/CD video — auto-generate release notes videos on every deployment
- Personalized video at scale — generate thousands of personalized videos programmatically
The HeyGen connection makes sense here — HeyGen is the leading AI video platform for business, specializing in personalized and localized video at scale. HyperFrames is the open-source composition layer that sits underneath.
Resources
- GitHub: github.com/heygen-com/hyperframes
- Docs: hyperframes.heygen.com
- Catalog: hyperframes.heygen.com/catalog
- npm: npmjs.com/package/hyperframes