Runway brings precise camera controls to AI videos


Content creators will have more control over the look and feel of their AI-generated videos thanks to a new feature set coming to Runway’s Gen-3 Alpha model.

Advanced Camera Control is rolling out on Gen-3 Alpha Turbo starting today, the company announced via a post on X (formerly Twitter).

“Advanced Camera Control is now available for Gen-3 Alpha Turbo. Choose both the direction and intensity of how you move through your scenes for even more intention in every shot.” — Runway (@runwayml), November 1, 2024

The new Advanced Camera Control feature expands on the model’s existing capabilities. With it, users can “move horizontally while panning to arc around subjects … Or, move horizontally while panning to explore locations,” per the company. They can also customize the direction and intensity of how the camera moves through a scene “for even more intention in every shot,” and combine “outputs with various camera moves and speed ramps for interesting loops.”
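To make the direction-and-intensity idea concrete, here is a purely illustrative sketch of how such a camera move could be described as structured parameters. The class and field names (horizontal, pan, zoom, intensity) are hypothetical and are not taken from Runway’s product or documentation; they only show how a horizontal move and a pan might be combined into a single arcing shot.

```python
# Hypothetical illustration only: these dataclasses and field names are not
# Runway's API; they simply model "direction plus intensity" camera moves.
from dataclasses import dataclass, asdict
import json


@dataclass
class CameraMove:
    horizontal: float = 0.0  # negative = move left, positive = move right
    vertical: float = 0.0    # negative = move down, positive = move up
    pan: float = 0.0         # negative = pan left, positive = pan right
    tilt: float = 0.0        # negative = tilt down, positive = tilt up
    zoom: float = 0.0        # negative = zoom out, positive = zoom in
    intensity: float = 1.0   # scales how strongly the move is applied


# Combine a rightward horizontal move with a leftward pan to arc around a subject.
arc_around_subject = CameraMove(horizontal=0.6, pan=-0.6, intensity=0.8)

# Quickly zoom out to reveal new context around the scene.
reveal_zoom_out = CameraMove(zoom=-1.0, intensity=1.0)

print(json.dumps(asdict(arc_around_subject), indent=2))
```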

Unfortunately, since the new feature is restricted to Gen-3 Alpha Turbo, you will need to subscribe to the $12-per-month Standard plan to access that model and try out the camera controls for yourself.

“Or quickly zoom out to reveal new context and story.” — Runway (@runwayml), November 1, 2024

Runway debuted the Gen-3 Alpha model in June, billing it as a “major improvement in fidelity, consistency, and motion over Gen-2, and a step towards building General World Models.” Gen-3 powers all of Runway’s text-to-video, image-to-video, and text-to-image tools. The system is capable of generating photorealistic depictions of humans, as evidenced in the X post, as well as creating outputs in a wide variety of artistic styles.

Advanced Camera Control arrives roughly a month after Runway revealed Gen-3’s new video-to-video capabilities in mid-September, which allow users to edit and “reskin” a generated video in another artistic style using only text prompts. When combined with Apple’s Vision Pro AR headset, the results are striking. The company also announced the release of an API so that developers can integrate Gen-3’s abilities into their own apps and products.
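For developers wondering what wiring a text-to-video service into an app might look like, the sketch below submits a prompt to a generation endpoint and polls for the finished clip. The base URL, paths, field names, and statuses are assumptions made for illustration and are not Runway’s documented API; consult the official developer documentation for the real interface.

```python
# Illustrative only: endpoint paths, JSON fields, and status values below are
# assumed placeholders, not taken from Runway's API documentation.
import os
import time

import requests

API_BASE = "https://api.example-video-provider.com/v1"  # placeholder base URL
API_KEY = os.environ["VIDEO_API_KEY"]
HEADERS = {"Authorization": f"Bearer {API_KEY}"}


def generate_clip(prompt: str, seconds: int = 5) -> str:
    """Submit a text-to-video job and return a URL to the finished clip."""
    resp = requests.post(
        f"{API_BASE}/generations",
        headers=HEADERS,
        json={"prompt": prompt, "duration_seconds": seconds},
        timeout=30,
    )
    resp.raise_for_status()
    job_id = resp.json()["id"]

    # Poll the job until it reports completion or failure.
    while True:
        status = requests.get(
            f"{API_BASE}/generations/{job_id}", headers=HEADERS, timeout=30
        ).json()
        if status["status"] == "succeeded":
            return status["output_url"]
        if status["status"] == "failed":
            raise RuntimeError(f"Generation failed: {status.get('error')}")
        time.sleep(5)


if __name__ == "__main__":
    print(generate_clip("A slow arcing shot around a lighthouse at dusk"))
```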

The new camera controls could soon be put to use by film editors at Lionsgate, the studio behind the John Wick and The Hunger Games franchises, which signed a deal with Runway in September to “augment” humans’ efforts with AI-generated video content. The deal reportedly centers on the startup building and training a new generative AI model fine-tuned on Lionsgate’s 20,000-title catalog of films and television series.
