Runway Text-to-Video: Fun Gimmick, or the Future of Video Editing?

Runway is a browser-based video editing tool. On the surface, it doesn’t look much different from popular editors like Premiere Pro or DaVinci Resolve. Sure, it has filters, text, multitrack audio, keyframe animations, multiple layers, and basic editing functionality, but what sets Runway apart is its AI-powered tools: text-to-video, rotoscoping, and content-aware fill. In a recent tweet, Runway shared a promotional video for its upcoming text-to-video tool. But it remains to be seen whether this will be just a gimmick with a big marketing push, or a genuinely useful tool for editors.

Text-to-Video

Runway’s newest feature, currently available only by signing up for a waitlist, is AI text-to-video. The tool promises to edit and generate videos from descriptions written in natural language. The promotional video shows what Runway envisions the tool will be able to do: a user types commands to import a video of a city street, make it cinematic (by applying a color grade), and remove an object (likely using the Inpainting content-aware fill tool). The tool is then shown generating images of a lush garden, cycling through various styles. It is unclear whether these are images pulled from an image search or generated by AI models like the recently popular DALL-E or Midjourney. The editor is then shown converting basic text commands into actual edits. This apparent ability to understand and act on natural language prompts is an interesting and promising leap in technology, and it may make advanced editing more accessible to non-professionals.

The first question here is whether this is faster than normal editing. Seasoned editors get faster with practice, with keyboard and mouse shortcuts, and with macros. For certain actions, it might take longer to type out a command than to simply perform the action.

The second question (which is tied to the first) is whether these commands are accurate. Even looking at just the commands in this video, it’s clear that the results may need so much tweaking that the feature doesn’t actually save time, and it certainly won’t be as instantaneous as promised.

  • Import – Is the above example importing a stock video, or something that was manually imported? If it’s the latter, did the AI scan the contents of the imported video and add that metadata?
  • Scale in over time – What is the duration of the animation? How much does it zoom? Does it scale toward the center point or a different point?
  • Fade in text – What font type and color is the text? How long is the fade?
  • Blur the background – How much blur is applied? What style of blur is it?

Of the commands listed, the one that seems likely to work as intended is green screening the character. Making the background black and white is also likely easy. But at that point, is it any faster than typing “black and white” into a list of filters and applying it? Maybe it is faster to type out the action to apply something like a blur, and then adjust from that baseline if you’re unhappy with the results.
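
To make the ambiguity concrete, here is a hypothetical sketch (not Runway’s implementation) of what “blur the background” has to resolve into behind the scenes: a specific blur style and a specific amount. It uses Python and OpenCV, and the file name and kernel size are arbitrary placeholders, the kind of baseline you would then adjust by hand.

```python
# Hypothetical sketch only: "blur" still has to become a concrete style and amount.
import cv2

frame = cv2.imread("frame.png")  # placeholder frame pulled from the timeline

# Gaussian blur with an arbitrary 31x31 kernel -- this answers "how much blur?"
blurred = cv2.GaussianBlur(frame, (31, 31), 0)

# A box blur would be a different "style of blur"; the prompt doesn't say which:
# blurred = cv2.blur(frame, (31, 31))

cv2.imwrite("frame_blurred.png", blurred)
```

Every one of those hard-coded numbers is a decision the AI would have to guess on your behalf.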

Overall, the text-to-video feature seems like a very interesting leap in technology. But it is yet to be determined whether this is a cool showcase of AI tech, or if it will help editors speed up their workflow.

Luckily, Runway does offer some other useful tools that might be worth your time. And you can actually use them now instead of getting on a waitlist.

Green Screen

Runway’s Green Screen tool (its take on rotoscoping) already has high-profile users and some significant success stories. One of the most prominent is the graphics team of The Late Show with Stephen Colbert, which claims to have cut its rotoscoping workflow from several hours to just minutes. Rotoscoping is clearly Runway’s strength: the tool identifies people and objects with impressive ease. And if you need to combine that with a simple edit, it can do that too. It’s easy to imagine Runway as part of a larger editing workflow, but major productions likely won’t be using it as their sole editing tool any time soon.
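
For context on what the tool actually produces: a rotoscoping pass generates a matte, a per-pixel mask separating the subject from everything else. The sketch below is not Runway’s method, just a minimal Python/OpenCV illustration of how a matte gets used once you have one, with placeholder file names and the assumption that all three images are the same size.

```python
# Illustrative only -- Runway generates the matte with its own AI model.
# This just composites a subject over a new background using that matte.
import cv2
import numpy as np

frame = cv2.imread("subject_frame.png").astype(np.float32)        # original footage frame
background = cv2.imread("new_background.png").astype(np.float32)  # replacement background
matte = cv2.imread("matte.png", cv2.IMREAD_GRAYSCALE).astype(np.float32) / 255.0

# White areas of the matte keep the subject; black areas reveal the new background.
alpha = matte[..., None]  # shape (H, W, 1) so it broadcasts across the color channels
composite = alpha * frame + (1.0 - alpha) * background

cv2.imwrite("composite.png", composite.astype(np.uint8))
```

The compositing step is trivial; generating a clean matte for every frame is the part that used to take hours, and it’s the part Runway automates.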

Compared to other popular rotoscoping methods in software like After Effects, Runway is competitive with even Adobe’s most recent AI tool, Rotobrush 2.0 (check out our video tutorial of Rotobrush 2.0 here!). But an argument could be made for keeping your workflow inside the Adobe environment, since both Green Screen and Rotobrush 2.0 have their own imperfections. In the 2021 video below, the VFX YouTube channel Corridor Crew tries out various methods of rotoscoping. Even there, Runway isn’t quite the one-click solution it seems to promise: with a moving subject that changes orientation, it takes several clicks to keep the subject selected. But even with that in mind, it was much faster than the alternatives, including Rotobrush 2.0. Rotobrush was a very close second, especially compared to the more manual routes, but the alternatives still paled in quality next to Runway. To quote one of the Corridor Crew members: “Usually with AI…you’re trading speed for detail, but with this, you’re getting both.”

Inpainting

The Inpainting feature from Runway is a content-aware fill tool: select an object in your footage and Runway removes it, filling in the space based on the surrounding context. Similar features exist in other software, and it won’t work every time, but it’s fantastic for getting rid of distracting objects and bystanders that wandered into your shot. Just paint over the object you want to remove, and the tool does the rest. This kind of fill works best on simple backgrounds with minimal movement, but it can be a lifesaver in many scenarios.
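
Under the hood, this family of tools works by reconstructing the masked region from its surroundings. As a rough analogy only (not Runway’s actual model, which is far more sophisticated and video-aware), here is what the classic single-frame version looks like using OpenCV’s built-in inpainting; the file name and mask position are placeholders.

```python
# Minimal single-frame sketch of the content-aware fill idea, not Runway's implementation.
import cv2
import numpy as np

frame = cv2.imread("frame_0001.png")  # placeholder frame with an unwanted object in it

# The "paint over the object" step: a white blob marking the region to remove.
mask = np.zeros(frame.shape[:2], dtype=np.uint8)
cv2.circle(mask, (640, 360), 80, 255, -1)  # placeholder position and size

# Fill the masked region from surrounding pixels (Telea's fast marching method).
cleaned = cv2.inpaint(frame, mask, 5, cv2.INPAINT_TELEA)

cv2.imwrite("frame_0001_cleaned.png", cleaned)
```

A video tool has to do this consistently across frames while handling camera motion, which is why this kind of fill still works best on the simple, static backgrounds mentioned above.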

Gimmick or Life-Changing?

With text-to-video still behind a waitlist and no real-world demos available yet, there are many open questions about how effective Runway’s upcoming feature will be. The technology has a lot of potential, but it still needs plenty of human input, so it won’t be stealing too many jobs. And how much time it saves will depend on how accurate the tool turns out to be. We’ll be waiting with bated breath to see the results when text-to-video goes public, but for now, keep a healthy dose of skepticism when watching Runway’s promotional videos.
