AI startup Runway launches app to bring video-to-video generative AI to users

Runway, the AI startup that helped develop the AI image generator Stable Diffusion, launched its first mobile app yesterday to give users access to its video-to-video generative AI model, Gen-1. The app is currently available only on iOS devices.

With the new app, users will be able to record videos from their phones and generate AI videos in minutes. They can also replace any existing video in their library using text prompts, images or style presets.

Users can choose from a list of presets like Runway’s “Cloudscape” or transform their videos to look like claymation, charcoal sketches, watercolor art, paper origami and more. They can also upload an image or type an idea in the text box.

The app then generates four previews for users to choose from. Once they pick the one they like best, it takes a few minutes to prepare the final product. In our own testing, generation took anywhere from about 60 seconds to two minutes.

Naturally, as with any AI generator, the results are not perfect and often look distorted or strange. The concept of AI video generators can sound silly, maybe even gimmicky, but as the tech develops and improves it could become genuinely valuable; content creators, for example, could use it to spice up their social media posts.

Regardless, we found Runway’s mobile app to be easy to use and overall fun to mess around with.

Below is an example we came up with using a clip of Michael Scott from “The Office”. The text prompt we entered was “real puppet”.

(Warning: The results are terrifying.)

Image credits: Runway (taken from The Office Clip)

We also tried “3D animation”, which worked fine.

Image credits: Runway (taken from The Office Clip)

Of course, there are some other caveats besides the flaws and distorted faces.

The free version is limited to 525 credits, and users can only upload videos that are five seconds long. Each second of video consumes 14 credits.
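As a rough sanity check on those numbers (our own back-of-envelope math using the figures above, not anything from Runway's documentation), the free tier works out to about seven full-length clips:

```python
# Free-tier arithmetic based on the figures quoted in the article:
# 525 total credits, clips capped at 5 seconds, 14 credits per second.

FREE_CREDITS = 525
MAX_CLIP_SECONDS = 5
CREDITS_PER_SECOND = 14

# A maximum-length clip costs 5 s x 14 credits/s = 70 credits.
credits_per_clip = MAX_CLIP_SECONDS * CREDITS_PER_SECOND

# So the free allowance covers 525 // 70 = 7 full-length clips.
full_length_clips = FREE_CREDITS // credits_per_clip

print(credits_per_clip, full_length_clips)  # 70 7
```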

In the future, Runway plans to add support for longer videos, co-founder and CEO Cristóbal Valenzuela told TechCrunch. The app will continue to improve and launch new features, he added.

“We are focused on improving efficiency, quality and control. In the coming weeks and months, you’ll see all kinds of updates, from longer outputs to higher-quality videos,” Valenzuela said.

Also, note that the app doesn’t generate nudity or copyright-protected work, so you can’t create videos that mimic the style of popular IPs.

Runway’s new mobile app has two premium plans: Standard ($143.99/year) and Pro ($344.99/year). The Standard plan gives you 625 credits/month and other premium features like 1080p video, unlimited projects and more. The Pro plan offers 2,250 credits/month and all of Runway’s 30+ AI tools.

Image credits: Runway

A month after Runway introduced Gen-1 in February, the company released its Gen-2 model. Unlike the text-to-image models Stable Diffusion and DALL-E, Gen-2 is a text-to-video generator, so users can generate videos from scratch.

Runway is slowly rolling out access to the closed beta for Gen-2, Valenzuela told us.

The app currently supports the Gen-1 model, but Gen-2 will soon be available along with Runway’s other AI tools, such as its image-to-image generator.

Meta and Google have both launched text-to-video generators, called Make-A-Video and Imagen Video, respectively.

Runway has developed various AI-powered video-editing software since its launch in 2018. The company has a variety of different tools in its web-based video editor, such as frame interpolation, background removal, blur effects, a feature that cleans up or removes audio and motion tracking, among many others.

The tools have helped influencers and movie/TV studios reduce the time spent editing and creating videos.

For example, the visual effects team behind “Everything Everywhere All at Once” used Runway’s tech to help create the scene in the film where Evelyn (Michelle Yeoh) and Joy (Stephanie Hsu) are in a universe where they have been turned into moving rocks.

Also, the graphics team behind CBS’ “The Late Show with Stephen Colbert” used Runway to cut editing hours down to just five minutes, according to art director Andrew Bunetta.

Runway also operates Runway Studios, its entertainment and production division.
