Ahead of NAB 2019, Adobe has announced product updates and new innovations for the video and audio tools in Adobe Creative Cloud. These include some much-anticipated features, such as Content-Aware Fill for video, along with lower-profile yet still significant announcements, such as external GPU and Twitch trigger support.
Content-Aware Fill for Video
Users can now use Content-Aware Fill to remove unwanted objects from footage in After Effects. Signs, boom mics, logos, and even people can be removed without flicker or complex roto.
The process uses Adobe Sensei, the company’s artificial intelligence (AI) and machine learning technology, to remove and replace unwanted objects. While it is not always possible to know what was being obscured, the program plausibly fills in the holes created as objects are removed. Where it can, the program is informed by the frames surrounding the current frame; where it cannot, it uses the same approach as the static Content-Aware Fill, but significantly, it does so without flicker. It is this temporal cohesion that makes it so powerful.
Formerly known as Project Cloak, the After Effects Content-Aware Fill requires only a rough roto of the items you want to remove: the user sets up a simple garbage mask, not a detailed roto of the item to be removed. While the processing is not real time, the computation can be done in the background while you continue to work in the foreground. The process also adjusts the lighting to match the changing tones of the filled area.
It is also possible to paint example frames to direct the engine, guiding the program towards a particular way of tackling the fill.
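To make the "temporal cohesion" idea concrete, here is a deliberately naive sketch of temporal hole-filling for a locked-off camera: masked pixels borrow their value from the nearest frame in time where that pixel is unoccluded. This is purely illustrative; Adobe's actual Content-Aware Fill is driven by Sensei's ML models, handles camera motion and lighting changes, and is far more sophisticated than this.

```typescript
// Conceptual sketch only: naive temporal fill for a static camera.
// Frames are grayscale pixel arrays (row-major); masks mark pixels to remove.
type Frame = number[];
type Mask = boolean[]; // true where the unwanted object sits

function temporalFill(frames: Frame[], masks: Mask[]): Frame[] {
  return frames.map((frame, t) => {
    const out = frame.slice();
    for (let p = 0; p < frame.length; p++) {
      if (!masks[t][p]) continue; // pixel not masked: keep as-is
      // Search outward in time for a frame where this pixel is unoccluded.
      for (let d = 1; d < frames.length; d++) {
        const before = t - d;
        const after = t + d;
        if (before >= 0 && !masks[before][p]) {
          out[p] = frames[before][p];
          break;
        }
        if (after < frames.length && !masks[after][p]) {
          out[p] = frames[after][p];
          break;
        }
      }
      // If the pixel is occluded in every frame, a real tool would fall back
      // to a spatial (single-image) content-aware fill; this sketch leaves it.
    }
    return out;
  });
}
```

Because each filled pixel comes from real neighboring frames rather than being re-synthesized per frame, the result is stable over time, which is exactly why the temporal approach avoids the flicker a frame-by-frame fill would produce.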
Since this technology was first shown as a tech demo in 2017 (see our coverage here), there has been huge demand and expectation on Adobe to get the video version of Content-Aware Fill into the hands of artists.
Twitch Extension Triggers
A smaller announcement was also made this week, which may prove to be part of a very significant shift in media and streaming content.
Adobe and Twitch have released the Character Trigger by Adobe extension, which enables Twitch players and viewers to extend their interactions beyond chat. With the new extension, viewers can now trigger animated activity instantly on a streamer’s avatar in a Twitch broadcast, from on-the-fly costume changes to impromptu dance moves, signature gestures, poses, and more.
Why is this significant? It means Adobe is allowing viewers to trigger animation in a live stream. This has huge ramifications: it turns a one-sided video stream into an interactive session. As of today, the extension is linked to Adobe’s real-time animation tool Character Animator (used on TV shows such as “Our Cartoon President”). The new Character Trigger by Adobe extension is launching as part of the newest release of Character Animator. Find the extension here.
This new extension, and this model of interaction, unlocks a new way for players and online presenters to monetize their channels. For example, viewers of a Twitch stream can use Bits to trigger these actions in real time, from costume changes to character movements. A viewer ‘donation’ thus causes additional animation, giving viewers not only interactive participation but also a form of instant gratification. Instead of just a flow of emote icons such as hearts and thumbs-ups scrolling up the side of a live stream, or comments off to one side of the main window, viewers can make animation happen almost instantly in the Twitch session.
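The Bits-to-animation model described above boils down to a simple mapping: a paid Bits transaction carries a product SKU, and the streamer’s setup maps that SKU to a named Character Animator trigger. The sketch below illustrates that glue logic only; the SKU names, trigger names, and `BitsTransaction` shape here are invented for illustration and are not Adobe’s or Twitch’s actual API.

```typescript
// Illustrative sketch: mapping a Twitch Bits purchase to an animation trigger.
// All names and shapes here are hypothetical, not the real extension's API.
interface BitsTransaction {
  sku: string;  // the Bits product the viewer purchased
  bits: number; // how many Bits it cost
  user: string; // display name of the purchasing viewer
}

interface TriggerEvent {
  trigger: string; // the Character Animator trigger to fire
  user: string;    // who caused it (e.g. for an on-screen credit)
}

// Streamer-configured mapping from paid SKUs to animation triggers.
const skuToTrigger: Record<string, string> = {
  dance_100: "Dance",
  wave_50: "Wave",
  costume_200: "CostumeChange",
};

// Convert an incoming transaction into a trigger event, ignoring unknown SKUs.
function toTriggerEvent(tx: BitsTransaction): TriggerEvent | null {
  const trigger = skuToTrigger[tx.sku];
  return trigger ? { trigger, user: tx.user } : null;
}
```

The design point worth noting is the indirection: because the stream reacts to named triggers rather than raw chat messages, the streamer stays in control of exactly which animations viewers can buy, and at what price.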
Is this not what every Virtual YouTuber will want?
Such a system gives internet celebrities both a means of immediate revenue and a more interactive experience for fans, vastly increasing engagement.
Adobe’s Character Animator is a great delivery tool for this new form of interaction. It is easy to set up, it is designed to run interactively in real time, and the program already works on a system of triggers.
Last October, the Character Animator team asked Twitch streamers to join a private beta to test the new extension, which lets streamers interact with their viewers on a whole new level using live animation. Today, the new trigger is available to Twitch Affiliates and Partners, bringing more interactivity to the Twitch experience. Don’t be surprised if this new way to interact with viewers beyond just chat takes off and becomes one of the next huge trends in content, monetization and internet sensations. This, or a version of it, may soon spread to countless other online streaming experiences.
Adobe Extends External GPU Use
At fxguide we have been test driving the eGPUs from Blackmagic Design. In November the company announced the second version, the Blackmagic eGPU Pro. The new external graphics processor features the AMD Radeon RX Vega 56 GPU. While looking identical to the first model, the new version is much more powerful, yet still dead quiet in use, thanks to great industrial design. As with the first version, the eGPU Pro was designed in collaboration with Apple. It delivers nearly twice the performance of the original Blackmagic eGPU and up to 22x faster graphics performance than the built-in graphics on a 13-inch MacBook Pro.
Until recently the real benefit of the unit was most apparent when running the company’s DaVinci Resolve. With the eGPU Pro, customers running DaVinci Resolve on a MacBook Pro get enormous speed improvements on GPU-intensive operations, such as noise reduction, some over 20x faster.
While it is also a good option for VR projects and headsets, and for driving a 5K display from its DisplayPort, as of this new release Adobe’s products will take much more advantage of external GPUs. Adobe is providing dual-GPU optimizations and improved hardware acceleration for the HEVC and H.264 formats in Premiere Pro, while After Effects gains GPU-accelerated effects such as Change Color and Roughen Edges.
While there is no doubt the eGPU Pro already works extremely well with DaVinci Resolve, being able to accelerate both Premiere and Resolve will significantly change the workflow for editors who do round trips from Premiere to Resolve, yet want the flexibility of still using a laptop as their primary machine.