Real-Time Live Preview: SIGGRAPH 2021

At SIGGRAPH 2021, on August 10th from 4:30pm to 6pm (PDT), teams gather for this year’s Real-Time Live.

Real-Time Live! is an attendee-favorite, one-night-only event that spotlights the most innovative real-time projects of the past year. Virtual attendees can watch some of the world’s most original, jury-reviewed, interactive projects of 2021. This year’s live event showcases an incredible sample of the latest in real-time, and if you watch live, you can also vote for this year’s Audience Choice winner.

Chris Evans, Real-Time Live Chair, during testing, evaluation, and rehearsals with Pinscreen’s Hao Li (bottom left)

Chris Evans: Chair of Real-Time Live

There are seven teams competing this year, selected by a jury. We spoke to Real-Time Live Chair Chris Evans of Epic Games.

fxguide: What has been the level of submission this year? Are we set for a good event?
Chris Evans: We had some amazing submissions this year! We surpassed last year in terms of the number of items submitted, and there is some really cool stuff!

fxguide: What are the implications of COVID-19 and the need to stream video for Real-Time Live?
Chris Evans: COVID and moving the conference virtual have had an interesting effect on Real-Time Live. It allows anyone from anywhere to partake, and also allows people to perform from their own mocap stages and VR volumes. Previously, if you wanted to present a piece where something was driven by an actor’s performance, you had to bring your own motion capture hardware and team to the conference and set it up. Moving to virtual has had a lot of challenges, but it has also been a real democratizing shift.

fxguide: Why do you like Real-Time Live so much yourself?
Chris Evans: I like Real-Time Live because it’s the most prestigious award there is for real-time graphics and interactive techniques, and it’s presented in a live show. You get to see the actual works in an interactive manner and not videos. It also has an element of suspense, people run things on a wide array of hardware and software and it can be a real ‘nail biter’ to have everything go off without a hitch.

2021: Real-Time Live

 

We spoke to two of the teams in the lead-up to the event:

NVIDIA’s I am AI: AI-driven Digital Avatar Made Easy

An NVIDIA team will demonstrate how its deep learning-based system can generate, in real-time, lip-synced talking digital avatars from one picture.

The Team: Ming-Yu Liu, Koki Nagano, Yeongho Seol, Rafael Valle, Jaewoo Seo, Ting-Chun Wang, Arun Mallya, Sameh Khamis, Wei Ping, Rohan Badlani, Kevin J. Shih, Bryan Catanzaro, Simon Yuen and Jan Kautz.

In this entry from NVIDIA, the team will introduce a deep learning-based system that allows anyone to synthesize animated digital avatars from audio or text input. They will demonstrate how their system generates, in real-time, lip-synced talking digital avatars from a single image. Their Real-Time Live presentation aims to show how this technology can enable an enhanced video conferencing experience and accessible avatar creation.
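To make that flow concrete, here is a rough, purely illustrative sketch of a pipeline of that shape: text or audio in, a lip-synced talking avatar rendered from a single photo out. The stage functions below are hypothetical stand-ins for the underlying models, not NVIDIA’s actual SDKs or code.

```python
# Illustrative sketch only: each stage is a placeholder for a real neural model.
import numpy as np

def synthesize_speech(text: str) -> np.ndarray:
    """Stand-in for a neural TTS model: returns a mono waveform at 16 kHz."""
    return np.zeros(16000, dtype=np.float32)              # placeholder audio

def audio_to_face_motion(audio: np.ndarray) -> np.ndarray:
    """Stand-in for an audio-driven animation model: per-frame face/lip codes."""
    n_frames = max(1, len(audio) // 640)                   # ~25 fps at 16 kHz
    return np.zeros((n_frames, 64), dtype=np.float32)      # placeholder motion codes

def reenact_single_image(photo: np.ndarray, motion: np.ndarray) -> np.ndarray:
    """Stand-in for a one-shot talking-head generator driving the source photo."""
    return np.repeat(photo[None], len(motion), axis=0)     # placeholder frames

def animate_avatar(photo: np.ndarray, text: str = None, audio: np.ndarray = None):
    """End-to-end flow: speech (if only text is given) -> face motion -> frames."""
    if audio is None:
        audio = synthesize_speech(text)
    return reenact_single_image(photo, audio_to_face_motion(audio))

# e.g. frames = animate_avatar(portrait, text="Hello SIGGRAPH")
```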

Preview of 2021 Real-Time Live

Ming-Yu Liu is a distinguished research scientist at NVIDIA and a key part of the team behind the award-winning GauGAN. The GauGAN program, named after post-impressionist painter Paul Gauguin, creates photorealistic images from segmentation maps, which are labeled sketches that depict the layout of a scene. The program works via a deep learning model developed by NVIDIA Research that turns rough doodles into highly realistic scenes using generative adversarial networks (GANs). GauGAN won SIGGRAPH 2019 Real-Time Live for Taesung Park (then a Ph.D. student at UC Berkeley) and the NVIDIA team. Ming-Yu Liu was the second author on the original CVPR 2019 paper.
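As a rough illustration of the idea behind GauGAN, the sketch below shows the usual conditioning step: an integer segmentation map is one-hot encoded and handed to a pretrained GAN generator. The `SpadeGenerator` class and weights file in the comments are hypothetical placeholders, not NVIDIA’s released code.

```python
import torch
import torch.nn.functional as F

NUM_CLASSES = 35  # illustrative label set: sky, water, rock, tree, ...

def segmentation_to_onehot(label_map: torch.Tensor) -> torch.Tensor:
    """Turn an HxW map of integer class labels into the 1xCxHxW one-hot tensor
    that a SPADE-style generator is typically conditioned on."""
    onehot = F.one_hot(label_map.long(), NUM_CLASSES)     # H x W x C
    return onehot.permute(2, 0, 1).unsqueeze(0).float()   # 1 x C x H x W

# Hypothetical pretrained generator mapping (noise, labels) -> RGB image:
# generator = SpadeGenerator.load("gaugan_weights.pt")
# doodle = torch.randint(0, NUM_CLASSES, (256, 256))      # a user's rough label doodle
# with torch.no_grad():
#     image = generator(torch.randn(1, 256), segmentation_to_onehot(doodle))
```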

Winners of the 2019 Real-Time Live (Ming-Yu Liu is far right)
Sneak peek of NVIDIA’s 2021 submission

“The system can be used for web conferencing and to change someone’s appearance,” explains Liu, “but it can also be used with stylized characters.” The details, however, are under wraps until NVIDIA goes live on stage ‘virtually’ at this year’s Real-Time Live event.

This year’s submission is a combination of technologies: an end-to-end demonstration of a series of key research projects combined into a single avatar-creation pipeline. While not an Omniverse project, it does use several tools that are part of NVIDIA’s Omniverse. What is exciting is that NVIDIA is not trying to show a single new R&D project, nor a prototype of a new product. The Real-Time Live presentation highlights a range of individual, practical NVIDIA tools that users and developers alike can incorporate into their own pipelines. Many of the stages of the demo have separate SDKs and are designed to ‘work well with others’ in a production environment. The NVIDIA team also clearly had some fun building this submission, which promises more than one surprise or gag; after all, Real-Time Live has always had a great reputation for geeky humor, bad puns, and fun, innovative technology.

Koki Nagano is a senior research scientist and a prior participant at Real-Time Live. He is an author on both the 2021 NVIDIA submission and the 2021 Pinscreen submission below, having worked happily at both companies over the last year, although all of the Pinscreen work was done at Pinscreen.

In a related SIGGRAPH event:

On August 11th, NVIDIA is presenting Connecting in the Metaverse: The Making of the GTC Keynote, a behind-the-scenes documentary showing how a small team of artists, engineers, and researchers blurred the line between real and digital in NVIDIA’s GTC21 keynote.

Wednesday, August 11 | 11 a.m. PDT

Pinscreen’s Normalized Avatar Digitization for Communication in VR

The Pinscreen end-to-end solution takes a different approach from the NVIDIA demo and ends up with an incredibly impressive VR experience using avatars created with a GAN-based framework. The input image can be of a person smiling, or taken in extremely challenging lighting conditions, and their method holds the promise of reliably producing a high-quality textured model of the person’s face with a neutral expression and skin textures rendered under more even, diffuse lighting.

VR framegrab from the 2021 Pinscreen submission

The Team: McLean Goldwhite, Zejian Wang, Huiwen Luo, Han-Wei Kung, Koki Nagano, Liwen Hu, Lingyu Wei, and Hao Li.

Hao Li is CEO and co-founder of Pinscreen, and was also part of last year’s winning SIGGRAPH 2020 Real-Time Live entry, Volumetric Human Teleportation, or ‘Monoport’.

From SIGGRAPH 2020

This year the Pinscreen team is going to start by “demonstrating how avatars can digitize somebody from a photo,” says Li. “While that is something that we have demonstrated in the past, this time we have a new advanced algorithm, one that was only just published at CVPR a couple of weeks ago… and we built from this to an end-to-end solution that allows people to interact directly in an Oculus Quest.”

The team aims to show taking a photo live with a webcam, uploading the photo, digitizing an avatar, and experiencing it in VR within minutes. The digital humans created have plausibly textured faces, where the likeness of the input subject is preserved and visibly recognizable in VR. But most critically, “when you’re in VR and you’re talking, your face (movement) is derived just from your audio,” he proudly adds. In fact, the demo goes further: a participant’s whole-body movements are driven entirely by the VR headset and the two VR controllers.
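As a rough, hedged sketch of the two driving signals Li describes, the code below fakes the shape of the problem: microphone audio drives facial blendshape weights, and only three tracked transforms (the headset plus two controllers) drive the full body. All names here are hypothetical placeholders, not Pinscreen’s actual implementation.

```python
# Illustrative sketch only: crude stand-ins for audio-driven face animation
# and full-body inference from 3-point VR tracking.
import numpy as np

def audio_to_expression(audio_chunk: np.ndarray) -> np.ndarray:
    """Stand-in for an audio-driven face model: blendshape weights per frame."""
    energy = float(np.sqrt(np.mean(audio_chunk ** 2)))     # crude loudness proxy
    weights = np.zeros(52, dtype=np.float32)               # e.g. an ARKit-style rig
    weights[0] = min(1.0, energy * 10.0)                   # open the jaw with loudness
    return weights

def three_point_to_body_pose(head: np.ndarray, left_hand: np.ndarray,
                             right_hand: np.ndarray) -> np.ndarray:
    """Stand-in for full-body inference from headset + two controllers
    (often solved with inverse kinematics or a learned pose prior)."""
    return np.concatenate([head, left_hand, right_hand])   # placeholder "pose"

# Per-frame update inside the VR session (conceptual):
# expression = audio_to_expression(mic.latest_chunk())
# body_pose  = three_point_to_body_pose(hmd.pose(), left.pose(), right.pose())
# avatar.apply(expression, body_pose)
```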

Real-Time Live at SIGGRAPH 2021 airs August 10th, 4:30pm to 6pm (PDT).