NVIDIA is playing a critical role in our industry, as the company has been leading the field in real time ray tracing combined with machine learning support. Until very recently, NVIDIA’s CUDA-compatible cards were required to run apps like Redshift, Octane (v7.5 is NVIDIA RTX only) and Thea Render. Some of these are now porting back to the Mac following the announcement of the new Mac Pro.

Perhaps the best example of the power of the RTX real time ray tracing pipeline is what NVIDIA has been able to do for its own product launches. These pieces highlight what RTX adds, as a real time ray tracing pipeline, to both production and gaming.

Real time NVIDIA ray tracing

When RTX was first introduced at SIGGRAPH 2018 in Vancouver, in addition to the impressive Epic UE4 Star Wars demo and the Mercedes car demos we have covered here at fxguide, NVIDIA itself produced its first Project SOL video. It was a ray traced RTX cinematic that showed detailed reflections bouncing off an assembling reflective exosuit. The project and the team that made it were put together by NVIDIA’s Rev Lebaredian, VP Simulation Technology, and Gavriil Klimov, Design Director on the project. There have now been three SOL demos, and they remain among the best demonstrations of the power of NVIDIA’s rendering on the RTX cards.

“The SOL project originally was where we started when we did our first cinematic. Jen-Hsun Huang (NVIDIA CEO) asked for a spectacular RTX launch. Originally we never intended it to be in three parts. SOL, part one, was supposed to be just a cinematic focused mostly on him getting dressed up in the exosuit, but our CEO really likes funny things and has a great sense of humor. And so under his guidance, we modified the ending to finish on a funny note,” explains Klimov. At the end of SOL part one there is a gag where we suddenly find the robots dancing. In the second SOL, our hero gets stuck in the ground.

For the third SOL, our hero thinks he’s pretty cool, until he meets his match.

UV work as part of the Maya pipeline that feeds into the real time pipeline

The SOL projects were produced by a small team of just eight or nine artists, who worked in a very short production window, just two months before the RTX launch. “From my previous experience working on similar projects in an offline pipeline such as at Blur, this would have easily been a three or four month project, with perhaps twice the amount of people,” commented Creative Director Kevin Margo. “It was pretty miraculous to see how quickly the team was able to execute this.”

The SOL demos were predominantly motion captured. The team embraced motion capture for its speed compared to keyframe animation, but “the biggest thing that impressed me with this real time pipeline was that we wound up essentially lighting 65 shots in just a couple days right before the delivery at GTC 2019, which is pretty crazy,” adds Lebaredian, “and that’s the greatest contrast or example of what an offline versus real time pipeline can do for production process.” For comparison, it would not be unreasonable in an older offline pipeline to schedule four to six artists over four to six weeks just to work on the lighting, rendering and compositing to produce something at a comparable quality level to what the NVIDIA team executed in just a couple of days. Instead, NVIDIA’s Lighting Lead Jacob Norris could, “cycle through, shot after shot and then in minutes establish the foundational rim lights and fill lights and very quickly the depth of field – all in real time. That was pretty inspiring,” concludes Lebaredian.

Close ups of the two figures

NVIDIA is also significant for what it is doing in the USD space. At GDC in March, they launched the NVIDIA Omniverse, which is a professional tool to allow multiple users in a range of applications to easily share USD assets across a range of industry standard applications and interact with those assets in a real-time ray-traced viewport. Look for our special Omniverse story soon on fxguide, where we speak to the developers about that landmark innovation.
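Part of what makes USD suit this kind of multi-application collaboration is that its .usda form is human-readable text that composes non-destructively in layers. As a rough, hypothetical sketch (not an actual Omniverse asset), a shared prop on disk might begin as something like:

```usda
#usda 1.0
(
    defaultPrim = "ExoSuit"
)

def Xform "ExoSuit" (
    kind = "component"
)
{
    def Mesh "Chestplate"
    {
        # Geometry authored in one DCC app; shading, layout and
        # lighting can be layered over this prim from other apps
        # without touching the original file.
    }
}
```

Because each application contributes its own layer over the same asset, a modeler, a look-dev artist and a layout artist can work concurrently, which is the workflow Omniverse exposes through its shared, real-time ray-traced viewport.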

NVIDIA Cinematics Team:
Alessandro Baldasseroni – Lead Character Artist
Fred Hooper – Lead Visual FX Artist
Gregor Kopka – Lead 3D Artist
David Lesperance – Lead Environment and Lighting Artist
Jacob Norris – Lead Environment and Lighting Artist
Brian Robison – Lead Layout and Animation Artist
Ilya Shelementsev – 3D Artist
Gavriil Klimov – Design Director
Kevin Margo – Creative Director
Temis Nunez – Creative Director
David Wright – Executive Creative Director
Rev Lebaredian – VP Simulation Technology