Epic’s Unreal Real-Time Ray Tracing: GDC Day 2 Part 2

At GDC Epic showed its newest rendering capabilities with a stunning real-time ray tracing demo using Star Wars characters.


This remarkable demo is a short scene, fully ray traced in real time, of Captain Phasma (played in the Star Wars films by Gwendoline Christie) travelling in a lift with two stormtroopers. While the sequence in no way connects to any part of the Star Wars movie world, the amusing short shows off remarkable technical advances.


Phasma’s distinctive armor of salvaged chromium is the perfect vehicle for seeing the ray-traced reflections and lighting detail of the new real-time ray tracer. The short was directed by Gavin Moran, Creative Cinematic Director at Epic Games, and was lit with area lights and bounce cards to produce cinematic lighting.

While hard to see in this image, the camera view of the action can be controlled by an iPad Pro as part of an upcoming tool in UE4.

The live demonstration of real-time ray tracing in UE4 runs on Microsoft’s new DXR framework and NVIDIA RTX technology, with Volta GPUs in an NVIDIA DGX Station.

This demo was entitled Reflections, and was shown alongside a lighting tech demo in which area lights were adjusted and moved around a scene, interactively showing the ray-traced effects, including shadows and photorealistic reflections, rendered in real time. The demo is a collaboration between Epic’s ray tracing experts, NVIDIA GPU engineers and the creative team at ILMxLAB.

Patrick Kelly and Juan Cañada

The system runs on an NVIDIA DGX Station, which is built on four NVIDIA Volta GPUs. These powerful processors are combined with GPU-optimized software to deliver these results.

Epic Games has also been adding to its in-house talent in the area of ray tracing, hiring two particularly important new Senior Rendering Engineers, Juan Cañada and Patrick Kelly. Cañada was formerly head of visualisation and the Maxwell rendering team at Next Limit. Kelly joins Epic from the Disney Hyperion team, and before that worked on Weta’s Manuka renderer.

“They just dropped us straight in; we started and they introduced us to this new tech that was coming. It was incredibly exciting – it has been compelling from the beginning,” commented Kelly. “Just imagine,” jokes Cañada, “someone tells you on your first day in the office that you are going to work with Lucasfilm, doing real-time ray tracing – it can’t get any better.”

The two joined the growing team of non-game senior researchers and engineers who have recently been joining the seasoned and extremely experienced games engineers and specialists already at Epic Games, such as Marcus Wassmer, Brian Karis, and Arne Schober. “We have been waiting for real-time ray tracing for a decade, so it is great that it is here now,” comments Kelly, who was particularly impressed with the work NVIDIA has done on its GameWorks ray tracing noise reduction: “It really helped the demo; we worked a lot with their filter.” “Yes, thanks to their filter we were able to keep the ray budget to a reasonable size… that filter and some other innovative work our team did,” adds Cañada.

Two other key researchers and developers on the project were Uriel Doyon and Guillaume Abadie. Doyon helped significantly with the multi-GPU and optimization effort, which was critical for the required high-end performance. Abadie did all of the new DOF work, which allows for different bokeh and more realistic falloff, as seen below in the lights behind the stormtrooper.

The great new DOF
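A rough sense of what drives this kind of cinematic DOF can be sketched with the standard thin-lens circle-of-confusion formula: how large a blur disc an out-of-focus point spreads into. This is a generic textbook illustration, not Epic’s implementation, and the lens numbers are made up:

```python
def circle_of_confusion(focal_len_mm, f_stop, focus_dist_mm, subject_dist_mm):
    """Diameter (mm, on the sensor) of the blur circle for a point at
    subject_dist_mm when the lens is focused at focus_dist_mm.
    Standard thin-lens approximation."""
    aperture_mm = focal_len_mm / f_stop  # aperture diameter
    return (aperture_mm
            * abs(subject_dist_mm - focus_dist_mm) / subject_dist_mm
            * focal_len_mm / (focus_dist_mm - focal_len_mm))

# A light 5 m behind a subject focused at 2 m, 50 mm f/1.8 lens:
# it spreads into a clearly visible bokeh disc.
print(circle_of_confusion(50, 1.8, 2000, 7000))  # large blur circle
print(circle_of_confusion(50, 1.8, 2000, 2000))  # 0.0 -> in focus
```

The bokeh shape itself comes from the aperture: this formula only gives the disc’s size, while the renderer decides what shape (circular, bladed, etc.) to spread the energy into.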

The real-time UE4 demo does not trace the same number of rays per pixel as a feature film renderer, but a feature film renderer takes far longer than 30 milliseconds a frame to produce 2K frames. “We are converging an order of magnitude faster than what was possible before,” adds Kelly, referring to the new algorithmic improvements in low-noise ray tracing. The demonstration went by quickly on stage, but the material stands up to close inspection, and the tech demo showing the area lights and reflective properties being adjusted on the fly is perhaps even more illuminating than the real-time rendered cinematic. “It does go by quickly, but if you play it in real time and zoom into the cinematics, you get really amazing quality; it really does hold up,” Cañada comments proudly.
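The ray-budget trade-off Kelly and Cañada describe comes down to Monte Carlo convergence: per-pixel noise falls only as 1/√N with N rays, so a good denoising filter is worth far more than simply firing extra rays. A toy sketch of that idea (a crude box blur standing in for a real denoiser, not Epic’s or NVIDIA’s actual filter):

```python
import random
import statistics

def noisy_pixel(true_value, n_rays, rng):
    """Average n_rays noisy samples of a pixel -- variance shrinks as 1/n."""
    return sum(true_value + rng.gauss(0, 0.5) for _ in range(n_rays)) / n_rays

def render_row(true_value, n_rays, width, rng):
    return [noisy_pixel(true_value, n_rays, rng) for _ in range(width)]

def box_filter(row):
    """Crude stand-in for a denoiser: a 3-tap box blur trades a little
    detail for a large drop in variance."""
    return [(row[max(i - 1, 0)] + row[i] + row[min(i + 1, len(row) - 1)]) / 3
            for i in range(len(row))]

rng = random.Random(0)
few = render_row(1.0, 4, 1000, rng)    # small ray budget
many = render_row(1.0, 64, 1000, rng)  # 16x the rays

print(statistics.pstdev(few))              # noisy
print(statistics.pstdev(many))             # much less noise, 16x the cost
print(statistics.pstdev(box_filter(few)))  # filtered few-ray image, cheap
```

The filtered four-ray image ends up far less noisy than the unfiltered one at a fraction of the cost of the 64-ray render, which is exactly why keeping “the ray budget to a reasonable size” depends on the denoiser.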

An iPad running ARKit was used as a virtual camera to draw focus to fine details in up-close views. Epic built the computer-generated (CG) scene using assets from Lucasfilm’s Star Wars: The Last Jedi, featuring Captain Phasma and two stormtroopers who run into her in an elevator on a First Order ship.

Next-generation ray tracing rendering features shown in today’s demo included:

  • Textured area lights
  • Ray-traced area light shadows
  • Ray-traced reflections
  • Ray-traced ambient occlusion
  • New cinematic depth of field (DOF)
  • NVIDIA GameWorks ray tracing denoising
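Of the features listed above, ray-traced ambient occlusion is the simplest to sketch: from each shaded point, fire rays across the hemisphere around the surface normal and measure what fraction escape without hitting geometry. A minimal textbook illustration against a single sphere occluder (not UE4’s implementation; the scene is invented):

```python
import math
import random

def sphere_hit(origin, direction, center, radius):
    """Standard quadratic ray-sphere intersection test (unit direction)."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    return disc >= 0 and (-b - math.sqrt(max(disc, 0))) / 2 > 1e-4

def ambient_occlusion(point, normal, occluders, n_rays, rng):
    """Fraction of hemisphere rays that escape: 1.0 = fully open sky."""
    unoccluded = 0
    for _ in range(n_rays):
        # Uniform random direction, flipped into the hemisphere around normal.
        d = [rng.gauss(0, 1) for _ in range(3)]
        norm = math.sqrt(sum(x * x for x in d))
        d = [x / norm for x in d]
        if sum(a * b for a, b in zip(d, normal)) < 0:
            d = [-x for x in d]
        if not any(sphere_hit(point, d, c, r) for c, r in occluders):
            unoccluded += 1
    return unoccluded / n_rays

rng = random.Random(1)
# A point on a floor with a sphere hovering directly above it:
ao = ambient_occlusion((0, 0, 0), (0, 1, 0), [((0, 2, 0), 1.0)], 256, rng)
print(ao)  # < 1.0: the sphere blocks part of the sky
```

Rasterized AO approximations work from screen-space depth alone; tracing actual rays like this is what lets the demo account for occluders outside the visible frame.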

Epic is not shipping ray tracing in the short term, but this demo will make developers extremely keen for its eventual release. Not only are the images clean and relatively noise-free, but the demo showed off a highly impressive new depth of field that made the images look incredibly cinematic.


“Ray tracing is a rendering process typically only associated with high-end offline renderers and hours and hours of computer processing time,” said Epic Games Founder and CEO Tim Sweeney. “Film-quality ray tracing in real time is an Unreal Engine first. This is an exciting new development for the media and entertainment linear content worlds—and any markets that require photorealistic visualization.”

“At ILMxLAB, our mission is to create real-time rendered immersive experiences that let audiences step into our stories and connect with cinematic universes that look and feel as real as the ones on the movie screen. With the real-time ray-tracing technology that Epic and NVIDIA are pioneering, we are a pivotal step closer to that goal,” says Mohen Leo, ILMxLAB Director of Content and Platform Strategy.


Epic Games worked closely with NVIDIA to support the NVIDIA RTX technology available through the DXR API. Microsoft has just announced a ray tracing extension to DirectX 12, called DirectX Raytracing (DXR), here at the 2018 GDC. “Real-time ray tracing has been a dream of the graphics and visualization industry for years. It’s been thrilling to work with the talented teams at Epic and ILMxLAB on this stunning real-time ray tracing demonstration,” said Tony Tamasi, senior vice president of content and technology at NVIDIA. “With the use of NVIDIA RTX technology, Volta GPUs and the new DXR API from Microsoft, the teams have been able to develop something truly amazing, that shows that the era of real-time ray tracing is finally here.” The goal is not to completely replace rasterization… at least not yet. Ray tracing will mostly be used for effects that require supplementary datasets, such as reflections, ambient occlusion and refraction. For many, real-time ray tracing seemed a far-off dream; today it seems both feasible and highly effective.