At SIGGRAPH in July, the Epic team, made up primarily of those who had done the GDC/FMX presentation of Senua, presented a new, more complex piece which was performed, captured, edited and rendered live on stage in minutes, in front of over 2,000 attendees. This process would once have taken months.

Having now won the award for Best Real-Time Graphics and Interactivity at a SIGGRAPH Live event, Ninja Theory have launched Senua Studio, a new division dedicated to offering solutions for live capturing and rendering of realistic digital characters in virtual worlds (see below).


Commenting on the announcement, Tameem Antoniades, co-founder of Ninja Theory, said: “We have demonstrated working solutions today, and want to push forward the future of real time entertainment, be it live performances of digital characters for stage or broadcast, shooting fully-rendered CG scenes in real time for previs, or interacting with believable characters in VR.”

SIGGRAPH 2016

House of Moves and Ninja Theory

Attendees at SIGGRAPH witnessed a groundbreaking real-time cinematography demonstration during Real-Time Live!  “From Previs to Final in Five Minutes: A Breakthrough in Live Performance Capture” reunited Epic Games, Ninja Theory, Cubic Motion and 3Lateral for an emotionally layered performance of Hellblade: Senua’s Sacrifice. Using UE4, the teams were able to shoot, edit, and render an entire scene in just minutes with the engine’s Sequencer tool.

This project won SIGGRAPH’s coveted Best Real-Time Graphics and Interactivity award, the same honor Epic received for A Boy and His Kite last year. Watch the full development documentary and read more on the Unreal Engine blog.

Real-Time Live! in Detail

The SIGGRAPH presentation was a breakthrough in live performance capture. The demo consisted of multiple passes of live capture and rendering, which was key to showing how quickly Melina Juergens could act opposite a virtual version of herself.

In pass one, Juergens acts out her base performance. In pass two, she reacts to the base performance, so that we the audience see her talking to a ghostly version of herself. The third pass is an edited version of the action, in which a virtual camera can 'film' the performance.

On the first pass the camera itself had a dot or mark on it for Juergens' eye line; in effect, the camera is the other Senua. The director, Antoniades, saw the Senua character on his 'virtual' camera in real time, in costume, but of course with two Senuas talking to each other we the audience also want a third, wide view of both characters. Thus on the initial passes the director's virtual camera is an eyeline tool, and on later passes it becomes virtual cinematography on the blocked scene.

Much of the rig that Epic's Michael F Gay showed fed into the Unreal Engine with a similar pipeline approach to the one the team had shown previously, but at SIGGRAPH most of these stages had been improved or tweaked. The major difference was the head rig: at GDC there had been a single camera, with the solve resolved from a virtualised single camera interpreted from the stereo training data, whereas at SIGGRAPH there was a stereo input camera pair.

"The facial stuff is way better, we are getting gen-locked stereo frames at 30 fps," explains Epic's CTO Kim Libreri. Interestingly, as can be seen below, the two cameras were mounted one above the other and not side by side, something Libreri thought was much better as it avoided occlusions on either side of the face but still provided jaw depth information, lost on a single camera. "That was really smart. Most people's stereo rigs, including the one's I've used in the past, are side by side, but then you get occlusion and shadowing. This way, based on the shape of people's faces you get very little occlusion. So you get good stereo as both cameras are seeing the same data and you can still extract all the face depth information you need."

Note the top-bottom stereo approach to the cameras
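To see why the stacked arrangement still gives depth, note that with a rectified vertical stereo pair the disparity simply runs along the image's y axis instead of the x axis, and the standard triangulation formula applies unchanged. Below is a minimal sketch of that calculation; the focal length, baseline and pixel values are placeholders, not the actual rig's specifications.

```cpp
#include <cstdio>

// Depth from a rectified, vertically stacked stereo pair.
// Disparity is measured along the y axis (top image minus bottom image)
// instead of the usual x axis; the triangulation formula is otherwise identical.
double depthFromVerticalDisparity(double yTop, double yBottom,
                                  double focalLengthPx, double baselineMeters)
{
    const double disparity = yTop - yBottom;            // pixels
    if (disparity <= 0.0) return -1.0;                  // feature not resolved
    return focalLengthPx * baselineMeters / disparity;  // metres
}

int main()
{
    // Hypothetical numbers: 1200 px focal length, 6 cm vertical baseline,
    // a jaw feature seen 18 px apart between the two views.
    std::printf("depth = %.3f m\n",
                depthFromVerticalDisparity(518.0, 500.0, 1200.0, 0.06));
    return 0;
}
```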

What was also different in this demo was that the mobile Xsens body rig was not used. House of Moves in LA provided "Blade into IKinema this time, LiveAction, not the mobile motion capture rig we had at GDC, which worked great - but this is more stable and allows us to do more," explains Libreri. "We wanted a bigger volume, more accuracy and really stable - because of the closeups". Antoniades added that "it also allowed us to capture the camera." While the Xsens rig worked well at GDC and FMX, the camera move there was a semi pre-programmed fly-around with automatic offsets to keep Juergens' face in frame. At SIGGRAPH, Antoniades was holding a fully 3D live-tracked camera (vcam), so a new solution for the capture volume was needed.

Cubic Motion provided an updated face solver. The first solver had taught the Cubic team quite a lot, and now, with stereo data, the new one was more accurate. All of which means a truer capture of Juergens and a more expressive Senua for the audience. Tracking was also improved, so during this performance, unlike at GDC, there were no special makeup dots. Previously Juergens had worn special lipstick and had a small number of dots drawn on her face.

While it may appear that Senua II is in a mirror, it is actually a glass divide, or rather a shader giving the appearance of glass, and all the set we see is actual geometry in the engine, not a reflected version of Senua I's geometry. In other words, the engine has two rooms' worth of geometry with an illusion of glass in between.

House of Moves ran a calibration step before each major setup or rehearsal to make sure the Blade skeleton was correct and the IKinema retarget in UE4 was right. If you look closely, you can see that Juergens does not share the exact proportions of Senua, so there is a small but important re-targeting happening. This came into play two days before the SIGGRAPH demo, when Libreri reviewed the rehearsal and felt that Senua's teeth were just a fraction too pronounced. The team therefore dialed down the remapping ever so slightly to give a less toothy performance, especially in the upper lip.
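To give a concrete sense of what "dialing down the remapping" amounts to, here is a hypothetical sketch that attenuates selected solved facial channels before they are applied to the rig; the channel names and the 0.9 scale factor are invented for illustration and are not the production setup.

```cpp
#include <cstdio>
#include <map>
#include <string>

// Hypothetical post-solve tweak: attenuate selected facial channels slightly
// before the solved weights are pushed onto the character rig.
void attenuateChannel(std::map<std::string, float>& solvedWeights,
                      const std::string& channel, float scale)
{
    auto it = solvedWeights.find(channel);
    if (it != solvedWeights.end())
        it->second *= scale;   // e.g. 0.9 gives a slightly less pronounced shape
}

int main()
{
    // Placeholder channel names and weights for a single solved frame.
    std::map<std::string, float> frame = { {"upperLipRaise_L", 0.72f},
                                           {"upperLipRaise_R", 0.68f},
                                           {"jawOpen", 0.40f} };
    attenuateChannel(frame, "upperLipRaise_L", 0.9f);
    attenuateChannel(frame, "upperLipRaise_R", 0.9f);
    std::printf("upperLipRaise_L is now %.3f\n", frame["upperLipRaise_L"]);
    return 0;
}
```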

This level of control and subtlety for a live, real-time presentation is staggering. While fully animated projects may render for hours, the Real-Time Live! demo of multiple characters (multiple Senuas) was being virtually filmed in real time in front of an audience of thousands of experts. Far from being critical, the hall erupted in applause at the incredible presentation. While there was a good gap in time between FMX and SIGGRAPH, the actual rebuild of assets and capture was done in just eight weeks.

The first Senua is ghostly white (as she is in the mirror/glass)

In pass 3 of the demo, the Unreal Engine Sequencer was shown. With real-time cinematography every nuance of the digital character’s facial expressions, lighting, visual effects and sets are visible in real time to final render quality. Rather than capturing to film, everything is rendered as digital 3D data directly into Unreal Engine 4 using its powerful new Sequencer tool.

Following capture, the scene can be edited, played back in real time, exported to offline 3D applications and output to video at any resolution or published to virtual reality. The technique has the potential to profoundly impact the creation of movies, games and VR experiences in the near future.

The GDC/FMX demo did not use Sequencer as part of the live demo, although it was used afterwards for PR material. This version of the live demonstration was captured and edited in the Unreal Sequencer, with the audio, body and face all recorded alongside sound and picture adjustments. There is no longer any audio drift: a setup like this can all be recorded as if it were being shot with a professional 'real' camera.

Sequencer live

One example of the benefit of doing these real-world test demos was illustrated by Sequencer itself. As the team used the software in initial rehearsals, they noticed how many button clicks were required for each operation, so they re-wrote part of the code to streamline the process and make it faster and simpler to use.

Sequencer is not an editor of recorded clips; it is a live remixing viewer of real-time rendered assets. You can use it to mix emotional aspects of a performance as easily as a video editor cuts filmed clips, since everything is, of course, live and real time.

In the future, Epic plans to expand Sequencer even further to handle alternate video tracks, importing more clips (multiple video inputs), working with timecode, and more. "You'll be able to import HD video tracks, use them as textures on cards or whatever, (and) edit them as full frame video. We may even add a cache scheme," explained Kim Libreri. The cache idea helps address the issue of wipes or dissolves in Sequencer. Unlike an editor working with normal footage, dissolving between two shots in UE4 requires two full sets of geometry to be rendered at once and mixed on the fly. Also, as the main audio framework in UE4 is currently being reworked, more audio tools will naturally appear in Sequencer over the next few releases.
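The cost Libreri alludes to is easy to see in a sketch: a dissolve inside the engine means fully rendering both shots every frame and blending the results, rather than mixing two images that already exist on disk. The code below is purely conceptual; the Image type and renderShot() stand-in are not UE4 API.

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

// Conceptual sketch (not UE4 API): an in-engine dissolve has to render BOTH
// shots every frame and blend them per pixel, unlike a video editor that
// simply mixes two already-rendered images.
using Image = std::vector<float>;        // stand-in for a rendered frame buffer

Image renderShot(float shotColour)       // placeholder for a full scene render
{
    return Image(4, shotColour);         // a full set of geometry, lights, FX...
}

Image dissolve(const Image& a, const Image& b, float alpha)
{
    Image out(a.size());
    for (size_t i = 0; i < a.size(); ++i)
        out[i] = a[i] * (1.0f - alpha) + b[i] * alpha;   // mixed on the fly
    return out;
}

int main()
{
    const double t = 1.25, start = 1.0, length = 1.0;    // hypothetical dissolve timing
    const float alpha = std::clamp(static_cast<float>((t - start) / length), 0.0f, 1.0f);

    // Both renders happen within the same frame: this is the extra cost.
    Image frame = dissolve(renderShot(0.2f), renderShot(0.8f), alpha);
    std::printf("blended sample %.2f at alpha %.2f\n", frame[0], alpha);
    return 0;
}
```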

3Lateral's facial solve, rendered in Unreal, showed a greater range of emotional performance at SIGGRAPH, as you can see in the full clip below. By the end of the performance, Juergens, playing Senua, is screaming and on the floor. This showed a much greater range of motion and, most importantly, range of emotion than the previous demo.

3Lateral also improved Senua's cracking makeup. The first makeup looked more like an oil paint that moved with her skin; the new makeup cracks and has a more realistic dried look.

 

Presenting the live demonstration at SIGGRAPH was Michael Gay, director of cinematic production at Epic Games, who commented that the team was helped by NVIDIA, which lent them new Titan cards for the demo.

As mentioned above, one new aspect was the move to a full capture volume. We spoke to Brian Rausch, CEO of House of Moves, who, in addition to providing the actual capture volume in the hall at SIGGRAPH, also hosted the rehearsals at the company's LA offices in the days leading up to the show.

House of Moves is a world leader in capture volumes. They are a full-service animation production studio with over 20 years' experience in entertainment, gaming and commercials. House of Moves' expertise spans many markets, with thousands of successfully completed animation projects including video games, TV commercials, feature films, broadcast television series, virtual reality, and online character animation/content, as we outlined in 2013 at fxguide.

Rausch explained that while they have their own pipeline, their role was to help expand the range of what could be captured, including the vcam. House of Moves often works with Xsens, but for this demo a different solution was required, so they decided to use their Vicon system. Using their 16-megapixel, high-resolution T160 cameras brought the noise down.

These feed the House of Moves Blade system, which can track 55 markers. Blade allows the actual frame rates of the face data and body data to be different, yet clearly still in sync and with equal (very small) delays. Just as in the previous pipeline, the data then fed into the excellent IKinema for the retargeting (although that is not the solver they normally use in house). "We are trying to show what is a core piece of virtual production, namely an actress delivering a core piece of dialogue and then being able to play off that," explained Rausch. "She is having a split personality moment. She is playing off herself, and in a way that the director needs to know he has the shot and performance he wants."
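As an illustration of the kind of bookkeeping this implies, the sketch below resamples a body stream captured at one rate onto the timestamps of face frames captured at another. The rates, data layout and linear interpolation are assumptions made for the example, not Blade's or IKinema's actual internals.

```cpp
#include <cstdio>
#include <vector>

// Illustration only: align a body stream captured at one rate with face frames
// captured at another by interpolating the body samples at each face timestamp.
struct Sample { double time; double value; };   // "value" stands in for a full pose

double sampleAt(const std::vector<Sample>& stream, double t)
{
    if (t <= stream.front().time) return stream.front().value;
    for (size_t i = 1; i < stream.size(); ++i)
    {
        if (stream[i].time >= t)
        {
            const Sample& a = stream[i - 1];
            const Sample& b = stream[i];
            const double u = (t - a.time) / (b.time - a.time);
            return a.value + u * (b.value - a.value);    // linear blend between samples
        }
    }
    return stream.back().value;
}

int main()
{
    // Hypothetical body samples at ~120 fps, evaluated at a 30 fps face frame time.
    std::vector<Sample> body = { {0.0000, 0.0}, {0.0083, 1.0}, {0.0167, 2.0},
                                 {0.0250, 3.0}, {0.0333, 4.0} };
    std::printf("body pose at face frame: %.2f\n", sampleAt(body, 1.0 / 30.0));
    return 0;
}
```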

 

Above is the full presentation from the SIGGRAPH Real-Time Live! session.

 

Tameem Antoniades of Ninja Theory said, “Our end goal is to find ways to create fully interactive 3D experiences for future games and virtual reality experiences that feature incredible, immersive worlds and believable characters. This amazing collaboration between our teams has brought the dream a huge step forward to everyone’s benefit.”

Cubic Motion CEO Dr. Gareth Edwards stated, "The value of the new system launched at SIGGRAPH lies in live digital humans for virtual and augmented reality. Epic's Unreal Engine has been key to making this technology produce the incredible results we’re all so excited to present – and it’s been a pleasure to work alongside their team, led by Kim Libreri. Ninja Theory's compelling world of Hellblade makes a great platform to demonstrate believable digital humans, based on the world-class scanning, modeling and rigging of our friends at 3Lateral. Live-driving is an excellent way to visualize performance on set, and stereo solving pushes our quality bar even higher."

Cubic Motion Chairman Andy Wood stated, "This award to the 'Dream Team' at SIGGRAPH 2016 proved beyond doubt that animation in real-time paves the way for believable digital humans in virtual reality and augmented reality. This changes the way video games, TV programs and films are made forever."

“We’re honored to receive the Best Real-Time Graphics and Interactivity Award for the second year in a row here at Real-Time Live! during SIGGRAPH 2016,” said Libreri. “This demonstration required the teams at Epic Games, Ninja Theory, Cubic Motion and 3Lateral to challenge themselves both technically and creatively to think about the future for real-time cinematography and deliver a working example of what is now possible through the power of Unreal Engine 4 and the Sequencer tool.”


Unreal Engine 4.13

In the latest version of Unreal Engine, Sequencer, the new non-linear cinematic editor, has been updated with a slew of new features for high-end cinematography. Live recording from gameplay has been significantly improved, and you can now transfer shots and animations back and forth with external applications. Many of these features were shown in the SIGGRAPH Real-Time Live! 2016 demonstration.

Importantly, Alembic support has been added, which means one can now import complex vertex animations. The new Physical Animation Component also lets characters respond realistically to physical forces by driving their skeletal animation through motors.
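For a sense of what the Physical Animation Component looks like in use, below is a minimal C++ sketch, assuming a character whose skeletal mesh already has a physics asset; the bone name and strength values are placeholders rather than recommended settings, and the same setup can be done in Blueprint.

```cpp
// Minimal sketch of driving skeletal animation with motors via the new
// Physical Animation Component (UE 4.13). Assumes the skeletal mesh already
// has a physics asset; "spine_01" and the strengths below are placeholders.
#include "PhysicsEngine/PhysicalAnimationComponent.h"
#include "Components/SkeletalMeshComponent.h"

void EnablePhysicalAnimation(UPhysicalAnimationComponent* PhysAnim,
                             USkeletalMeshComponent* Mesh)
{
    PhysAnim->SetSkeletalMeshComponent(Mesh);

    FPhysicalAnimationData Data;
    Data.bIsLocalSimulation      = true;     // simulate relative to the animated pose
    Data.OrientationStrength     = 1000.f;   // how strongly bodies chase the animation
    Data.AngularVelocityStrength = 100.f;
    Data.PositionStrength        = 1000.f;
    Data.VelocityStrength        = 100.f;

    // Drive every body from the spine down with motors, then let them simulate,
    // so external forces (impacts, pushes) can perturb the animated performance.
    PhysAnim->ApplyPhysicalAnimationSettingsBelow(TEXT("spine_01"), Data);
    Mesh->SetAllBodiesBelowSimulatePhysics(TEXT("spine_01"), true);
}
```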

Senua Studio

Ninja Theory has launched Senua Studio, a new division offering specialised services to bring realtime virtual characters to life for stage, film, broadcast, games and VR.

Companies working with Senua Studio can rely on an entire technical pipeline - including the creation of realistic digital characters and virtual sets. The team makes innovative projects accessible to all, without the need for in-house expertise in cutting-edge animation technologies.

Nina Kristensen, co-founder and CEO of Ninja Theory, added: “With Senua Studio we are offering our considerable technological, artistic and production expertise to visionary partners excited about the impact realtime technology could have for their business”.

Senua Studio is currently operating and offering services for:

  • Live Performance: Realtime technology and expertise allows actors, music artists and performers to live drive digital characters on stage or in broadcast and interact with a live audience.
  • Realtime Cinematography: Expertise in shooting, editing and rendering final or near-final quality cinematics for games, broadcast or film in realtime, rather than the weeks or months it takes with traditional methods.
  • Pre-Visualization: With realtime cinematography, directors can see unprecedented levels of pre-visualization quality including full facial performances of the actors, lighting, vfx and camera shots. An added benefit is that the assets are optimised for use in VR experiences.
  • VR Experiences: Combining expertise in realtime virtual human technology and 16 years of games creation to deliver truly immersive VR experiences offering full interactivity not possible with offline video.

Ninja Theory has also combined its expertise in realtime virtual human technology and games experience to deliver an immersive VR experience. Below is a 360° YouTube clip, but the full VR version tracks your head relative to Senua, and as you move she maintains eye contact via special procedural controls. At first she just follows you with her eyes, but if you move your head further, she turns her head to follow.
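The layered behaviour described above, eyes tracking first and the head turning only once the viewer moves far enough, is a common procedural gaze pattern. The sketch below is a hedged illustration of the general idea, with invented thresholds; it is not Ninja Theory's actual implementation.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

// Illustrative gaze layering (not Ninja Theory's code): the eyes always aim at
// the viewer, but the head only starts turning once the viewer strays beyond a
// comfortable eye-only angle, and then only by a fraction of the excess.
struct Gaze { float eyeYawDeg; float headYawDeg; };

Gaze solveGaze(float viewerYawDeg,            // viewer direction relative to the head
               float eyeOnlyLimitDeg = 20.f,  // hypothetical comfort threshold
               float headTurnRate = 0.6f)     // fraction of excess the head absorbs
{
    Gaze g{};
    const float excess = std::max(0.f, std::fabs(viewerYawDeg) - eyeOnlyLimitDeg);
    g.headYawDeg = std::copysign(excess * headTurnRate, viewerYawDeg);
    g.eyeYawDeg  = viewerYawDeg - g.headYawDeg;   // eyes make up the remainder
    return g;
}

int main()
{
    for (float yaw : {5.f, 25.f, 60.f})
    {
        const Gaze g = solveGaze(yaw);
        std::printf("viewer %5.1f deg -> eyes %5.1f, head %5.1f\n",
                    yaw, g.eyeYawDeg, g.headYawDeg);
    }
    return 0;
}
```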

"It's freakingly good" joked Libreri. "I can't wait to see Senua interviewed at some point in a VR environment, - I just love where this is going in terms of character ...- the thing that excites me is the new forms of entertainment because things will be live and Interactive".

 

 

 

