Real Time Mike!

We have had loads of emails asking us to post a clip, so today we released a 30fps real-time rendered clip from MEETMIKE here at SIGGRAPH. We will publish an in-depth story after the show (and after we get some sleep). This remarkable journey has been made possible by a brilliant group of collaborators (and nine PCs, each with 32GB of RAM and an Nvidia GTX 1080 Ti card). If you are at SIGGRAPH, we are doing demos each day from 10:30 to 12:30 (and, if you are lucky, you can pick up a T-shirt).

Video below: renders in real time on one PC (30fps)

MEETMIKE showcases the latest research in digital human technology, with leading industry figures interviewed live and in real time by a photo-realistic avatar in a ‘virtual set in Sydney’, presented at the VR Village at SIGGRAPH 2017 in Los Angeles.

Crowds on Day 2

At the VR Village, the SIGGRAPH audience gets to witness MEETMIKE: fxguide co-founder Mike Seymour interviewing leading industry figures in VR, in real time and in stereo, using Epic’s Unreal Engine.

Mike in the Technoprops Stereo HMC rig

Each day of the trade show, digital Mike is meeting digital versions of industry legends and leading researchers from around the world. Together they conduct interviews in “Sydney” via virtual reality, which can be watched either in VR or on a giant screen. The event is a key part of a new research project into virtual humans as Actors, Agents and Avatars.

2K rendered at 30fps

The sessions will provide valuable data and insights for taking digital human research to the next level: “The project aims to explore the creation and acceptance of digital virtual humans.” To achieve the VR experience, nine high-end graphics computers run synced instances of Unreal Engine, allowing for the highest-quality real-time digital human avatars created to date.
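
The post does not detail how those nine machines are kept in step, but a common pattern for multi-machine real-time rendering is to broadcast a compact, timestamped animation-state packet each frame so every node renders the same state locally, rather than shipping video. The sketch below is purely illustrative: the struct layout, field names and sizes are assumptions, not the MEETMIKE protocol.

```cpp
// Minimal sketch (not the MEETMIKE code): keep several render nodes in sync
// by broadcasting a small, timestamped animation-state packet each frame and
// letting every machine render that same state locally.
// All struct names and sizes here are illustrative assumptions.
#include <cstdint>
#include <cstring>
#include <iostream>
#include <vector>

struct FrameState {
    uint64_t frameIndex;             // monotonically increasing frame number
    double   timeSeconds;            // master clock time for this frame
    float    blendshapeWeights[750]; // solved facial controls (750 per the tech facts below)
    float    headPose[7];            // position (xyz) + orientation quaternion (xyzw)
};

// Serialize the frame state into a flat byte buffer suitable for broadcast to
// the render nodes. A real system would add versioning, compression and
// drop-out handling on top of this.
std::vector<uint8_t> Pack(const FrameState& state) {
    std::vector<uint8_t> buffer(sizeof(FrameState));
    std::memcpy(buffer.data(), &state, sizeof(FrameState));
    return buffer;
}

FrameState Unpack(const std::vector<uint8_t>& buffer) {
    FrameState state{};
    std::memcpy(&state, buffer.data(), sizeof(FrameState));
    return state;
}

int main() {
    FrameState state{};
    state.frameIndex = 1;
    state.timeSeconds = 1.0 / 90.0;     // one frame at 90fps
    state.blendshapeWeights[0] = 0.42f; // e.g. a jaw-open control

    std::vector<uint8_t> wire = Pack(state); // "send" to each node
    FrameState received = Unpack(wire);      // each node "receives" it

    std::cout << "Frame " << received.frameIndex
              << " packet size: " << wire.size() << " bytes\n";
    return 0;
}
```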

This special event is a collaboration between teams from around the world, spanning four continents, three universities and six companies, along with the Wikihuman global research project. The project involves best-in-class scanning, rigging and real-time rendering.

Monitor at the show, showing 60fps live (see stats on screen)

This unique experience at SIGGRAPH 2017 involves several of the companies that won last year’s SIGGRAPH 2016 Real Time Live award. Yet unlike last year, the digital humans presented here are rendered in stereo VR at 90fps, with new technology and fidelity. Additionally, commercial teams from China (Tencent) and San Francisco (Loom.ai) are involved.

Rendered at 30fps above, and at 90fps in stereo at SIGGRAPH

“We are proud to be able to help bring this mix of Virtual Production, Digital Humans and Virtual Reality to SIGGRAPH as part of an amazing global research effort into digital Actors, Agents and Avatars,” commented Mike Seymour. The aim is to research responses to digital humans, which will someday enable virtual assistants to be deployed in contexts ranging from health care, aged care and education to, of course, entertainment.

The VR experience runs on Unreal Engine in concert with custom HTC VIVE hardware. Two avatars are shown: one is a personalized 3D digital face avatar made for each guest from a single still photograph, using special AI algorithms.

The other was created from extensive face scanning in the Lightstage at USC ICT, with further contributions from research teams such as Disney Research Zurich, which produced the eye scanning data, to achieve a live, real-time rendered digital face. Both the face of the guest and that of the host (Digital Mike) are tracked and solved in real time, again using deep learning tools, from Cubic Motion in the UK.
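
Cubic Motion’s production solver is proprietary and deep-learning based, but the idea of a facial “solve” can be illustrated with a toy example: find rig control values (here, two blendshape weights) that best reproduce tracked landmark positions under a simple linear model. Everything below, from the tiny landmark basis to the gradient-descent loop, is a hypothetical stand-in, not the real system.

```cpp
// Toy illustration of a facial "solve" (not Cubic Motion's method): given
// tracked 2D landmark positions, find blendshape weights w that minimise
// || L(w) - targets ||^2, where L(w) = neutral + Basis * w is a tiny
// hand-made linear landmark model. Solved here by plain gradient descent.
#include <array>
#include <iostream>

constexpr int kLandmarks = 4;   // 2D landmarks -> 8 scalar coordinates
constexpr int kShapes    = 2;   // two toy blendshapes: "jaw open", "smile"
constexpr int kDims      = kLandmarks * 2;

// Neutral landmark positions and per-blendshape landmark deltas (made up).
const std::array<double, kDims> kNeutral = {0,0, 1,0, 0,1, 1,1};
const std::array<std::array<double, kDims>, kShapes> kBasis = {{
    {0,0, 0,0, 0,-0.3, 0,-0.3},   // jaw open pulls the lower landmarks down
    {-0.1,0.1, 0.1,0.1, 0,0, 0,0} // smile spreads the upper landmarks
}};

// Evaluate the linear landmark model for a given weight vector.
std::array<double, kDims> Evaluate(const std::array<double, kShapes>& w) {
    std::array<double, kDims> out = kNeutral;
    for (int s = 0; s < kShapes; ++s)
        for (int d = 0; d < kDims; ++d)
            out[d] += w[s] * kBasis[s][d];
    return out;
}

int main() {
    // "Tracked" landmarks for this frame: the neutral face with the jaw
    // half open (the ground-truth weights would be {0.5, 0.0}).
    std::array<double, kDims> tracked = Evaluate({0.5, 0.0});

    std::array<double, kShapes> w = {0.0, 0.0};
    const double step = 0.1;
    for (int iter = 0; iter < 200; ++iter) {
        std::array<double, kDims> current = Evaluate(w);
        // Gradient of the squared error with respect to each weight.
        for (int s = 0; s < kShapes; ++s) {
            double grad = 0.0;
            for (int d = 0; d < kDims; ++d)
                grad += 2.0 * (current[d] - tracked[d]) * kBasis[s][d];
            w[s] -= step * grad;
        }
    }
    std::cout << "Solved weights: jaw=" << w[0] << " smile=" << w[1] << "\n";
    return 0;
}
```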

This is then used to drive state-of-the-art facial rigs built to run in real time by 3Lateral. Additional deep learning is used by Loom.ai for 3D facial reconstruction of the guest’s avatar.

Guests live on stage include leading industry figures from Pixar, Weta Digital, Magnopus, Disney Research Zurich, Epic Games, USC-ICT, Disney Research and Fox Studios. The event is live each day of SIGGRAPH at the VR Village. (A full schedule is available at www.fxguide.com/meetmike/)

The technology uses several AI deep learning engines for tracking, solving, reconstructing and recreating the host and his guests. “It is truly a team effort with the teams working around the world in places such as North Carolina, Serbia, Manchester, China, San Francisco and Sydney. The research into the acceptance of the technology is being done by Sydney University, Indiana University and Iowa State University,” comments University of Sydney Professor Kai Riemer, who leads the academic team collecting research data at the event.

The project is also unique in that it builds on the Wikihuman project, a non-profit team of practitioners and researchers who are not only researching digital humans but also sharing their findings and most of their data for anyone to use non-commercially.

Host Mike Seymour was scanned as part of the Wikihuman project at USC-ICT, with additional eye scanning done at Disney Research Zurich. The graphics engine and real-time graphics are a custom build of Unreal Engine. The face tracking and solving are provided by Cubic Motion in Manchester. The state-of-the-art facial rig was made by 3Lateral in Serbia. The complex new skin shaders were developed in partnership with Tencent in China. The guests’ avatars are made from single still images by Loom.ai in San Francisco.

Technical facts:

  • MEETMIKE has about 440,000 triangles being rendered in real time, which means each VR stereo frame is rendered in about 9 milliseconds; roughly 75% of those triangles are used for the hair.
  • Mike’s face rig uses about 80 joints, mostly for the movement of the hair and facial hair.
  • For the face mesh itself, only about 10 joints are used, for the jaw, the eyes and the tongue, in order to add more arc to their motion.
  • These are combined with around 750 blendshapes in the final version of the head mesh (a minimal evaluation sketch of this kind of rig follows this list).
  • The system uses complex traditional software design alongside three deep learning AI engines.
  • Mike’s face is captured with a state-of-the-art Technoprops stereo head rig with IR computer vision cameras.
  • The university research studies into acceptance are intended for publication at future ACM conferences.
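
As a rough illustration of how a blendshape-plus-joints rig of the kind described above is typically evaluated each frame, the sketch below accumulates weighted blendshape deltas on a neutral mesh and then applies joint transforms via linear blend skinning. The three-vertex “mesh”, single shape and single joint are illustrative placeholders, not 3Lateral’s actual rig data.

```cpp
// Minimal sketch of per-frame evaluation for a blendshape-plus-joints facial
// rig: deformed = skin(neutral + sum_i w_i * delta_i). Mesh sizes, joint
// counts and the tiny data below are illustrative, not the MEETMIKE rig.
#include <array>
#include <iostream>
#include <vector>

struct Vec3 { float x = 0, y = 0, z = 0; };

Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 operator*(float s, Vec3 v) { return {s * v.x, s * v.y, s * v.z}; }

// A joint's skinning transform: 3x3 rotation/scale plus a translation.
struct JointTransform {
    std::array<std::array<float, 3>, 3> rot = {{{1,0,0},{0,1,0},{0,0,1}}};
    Vec3 trans;
};

Vec3 Apply(const JointTransform& j, Vec3 p) {
    return {
        j.rot[0][0]*p.x + j.rot[0][1]*p.y + j.rot[0][2]*p.z + j.trans.x,
        j.rot[1][0]*p.x + j.rot[1][1]*p.y + j.rot[1][2]*p.z + j.trans.y,
        j.rot[2][0]*p.x + j.rot[2][1]*p.y + j.rot[2][2]*p.z + j.trans.z};
}

struct Blendshape {                 // sparse per-vertex deltas for one shape
    std::vector<int>  vertexIds;
    std::vector<Vec3> deltas;
};

struct SkinWeight { int joint; float weight; };  // per-vertex joint influence

int main() {
    // A "mesh" of three vertices standing in for the full face mesh.
    std::vector<Vec3> neutral = {{0,0,0}, {1,0,0}, {0,1,0}};

    // One toy blendshape ("jaw open") moving vertex 2 down.
    Blendshape jawOpen{{2}, {{0,-0.4f,0}}};
    float jawOpenWeight = 0.5f;          // solved control value for this frame

    // One toy joint translating everything it influences.
    std::vector<JointTransform> joints(1);
    joints[0].trans = {0, 0, 0.1f};
    std::vector<std::vector<SkinWeight>> skin = {
        {{0, 1.0f}}, {{0, 1.0f}}, {{0, 1.0f}}};

    // Per-frame evaluation: blendshapes first, then linear blend skinning.
    std::vector<Vec3> deformed = neutral;
    for (size_t k = 0; k < jawOpen.vertexIds.size(); ++k) {
        int v = jawOpen.vertexIds[k];
        deformed[v] = deformed[v] + jawOpenWeight * jawOpen.deltas[k];
    }
    for (size_t v = 0; v < deformed.size(); ++v) {
        Vec3 skinned;
        for (const SkinWeight& sw : skin[v])
            skinned = skinned + sw.weight * Apply(joints[sw.joint], deformed[v]);
        deformed[v] = skinned;
    }

    for (const Vec3& p : deformed)
        std::cout << p.x << " " << p.y << " " << p.z << "\n";
    return 0;
}
```
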
Behind the scenes.

Founded in 1991, Epic Games is the creator of Unreal Engine, Gears of War, Robo Recall and the Infinity Blade series of games. Today Epic is building Paragon, Fortnite, SPYJiNX and Battle Breakers. Epic’s Unreal Engine technology is used by teams of all sizes to ship visually stunning, high-quality games and experiences across PC, console, mobile, VR and AR platforms. Developers are increasingly choosing Unreal Engine for visualization, design, film, television and simulation. Download Unreal Engine for free at unrealengine.com.

3Lateral is built around a passion for creating characters and creatures, with a special focus on developing technologies that enable seamless, real-time digitization of humans; they provided the complex rigging.

Cubic Motion is a team of internationally acclaimed computer vision researchers, engineers and an experienced production crew. Their innovations include world-leading technologies for precise tracking in markerless video, stereo and depth data.

Tencent / Next Innovation: Tencent’s exceptional R&D team of industry real-time and graphics researchers has worked to improve both the detail and realism of the digital facial skin.

The Wikihuman research group is a non-profit group dedicated to studying, understanding, challenging and sharing knowledge of digital or virtual humans.

Loom.ai is building a machine learning platform for creating personalized and expressive 3D digital face avatars from photographs. They license the API to businesses in mobile AR, messaging, games, social VR and e-commerce. Founded by an Oscar-winning team of visual effects veterans from DreamWorks and Lucasfilm, Loom.ai aims to make virtual humans possible without expensive specialist scanning.

Christophe Hery (Pixar), render expert.
