Epic Games has long pursued real-time virtual humans, each year showing progressively more realistic digital people running in UE4. The latest real-time example of this cutting-edge work is codenamed “Siren,” created by Epic Games, Cubic Motion, 3Lateral, Vicon and Tencent.
State of Unreal
At the State of Unreal keynote on Wednesday morning, Tim Sweeney, Founder and CEO, started by discussing mobile gaming, both at the AAA level and in the growing indie smartphone scene. The market is huge: around 100,000 games are released via app stores each year. But Sweeney believes there is “currently a flight to quality.” He showcased a number of console-quality games on smartphones, including Rocket League, PUBG and of course Epic’s own monster hit Fortnite for mobile.
Fortnite on the iPhone is the full version of the game and completely compatible with the original PC version. Most significantly, Sweeney highlighted cross-play, with mobile players playing alongside friends on the PC version. In Fortnite, players can team up with friends even when everyone is on a different hardware platform.
GDC 2018 is a huge event for Epic. A year ago the company had 3 million downloads of the UE4 engine; this year that has grown to 5 million. Users and developers are using UE4 across a wide variety of industries, not just gaming, which has led to what Sweeney called “full employment for UE4 developers”.
On the Epic GDC stand (in addition to a mechanical Fortnite Llama, built on a modified ex-riding bull), there is a whole area devoted to helping companies connect with potential teams.
The State of Unreal talks also highlighted that Epic is now working with Magic Leap. Epic has released UE4 support for the Magic Leap SDK, along with a custom Unreal Editor, documentation and a sample project to help developers start creating new content using spatial computing.
Paragon assets were also released to the public. Over $12 million worth of characters and environments from Paragon have been released free of charge for use in any UE4 project.
The initial release is available now in the Unreal Engine Marketplace, and Epic will release millions of dollars worth of additional Paragon content this spring and summer. This is more than just a set of test assets: these production assets will no doubt be reused in new stories and professional projects. The point was underlined by Epic showing a clip from a feature-length animated film made by a small team in Pakistan. The Third World Studios film, Gluco, Allahyar & The Legend of Markhor, was made with just 60 artists and reused previously released Epic assets, such as the terrain from the Epic short film Kite.
Digital Humans: Siren
The history of digital human demos from Epic starts with the Hellblade demo, which went on to win Real-Time Live at SIGGRAPH; the actual game has just received 9 BAFTA Games nominations. Most recently, at SIGGRAPH 2017, there was digital MIKE, based on our own Mike Seymour. Today Epic showed Alexa Lee stepping up, literally, as the newest fully digital real-time performer for Siren. The actress is appearing daily at GDC as a high-fidelity, full performance character driven in real-time in UE4.
In the case of the Hellblade and MIKE demos, the digital human was a virtual representation of the person driving the rig. In Alexa’s case, her performance is retargeted onto a different digital character, that of Chinese actress Bingjie Jiang, who was scanned and modeled last year.
To create the interactive demo, actress Alexa Lee wears a head-mounted camera rig and a full body motion capture suit. Her face is animated via Cubic Motion’s markerless tracking, solving to a 3Lateral rig. Using Vicon’s new Shōgun 1.2 software, her body and finger movements are captured on one screen while the data is streamed into Unreal Engine using Vicon’s new Live Link plugin. On the output screen, the Siren character, created using the likeness of the Chinese actress, moves in sync, driven in-engine at 60 frames per second.
Siren uses the new Cubic Motion modified Technoprops headgear to do markerless facial tracking. The new rig moves away from a stereo pair of cameras in front of the face to a new configuration of side and front cameras. This dual computer vision input is then interpreted into 3D FACS AUs for driving the 3Lateral rig. This approach is producing much more accurate lip sync than ever before, even in the highly challenging real-time environment where the face needs to be tracked, solved, animated, blended with the body and rendered in 15 milliseconds.
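To put that 15 millisecond figure in context, here is a small back-of-the-envelope sketch of the frame budget. Only the 60 fps target and the ~15 ms total come from the pipeline described above; the per-stage split is a hypothetical allocation for illustration, not Epic's or Cubic Motion's actual numbers.

```python
# Rough frame-budget arithmetic for a real-time face pipeline.
# The 60 fps target and ~15 ms total are from the article; the
# per-stage millisecond split below is purely illustrative.

FPS = 60
frame_ms = 1000.0 / FPS  # time available per frame at 60 fps (~16.67 ms)

# Hypothetical allocation across the stages named in the text:
budget_ms = {
    "track": 3.0,    # locate facial features in the head-cam images
    "solve": 3.0,    # map tracked features to FACS AUs / rig controls
    "animate": 2.0,  # evaluate the facial rig
    "blend": 1.0,    # blend the face solve with the body capture
    "render": 6.0,   # shade and present the frame in-engine
}

total = sum(budget_ms.values())   # 15.0 ms for the whole pipeline
headroom = frame_ms - total       # slack left before frames start dropping

print(f"frame: {frame_ms:.2f} ms, pipeline: {total:.1f} ms, "
      f"headroom: {headroom:.2f} ms")
```

The point of the arithmetic is simply that a 15 ms pipeline leaves under 2 ms of slack per frame at 60 fps, which is why every stage has to be engineered for latency.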
3Lateral in Serbia also handled all the 3D and 4D scans of the actress. (See our separate story on the new innovations at 3Lateral and their separate test with Andy Serkis.)
The Siren project began as a collaboration between Epic and Tencent, with the goal of creating a proof-of-concept demonstration to show both the capabilities of UE4, and what the next generation of digital characters will look like. To handle the performance capture for the character, Epic and Tencent utilized Vicon’s Vantage optical motion capture system, Shōgun and VUE video cameras to capture precise and authentic movement and to add the character animations over the reference footage in real time.
To ensure the highest possible fidelity, Vicon solved the body directly onto the custom Siren skeleton, removing the complex retargeting step. It also developed a new algorithm to help realistically animate the fingers.
“When we began working on Siren, we knew from the beginning that it was going to push several boundaries. To make this possible we needed the best motion capture hardware and software,” said Kim Libreri, Epic Games’ CTO. “We use Vicon systems on our own stage, so we knew right away that they were the best choice for this project.”
Cubic Motion, which has been a key partner in the earlier Epic UE4 digital human projects, also announced today the first step in transforming its business model: making its technology platform available for licensing. Cubic Motion’s computer vision technology has been used widely by producers and filmmakers to create digital facial animation, saving the time and cost of animating it by hand. The technology tracks more than 200 facial features at over 90 frames per second and automatically maps this data to extremely high-quality digital characters in real-time.
“We are offering the keys to unlock a virtual world, enabling content producers and game developers to more easily interact with our technology and streamline the creation process for performance-driven real-time digital humans,” said Andy Wood, Chairman of Cubic Motion. The uses of the new technologies such as the ones demonstrated today will be vast. “CG films and TV productions will be directed in real-time, just the same as it is today with video cameras in the real world. Digital humans can be broadcast into games (facially driven by live actors or ordinary people in real-time),” Wood added.
“Cubic Motion’s breakthrough computer vision and facial animation technology was one of the keys to our success,” said Libreri. “Creating believable digital characters that you can interact with and direct in real time is one of the most exciting things that has happened in the computer graphics industry in recent years.”
The actual face is rendered at 60 fps in real time and includes subtle facial hair and new advanced skin shaders from the UE4 team. Tencent and Epic’s engineers have introduced new backscatter algorithms and new dual specular lobes. The glossy specular layer represents light that is reflected immediately at the boundary of the skin. The second specular lobe sits underneath the primary specular lobe and is attenuated by it.
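The dual-lobe idea can be sketched in a few lines. This is a minimal illustration of the layering described above, not Epic's actual shader code: the microfacet distribution, roughness values and blend weight are all assumptions, and the attenuation term is just one simple way to model the second lobe sitting "underneath" the first.

```python
import math

def ndf(n_dot_h, roughness):
    """Simple GGX-style normal distribution term (illustrative)."""
    a2 = roughness ** 2
    d = n_dot_h ** 2 * (a2 - 1.0) + 1.0
    return a2 / (math.pi * d * d)

def dual_lobe_specular(n_dot_h, rough_primary=0.35, rough_secondary=0.6,
                       secondary_weight=0.5):
    """Two specular lobes: a glossy lobe at the skin boundary, plus a
    rougher lobe underneath, attenuated by the primary lobe above it.
    All constants here are hypothetical, chosen for illustration."""
    primary = ndf(n_dot_h, rough_primary)
    # Light reflected by the primary lobe never reaches the lobe
    # beneath it, so damp the secondary lobe by the primary response.
    attenuation = 1.0 / (1.0 + primary)
    secondary = ndf(n_dot_h, rough_secondary) * attenuation
    return primary + secondary_weight * secondary
```

Near the mirror direction (n_dot_h close to 1) the tight primary lobe dominates and suppresses the secondary lobe; away from it, the broader secondary lobe contributes relatively more, giving skin its characteristic soft sheen around a sharp highlight.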
Another example of the improved skin approach is the new screen-space irradiance: this means, for example, that light hitting the skin on the cheek bounces from the cheek into Siren’s eye. This provides greater realism and stunning visuals. (This demo does not use ray tracing – see our separate story on the ILM xLAB demo.)
For those at GDC this week, Siren is streaming daily at the Vicon booth (#241).