MetaHuman Creator brings the new option of Mesh to MetaHuman

Following on from the release of UE5 in April, Epic Games has released a major update to its MetaHuman Creator. This new early release allows users to import their own mesh and create photorealistic digital humans, complete with hair and clothing. In addition to the new Mesh to MetaHuman feature, there is support for the new character rigging, animation, and physics features introduced with UE5.

The major feature of the new MetaHuman Plugin for Unreal Engine is Mesh to MetaHuman. It enables users to take their own custom facial mesh and convert it into a MetaHuman, fully rigged and ready to animate. This overcomes a huge limitation: since the first release of MetaHumans, users have been asking for tools to match a real person. Until now, the only characters users could create were manual variations of the base MetaHumans. With this new feature, rather than trying to match a face by eye using only the MetaHuman Creator tools, users can build a MetaHuman directly from a scan.

This will open up the use of MetaHumans enormously. In addition to providing a tool to make a new MetaHuman look like a specific person, that same character will now be built on a unified, default rig, enabling all MetaHuman animations and simulations to be applied without special retargeting. Rigging has always been an interchange issue, as there is no industry standard. By using the Epic MetaHuman as a base, users get vast interchangeability with all the other data and metadata on offer for MetaHumans. This should lead to a huge explosion in digital humans.

The new tools go much further than a simple import function. Having imported a mesh, users can refine their character in MetaHuman Creator with interactive sliders that blend facial regions between the imported solution and the closest standard pre-existing MetaHuman. This is artist-friendly, and it allows users to see just how their scan aligns with the closest default MetaHuman character.

We have been working with the Light Stage Scanned ‘Meet Mike’ head, seen here in Maya.

Users will start with a textured mesh, typically created using scanning, sculpting, or traditional modelling tools. Mesh to MetaHuman uses automated landmark tracking in UE5 to fit the MetaHuman topology template to it, combining it with a body type from the MetaHuman options. This template is then submitted to the cloud, where it is matched to a best-fit MetaHuman derived from the Epic database. This mesh is then used to drive the facial rig, while the deltas from Epic’s original mesh provide the offsets to create the unique look of the original scan or model.
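In outline, the result can be pictured as Epic's template mesh plus per-vertex offsets, blended per facial region between the closest database MetaHuman and the wrapped scan. The sketch below is purely illustrative (hypothetical function and parameter names, not Epic's actual pipeline), and it assumes the template, the database best fit, and the wrapped scan all share the MetaHuman topology, with region masks that sum to one per vertex:

```python
import numpy as np

def mesh_to_metahuman_fit(scan_vertices, template_vertices, best_fit_vertices,
                          region_masks, region_sliders):
    """Conceptual sketch of the Mesh to MetaHuman fit (not Epic's code).

    scan_vertices     : (N, 3) template-topology vertices wrapped onto the scan
    template_vertices : (N, 3) vertices of Epic's base MetaHuman template
    best_fit_vertices : (N, 3) vertices of the closest database MetaHuman
    region_masks      : dict of region name -> (N,) per-vertex weights in [0, 1],
                        assumed to sum to 1 across regions for each vertex
    region_sliders    : dict of region name -> slider value in [0, 1]
                        (0 = closest preset MetaHuman, 1 = imported scan fit)
    """
    # Deltas from Epic's template encode the unique look of the scan,
    # and of the best-fit preset, respectively.
    scan_deltas = scan_vertices - template_vertices
    preset_deltas = best_fit_vertices - template_vertices

    result = template_vertices.copy()
    for region, mask in region_masks.items():
        t = region_sliders.get(region, 1.0)
        # Per-region blend between the preset solution and the scan fit.
        blended = (1.0 - t) * preset_deltas + t * scan_deltas
        result += mask[:, None] * blended
    return result
```

With all sliders at 1 this reproduces the wrapped scan; with all sliders at 0 it collapses back to the closest preset MetaHuman, which is essentially what the interactive region sliders let an artist explore.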

Note the regions of the face that allow for secondary adjustment when the mesh is imported.

Having transformed a mesh into a MetaHuman, users can download it or open it in MetaHuman Creator, where they can preview it through various pre-loaded animations and see it with a range of lighting setups, including six new options.

The unlit result

While some elements, such as hair and skin textures, will need to be reauthored or reapplied either in MetaHuman Creator or in another app, Mesh to MetaHuman means users can get a starting point for a fully rigged digital human from their own unique static mesh.

The whole process takes only a few minutes and works extremely well. Some things, such as hair, may need to be redone in MetaHuman Creator. But the critical point is how significantly easier it is to get a likeness. The MetaHuman Creator tools already allow a user to edit a preset character, but trying to use those tools alone to make a character look like an actual person was extremely difficult. One could spend hours and not get anything close to what can be done in a few minutes with a scan import. The difference is vast.

The new MetaHuman Plugin for Unreal Engine is available now, and below is a tutorial on how it works.

 

Compatibility with UE5’s new character tools

UE5 came with a set of new character rigging and animation features, including artist-friendly authoring and retargeting toolsets; these now all work with MetaHumans.

MetaHumans are built on top of Control Rig, Unreal Engine’s built-in rigging system that enables users to quickly and easily create rigs and share them across multiple characters. In UE5, Control Rig is now fully released as production-ready, and it offers a range of new features such as Space Switching, which enables users to dynamically reparent Controls to suit different circumstances.

Control Rig is integrated with Sequencer, where users can animate characters and save and apply poses with the new UE5 Pose Tool. This includes a number of new facial poses that have also been added to MetaHuman Creator.

UE5 also offers an entirely new artist-friendly toolset for retargeting animations. Since MetaHumans come complete with IK rigs, it’s something they can take full advantage of.

Using IK Retargeter, users can quickly and robustly transfer animations between all MetaHumans and other characters, even if they have different body proportions. MetaHumans are now compatible with both UE4 and UE5 mannequins, opening up access to thousands of existing animations on Unreal Engine Marketplace. For example, one could replace the mannequins in the Lyra Starter Game with MetaHumans.
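The key idea behind this kind of retargeting is that animation is transferred between named chains of bones rather than individual joints, so skeletons with different proportions or bone counts can share the same motion. The sketch below is a simplified illustration of that idea using hypothetical types and helpers; it is not the actual UE IK Retargeter API:

```python
from dataclasses import dataclass

@dataclass
class RetargetChain:
    name: str          # e.g. "Spine", "LeftArm"
    bones: list[str]   # ordered bone names belonging to this chain

def build_chain_map(source_chains, target_chains):
    """Pair up source and target chains by name, ignoring unmatched ones."""
    targets = {c.name: c for c in target_chains}
    return [(s, targets[s.name]) for s in source_chains if s.name in targets]

def retarget_pose(chain_map, source_pose):
    """Copy per-chain rotations from a source pose onto a target skeleton.

    source_pose maps bone name -> local rotation. Sampling the source chain
    proportionally along its length keeps the motion plausible even when the
    two chains have different bone counts and lengths (a big simplification
    of what a production retargeter does).
    """
    target_pose = {}
    for src, tgt in chain_map:
        rotations = [source_pose[b] for b in src.bones if b in source_pose]
        if not rotations:
            continue
        for i, bone in enumerate(tgt.bones):
            # Sample the source chain proportionally along the target chain.
            j = int(i * len(rotations) / len(tgt.bones))
            target_pose[bone] = rotations[min(j, len(rotations) - 1)]
    return target_pose
```

The chain-name mapping is what makes a MetaHuman, a UE4 mannequin, and a UE5 mannequin interchangeable targets: as long as both sides expose chains like "Spine" or "LeftArm", the differences in bone naming and proportion are handled inside the retargeter.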

Another new feature of the IK Rig is that it enables users to interactively create solvers and goals that perform pose editing for one’s skeletal meshes. A common use case is to adjust a character additively while maintaining the existing animation, such as having a MetaHuman always look at a particular target while walking along.
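To make the look-at example concrete, the adjustment can be thought of as a small, clamped rotation layered additively on top of the walk cycle each frame. The helper below is a hypothetical sketch of that idea (the names and parameters are assumptions, not the IK Rig API):

```python
import numpy as np

def additive_look_at(head_position, head_forward, target_position,
                     max_angle_deg=60.0, weight=1.0):
    """Return an (axis, angle) correction to layer over the existing animation
    so the head aims at the target, clamped to a believable range."""
    to_target = target_position - head_position
    to_target = to_target / np.linalg.norm(to_target)
    forward = head_forward / np.linalg.norm(head_forward)

    # Angle and axis between where the head is looking and where it should look.
    cos_angle = np.clip(np.dot(forward, to_target), -1.0, 1.0)
    angle = np.degrees(np.arccos(cos_angle))
    axis = np.cross(forward, to_target)
    norm = np.linalg.norm(axis)
    if norm < 1e-6:
        return np.zeros(3), 0.0  # already looking at the target

    # Clamp and scale so the correction stays additive and subtle,
    # leaving the underlying walk animation untouched.
    angle = min(angle, max_angle_deg) * weight
    return axis / norm, angle
```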

With Chaos now the default physics engine in UE5, MetaHumans ship with active ragdoll physics. This means that they will target believable poses and even try to protect their heads when falling. There is also a new level added to the MetaHuman Sample project that demonstrates how to use the new physics assets and transfer animations within an Unreal Engine project.

Also, since Quixel Bridge is now integrated into the Unreal Editor, adding MetaHumans to a UE5 project is much simpler. This adds up to more believable real-time animated characters in less time.

New features for MetaHuman Creator

In addition to mesh import, there are major new features in MetaHuman Creator, both fun and useful.

Firstly, there are ten new facial animation loops that users can use to instantly bring a MetaHuman creation to life. These are great for evaluating how well a character’s deformations perform with different expressions, without having to download it first.

There are also six new body poses to put a character through its paces: everything from being caught in the middle of a high hurdle to trying out a contemporary dance move. There are also five new facial poses that focus on different emotions.

For crowd work, MetaHuman Creator has more clothing options, including the ability to customize the colors of areas within fabric patterns, such as polka dots, stripes, or paisley.

Hair also gets much better control and more options. There are 13 new hairdos, together with additional styles of beards, mustaches, eyebrows, and eyelashes, for a total of 23 new grooms. There are more color and highlight features, and the ability to separately color different areas of the Mohawk, Faux Mohawk, and Pulled Back styles.


 

New lighting presets for Unreal Engine

In January, Epic posted six lighting presets from MetaHuman Creator, designed by Oscar-winning cinematographer Greig Fraser. These have now been expanded with three lighting presets designed in relation to The Matrix Awakens: An Unreal Engine 5 Experience. Plus, the original six have been updated to be compatible with UE5. You can download the updated lighting pack from the UE Marketplace.