Lenny Charny is the founder and CEO of triMirror, maker of a new real-time cloth simulator for MetaHuman-style characters in UE4. The company recently received a MegaGrant from Epic and is working hard to provide real-time clothes for Epic’s MetaHuman creations. It provides not only clothes that run in real time but also a 3D Draper software program for making new clothes and editing industry-standard file formats, producing flowing cloth that drapes and moves believably and interactively.
The material system is 100% runtime, and editing can be done via triMirror’s UE widget, which will be available in the Editor as well as during a game or simulation. Charny and the triMirror team believe that integrating their 3D Draper with the UE plugin will make it easier for end-users to create and edit fabrics, because the whole system was designed for rapid cloth prototyping. As you can read below, the company is broadening its focus from vertical integration in the fashion industry to supporting Media and Entertainment applications.
FXG: How do your cloth sims differ from the normal UE4 cloth sims?
LC: Originally we developed our cloth physics for virtual fitting, so it can be applied to a body of arbitrary size and shape. Therefore, the main difference from the UE cloth simulation is in how the cloth interacts with the avatar’s mesh. We do not use any artificial constraints or convex approximations. Instead, we use mesh-to-mesh collisions, and the distance between the avatar and the cloth can be in the sub-millimeter range. We also support natural friction between the cloth and the avatar, which allows the clothes to stay on the avatar more naturally without sliding out of position (e.g. pants and skirts). We also try to keep the cloth less stretchy (to a certain extent) to avoid a rubber-like effect.
We also support some advanced features, such as folds and iron-lines, and toggling buttons and zippers on/off.
There is another aspect to presenting clothes: the visual properties for UE rendering. So we combine both the physical and the rendering material properties for each garment pattern and provide them to the end-user in a single framework, i.e. instead of a UE material model we use our own system for a more natural physical and visual simulation.
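To give a flavour of the mesh-to-mesh contact with friction described above, here is a heavily simplified sketch in Python with NumPy. It handles a single cloth vertex against the plane of one avatar triangle; a real solver would also test barycentric containment, edge-edge cases, and many triangles. All names and constants (the 0.5 mm thickness, the friction factor, `resolve_contact`) are illustrative assumptions, not triMirror’s actual code.

```python
import numpy as np

EPS = 5e-4          # sub-millimetre collision thickness (0.5 mm), assumed
FRICTION = 0.5      # fraction of tangential velocity removed on contact

def resolve_contact(p, v, tri):
    """Push point p out of triangle tri's plane and apply friction to velocity v."""
    a, b, c = tri
    n = np.cross(b - a, c - a)
    n = n / np.linalg.norm(n)          # unit normal of the avatar triangle
    dist = np.dot(p - a, n)            # signed distance of p to the plane
    if dist < EPS:                     # penetrating, or inside the thickness shell
        p = p + (EPS - dist) * n       # project out to the contact distance
        vn = np.dot(v, n) * n          # normal component of velocity
        vt = v - vn                    # tangential (sliding) component
        v = (1.0 - FRICTION) * vt      # kill normal motion, damp the sliding
    return p, v

# Horizontal triangle with upward (+y) normal.
tri = [np.array([0.0, 0.0, 0.0]),
       np.array([0.0, 0.0, 1.0]),
       np.array([1.0, 0.0, 0.0])]

# A cloth vertex slightly below the surface, moving down and sideways.
p, v = resolve_contact(np.array([0.2, -0.001, 0.2]),
                       np.array([0.0, -1.0, 0.3]),
                       tri)
```

The friction term is what keeps pants and skirts from sliding: tangential velocity is damped on every contact, so garments settle instead of slipping along the body.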
FXG: This is real-time but is it built on a machine learning solution and thus has training time?
LC: No, we do not use any machine learning.
FXG: Is this an algorithmic solution without ML built on a more classic springs-type cloth sim?
LC: Yes, we use mass-spring simulation.
FXG: You said the system is real-time on playback, but can it be a live cloth sim driven by MoCap (Xsens-style) input, solved and used in a live situation, perhaps with a small amount of latency?
LC: The answer is yes, as long as the MoCap data is clean enough. We are keen to test a real-time animation driven by Xsens, or to just use their MoCap data.
FXG: How well does your solution deal with collision?
LC: We have been trying to strike a balance with real-time performance on consumer-grade hardware. So we have quite robust avatar-to-cloth collision, as well as some self-collision that still needs to be improved.
We also have a hybrid of physical and visual solutions such as fabric thickness, bias, and offset, which helps to achieve quite an advanced visual appearance of garment details.
FXG: What are the limits? What can’t you do yet? Have you looked at using it for metal (car) crash sims for example?
LC: No metal collisions. Currently, we have a 32k vertex limit, which historically came from using old GPUs, but it is just a matter of adapting our physics to modern GPUs.
FXG: The demo is really interesting, could you outline what H/W was used and what is required?
LC: For the physics alone we can use a rather modest GPU (GTX 770 or better). However, if one wants more GPU-intensive scenes with groomed hair, volumetric clouds, etc., that requires more GPU power. For instance, for the MetaHuman video, we used a separate GTX 1660 for the cloth simulation to show up to three garments, plus an RTX 3090 for the groomed hair, and still saw some slowdown when we added volumetric clouds.
FXG: What is your target audience, who are you building this for primarily?
LC: Originally, we were targeting fashion designers and vertically integrated brands and retailers (both in-store and online). Now we also see great interest from the movie/animation and game developers’ communities.
FXG: When did you start on this project?
LC: triMirror started over 10 years ago. About 2.5 years ago we started using UE for better fabric rendering and gradually continued with integrating our physics with UE while building a stand-alone UE-based application.
FXG: How many people are on the team working on this?
LC: Currently we have a team of 3-4 developers working on this project, but have at times expanded up to a dozen depending on projects and customer needs.
FXG: Where are you based?
LC: triMirror is based in Toronto, Canada but everyone works remotely from different cities and countries.
FXG: What is the greatest limiting factor: the GPU, memory, or perhaps the complexity of the clothes (multiple intersecting layers)?
LC: The complexity of clothes and multiple layers remains a challenge but certainly can be resolved in the future.
We have also been looking at the possibility of a UE-based SaaS cloud solution. From a technical standpoint, there are no issues there.
FXG: Given you started with a focus on the fashion industry, can you integrate with typical clothes-design programs such as CLO 3D, Marvelous Designer, etc. for ePattern delivery, and are you using the DXF-AAMA/ASTM file format developed by the American Apparel Manufacturers Association?
LC: Integration with CLO/MD, Browzwear, or another solution is definitely possible, and some companies have inquired about it, so this is something for future development.
Indeed, we already use the DXF-AAMA format to import CAD patterns into our 3D Draper application.
However, right now we are focusing on making it possible for animators to create their own garments in the UE environment using our 3D Draper software.