Visual Disruptors Podcast #10: Matt Workman

In the latest podcast in our Visual Disruptors series, Mike Seymour talks to Matt Workman.

Listen on Apple iTunes here.

Matt Workman

Matt is part cinematographer, part game developer, part previs expert. After getting started on music videos, he honed his camera craft in New York City, shooting commercials and content for clients like BMW, L'Oréal, Google, and Facebook, before turning his hand to software development.

He’s the founder of Cinematography Database, a cinematography media and education company, and the creator of Cine Designer for Cinema 4D, a product for visualizing real-world camera work and lighting in 3D.

In this podcast, Mike Seymour talks to Workman about his innovative new Unreal Engine-based product, Cine Tracer. Workman discusses how he's designed the realistic cinematography simulator to be accessible to non-3D users, what it brings to virtual production workflows, and how he sees it developing in the future as hardware and engine technology evolve. You can listen to the full podcast above, or read on for an overview.

Although Workman’s first software product Cine Designer was for users of 3D content creation program Cinema 4D, with Cine Tracer he’s trying to make a solution for people who don’t have the time or inclination to learn a 3D program. He sometimes describes it as a game, drawing parallels with Fortnite Creative, which helps to make it seem more accessible and less intimidating to non-3D people. The fact that the Early Access version of Cine Tracer is only available on Steam reinforces this perception.

At its most basic, Cine Tracer enables you to quickly and easily build, populate, and light a set, select a camera from a range of real-world replicas, look through it to position it, and take a picture, which goes to a storyboard. What makes it special, however, is that it is remarkably accurate. The built-in intelligence limits you to doing things that are replicable in real life. So, for example, the virtual tripod heads only move in ways that real tripod heads do; a camera on a dolly track stays at a fixed height as you move it forward or backward; lenses have finite zoom ranges; and so on.

 

Bringing believability to scenes with ray tracing

With the advent of real-time ray tracing in Unreal Engine 4.22, the accuracy of the lighting is even more impressive, and that's something filmmakers value particularly highly since it contributes so much to the believability of the scene. “Cinematographers use light to shape the emotion,” explains Workman. “Now that light behaves as it does in the real world, or much closer to it, you can actually believe the scene a little bit more and it can actually feel like something.”

Conversely, when the lighting is wrong in the previs, the cinematographers can’t engage with it. “One of the big things is light coming through windows into a room; it’s like the aperture of the world,” says Workman. “You’re in an interior, the windows are what bring the light in and so much tone is set by that. If the shadows aren’t correct—coming through the window, and coming through trees, and hitting the wall—the whole thing falls apart. If you showed a cinematographer virtual light like that, instantly, they wouldn’t like it. When you turn this [ray tracing] on, the light coming into the room just feels amazing.”

Populating the scene with props and characters

When it comes to populating the scene, Workman provides a large library of assets, drawing from 3D stock sites like Unreal Engine Marketplace, and building anything specialized himself. Everything is to scale, and materials react correctly to light, making it as easy as possible for users to assemble their sets. Intelligent placement is also built in, so that chairs sit on the ground, for example, and object manipulation is very simple.

In future releases, Workman wants to either offer a bridge to a tool like SketchUp, or provide simple modeling tools himself, so that art departments can also use Cine Tracer for set dressing.

Characters are another story.

Workman uses static 3D scans of people, including himself, and draws from widely available resources on the internet. He’s also used assets from Unreal Engine, including the Paragon collection, which Epic made freely available to developers, combined with assets from Mike Seymour’s own digital human project, MEET MIKE.

Further characters come from Reallusion Character Creator; these come with an IK rig, which Workman suggests can be animated using Moka Studio. IK rigs also enable users to custom-pose the characters, and Workman has put effort into simplifying what can be quite a complex process for non-animators, an area he intends to continue to focus on more in the future. He’s also interested in improving the facial posing capabilities with tools like Apple’s ARKit.

Supporting cinematographers’ platform of choice

Workman has early access to the new Mac Pro, coming this fall, and is seriously impressed. While the computer doesn't currently support RTX graphics cards with real-time ray tracing capabilities, it is significantly more powerful than existing macOS offerings, able to handle many more lights, larger textures, much higher-resolution meshes, and so on. And since—according to Workman—95% of directors, DPs, and art departments are Mac users, having a sufficiently performant macOS workstation to run Cine Tracer on is critical.

When it comes to hardware, another factor of interest is the controller. As well as working with a mouse and keyboard, Cine Tracer enables you to plug in a game controller, such as an Xbox or PS4 one. This makes it much easier for non-3D users to animate cameras, as Workman explains.

“Animating cameras is pretty hard in general,” he says. “There’s a lot of nuance in understanding how to animate rotations. What’s very easy and intuitive for most people is flying a drone and understanding how to get a certain path, or playing Fortnite, which is a strafe-based movement system where the left stick is movement and the right stick is turning. That’s pretty intuitive to most people.”

By supporting this kind of camera control in Cine Tracer, any director, creative director, or anyone else who wants to try moving the camera around can quickly learn how to do it, encouraging collaboration during the previs process.

Looking to the future

So what excites Workman about the road ahead for real-time technology? With the new Chaos physics and destruction engine coming soon to Unreal Engine, users will be able to add even more realism to their scenes, something he is looking forward to.

“I just have to kind of get the serious stuff out of the way and then we’ll be able to smash cars through walls,” he says. “That’ll be fun!”

As far as frame rates go, Workman already has more performance than he needs for his purposes. Unlike VR applications, which can require up to 90 fps, Cine Tracer only requires 24 or 30 fps, and Workman is looking to add a mode that caps the frame rate. In fact, for final-quality stills, the frame rate is almost irrelevant. Users frame their scene with ray tracing and other high-quality settings turned off, then turn everything on and take a still, which may take a few seconds. That's still blisteringly fast compared to offline rendering, and doesn't hamper the workflow.

Workman has a history of providing educational content on Cinematography Database, and envisions using Cine Tracer to produce more content in the future. It’s an area he’s had to put on hold while he’s been working full time on the product. Once Cine Tracer is more mature, and RTX graphics more accessible, he looks forward to getting back into that side of things.

“There is no better tool to illustrate on-set work, regarding lighting, because lighting is so hard to teach,” he says. “Even if you're on set in reality, showing people how to light, sometimes you need to be 30 feet above to really see what's happening. So it's really the perfect illustration and teaching tool.”

Focusing on performance and narrative

Cine Tracer can help the film production process, especially for live-action projects; by getting the technicalities out of the way before actual shooting begins, filmmakers can focus on getting the best performance from the actors, and ensuring the story comes through.

“That whole point of the communication of the story is why we go to films, why we see movies, why we play the games,” he says. “It's the narrative arc that we care about, but you're not doing the narrative arc justice if you film it poorly, if you stage it badly, and so you need to be able to do both. By doing something like Cine Tracer, you can have that experimentation time, you can really do whatever it is in your process to get the shots that you want.”

This podcast interview with Matt Workman is part of our Visual Disruptors series with media partner Epic Games. Visit the Epic Games Virtual Production hub for more podcasts, videos, articles, and insights.

FMX

Matt also spoke at FMX in Germany; you can watch his talk here.