The Mill’s incredible Blackbird at GDC: The future of virtual production

Filming cars was never going to be the same again after The Mill introduced the Blackbird. The rig is a truly cutting-edge production tool, and it addressed a fundamental problem: car availability on location. But when filming with the Blackbird, the problem was trying to make creative decisions without the actual target car visible on set…

Today at GDC in San Francisco, Epic Games and The Mill showed an incredible solution to this problem, one that has vast implications not only for virtual production, but for consumer car marketing in the future.

Epic Games and The Mill joined forces with Chevrolet to revolutionize virtual production with “The Human Race”, a live-rendered short film.

Combining an advanced implementation of Epic’s Unreal Engine with The Mill’s proprietary virtual production toolkit, Mill Cyclops, “The Human Race” merges real-time visual effects and live-action storytelling. The combined technologies were pushed beyond the limits of existing real-time rendering capabilities to produce a futuristic film that features the 2017 Chevrolet Camaro ZL1 in a heated race with the Chevrolet FNR autonomous concept car.

Live at GDC

The only physical vehicle filmed for “The Human Race” was the Mill Blackbird, a fully adjustable camera car that enables The Mill to insert almost any car model into any filmed environment.

During the shoot, a live video feed, as well as positional data from the Arraiy tracking system, were fed directly into Unreal Engine. The Camaro was then rendered and composited seamlessly into the footage in real-time, allowing the director and DOP to instantly see the final look and composition of each shot.
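At its heart, each frame is a standard ‘over’ composite of the rendered, matted car onto the live plate, just performed at frame rate rather than in a compositing package. As a minimal illustration of that operation only (not The Mill’s actual pipeline code), in numpy terms:

```python
import numpy as np

def over(fg_rgb, fg_alpha, bg_rgb):
    """Porter-Duff 'over': composite a premultiplied foreground onto a plate.

    fg_rgb   : (H, W, 3) float array, premultiplied by alpha (the CG car)
    fg_alpha : (H, W, 1) float array in [0, 1] (the car's matte)
    bg_rgb   : (H, W, 3) float array (the live-action background plate)
    """
    return fg_rgb + (1.0 - fg_alpha) * bg_rgb
```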

The same real-time technology was used to create, alter and produce the short film. The ability to create ‘final pixels’ in real time will ultimately change the way filmmakers create content and make critical decisions.

A live recreation of the AR visualization was also shown during Epic’s presentation, offering an up-close view of the real-time tech used to produce the project.

Wait, what?

The hardest thing to grasp about the GDC demo is what one is actually seeing. There are four major components or implications:

  1. The car TVC was being RENDERED LIVE in real time. The digital car that replaced the Blackbird was rendered into the scene, composited with effects and then played in real time. This was not a video or QuickTime; it was live ‘game play’. Unlike a game, one cannot fly around the car in real time, but only because the background was a filmed plate. Even so, this is very significant. The background plate was a professional video layer: this special version of UE4 handles multiple streams of uncompressed EXR images (1.3GB/s) in-engine (see the back-of-envelope bandwidth sketch after this list). The Epic engine was shown handling a fully professional compositing pipeline in real time.
  2. The whole thing works live on set. The significance is that virtual set technology until now worked without local lighting. In other words, vfx people might have been happy to see a virtual car on a set in the video split, but that game engine car overlay only responded to camera tracking – not the actual set lighting. Any DOP is therefore limited, as they cannot see a lit car on their set responding in realtime to their lighting decisions. Now they can. The Mill Cyclops system is a general purpose virtual production tool, which means that if the DOP moves a light, the virtual elements – a car or a character – immediately respond to that light change, all in real time.
  3. The system works on set with high speed cars, namely the Blackbird. The on-board systems of the Blackbird feed a video signal of a 360° Lat-Long high dynamic range image to the base station. This means the virtual car is lit by Image Based Lighting (IBL). You can watch in real time – wirelessly, at high quality – from the chase car that films the Blackbird.
  4. Epic have advanced their engine to make it an incredible tool for virtual production. The Unreal Engine is not only producing the car to match streaming live tracking data, it is also providing:
    • Reflection mapping of a complex, detailed CG car, thanks to multiple streams of uncompressed EXR images running through the UE4 Engine
    • Dynamic live IBL lighting (with special HDR tools to provide correct highlight ‘pings’)
    • Edge light wrap as part of a multi-element compositing pipeline
    • A full high resolution professional compositing pipeline with correct alpha and mattes
    • Depth of Field
    • Light Blooms with FFT convolutions (Fast Fourier Transforms for blooms and car pings)
    • Ambient occlusion
    • A new particle system:  prototype next generation Niagara particles and FX
    • Grain and noise
    • Motion blur
    • PCSS shadows with directional blur settings
    • Compatibility with Nvidia Quadro graphics cards
    • Support for Google Tango-enabled devices
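To put the quoted 1.3GB/s in perspective, here is a back-of-envelope calculation; the resolution, bit depth and stream count below are our assumptions, not published specs, but they show how quickly uncompressed image streams add up:

```python
# Rough bandwidth estimate for uncompressed EXR-style image streams.
# All values are illustrative assumptions, not The Mill's published specs.
width, height = 2048, 1080          # 2K frames
channels, bytes_per_channel = 4, 2  # RGBA at half-float (16-bit) precision
fps = 24
streams = 3                         # e.g. plate, matte and reflection map

bytes_per_frame = width * height * channels * bytes_per_channel  # ~17.7 MB
total = bytes_per_frame * fps * streams                          # bytes per second
print(f"{total / 1e9:.2f} GB/s")    # ~1.27 GB/s, the ballpark of the quoted figure
```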


What Does this Mean for Post?

It is tempting when watching this to ask: who needs post anymore? In reality, while the spot could be finished on location, many clients will still want to edit, review and polish any spot. The great thing about The Mill’s Epic pipeline is that all this data is captured for the highest quality use. The reflection map that is streamed live is also recorded for even higher quality use if needed; the elements can be composited live on set, or just as easily re-composited back at The Mill’s offices.

There is nothing about this demo that is baked in or that restricts the client. No key data is thrown away or compressed, and the on-set experience is vastly improved.

It is worth noting that this is the ‘Superbowl of Virtual Cinematography’ as it stands today. Much like the real Super Bowl, the players here are trained professionals. While all these new Epic UE4 features will be provided as part of the standard Unreal Engine, they are not there today. This project was done to raise the bar, and many new technologies had to be developed to bring it to life.

Similarly, while the Blackbird exists today, there is only one in the world and the team at The Mill has been working on this project on many fronts. In addition to the brilliant innovation in the Blackbird, the Cyclops system is being developed as a major general purpose tool for virtual production, but it is completely proprietary.

From Chevrolet’s standpoint, the GDC presentation offers an insight into future car marketing. As manufacturing becomes more customised and marketing more direct, it may soon be possible to order a configuration of a car from Chevrolet and see your car in the context of a TVC while still standing in the dealership, or even in your own living room.


How they did it: The BlackBird

The Blackbird is a reconfigurable car which can ‘double’ for any digital car. The system is powerful for three primary reasons:

  1. It is designed to take any wheels, so the correct wheels can be fitted and thus used in the final shot.
  2. The car can be adjusted in wheelbase length. Thus it can match a small or large car perfectly.
  3. The car is a camera car as well, capturing data and imagery that can be used to help light a CG car which will sit over the top of the Blackbird.

Kim Libreri, CTO, Epic Games, remembers when he first saw the Blackbird: he was stunned at the foresight and long term planning of The Mill in making such a useful tool. “I said to them – I can’t believe it! This is crazy! Does your management know you have done this? They are letting you build an electric car?… I was so excited and so proud of how out there they were in doing this .. it is just so inspiring”, he comments. Libreri was immediately struck by how the Unreal Engine could help directors see a virtual car while filming the background plates. This was 8 months ago, and he almost immediately suggested a GDC project.

Fxguide visited the Blackbird in LA. One of the most impressive features is the adjustable wheelbase, which allowed the Blackbird to mimic the exact wheelbase of the Camaro. Fully adjustable electric servo mechanisms allow the Blackbird to adjust its length and width to within a tenth of an inch to match any wheelbase and track.

The Blackbird has a 4 foot variance in wheelbase (length) and a 10 inch variance in wheel track (width); this covers everything from small hatchbacks such as a Chevy Sonic to mini-vans and even 4x4 crossover vehicles. It took just a few minutes to adapt the Blackbird to the wheelbase specification of the 2017 Chevrolet Camaro ZL1 or the Chevrolet FNR autonomous concept car.

Its modular design accurately mimics countless form factors, standing in for any car, existing or conceptual. Its powerful drive easily dishes out classic running shots and high performance precision. This makes the Blackbird a revolutionary VFX tool. The Blackbird becomes the heart of the Mill’s proprietary approach to digitally recreating any imaginable car, in all trim permutations, with never before seen accuracy and quality.


The Blackbird is designed to mimic the dimensions and driving characteristics of any real car chassis. It captures accurate reflection and environmental data allowing the virtual re-skinning of almost any car in CG over the original Blackbird’s modular frame.

On-board pneumatic jacks allow for quick pit-stop style replacement of wheels and rims from all manufacturers. The Blackbird is the same weight as an average car. Its fully adjustable suspension alters ride height, rigidity and damping, replicating the drive of almost any vehicle. The Blackbird is electric, making it green and versatile. It is powered by a motor which digitally records its movements and is programmable to match the acceleration curve and gearing shifts of any car.


While CG cars are commonplace, there are enormous advantages in having an actual car on location in terms of realism, interaction, tire marks, dust and actual motion captured reference. The Blackbird can be shot at any time, in any location. The car also generates modular assets. It allows for a singular creative vision to be extrapolated across multiple platforms, products and markets without sacrificing any quality.

Fxguide visited the home of the Blackbird at JemFX.

The Blackbird was built by premier Californian rig specialist JemFX in the very same hangar in which the SR-71 Blackbird supersonic jet was once manufactured; The Mill’s rig’s name is a nod to this legacy of stealth design. The Blackbird captures the best footage and data possible thanks to its camera array and tailor-made stabilisation unit, engineered by Performance Filmworks’ multi award-winning technician Lev Yevstratov (co-inventor of the Ultimate Arm gyro-stabilized camera crane car rig). The cameras and technology mounted on the Blackbird have been painstakingly realised, from concept to build, with the help of an elite team of the best camera technicians and experts.


The car is powered by an electric motor capable of around 80MPH and of covering approximately 120 miles on an overnight charge. The Blackbird can be charged during breaks in filming and also has regenerative braking that adds to battery life. As the electric motor has great torque, the Blackbird can accelerate quicker than most petrol cars, making it well suited to standing in for high performance cars such as the Chevrolet Camaro.

Fxguide’s Mike Seymour in the Blackbird, on location

The car is simple to drive, but for maximum efficiency it should have a precision driver. The Mill have a list of suggested drivers with Blackbird experience. Fxguide’s Mike Seymour is not one of them! (See right: actually, Mike is too tall and too big to drive the Blackbird comfortably; the car is designed for smaller, fitter professional drivers.)

There is only one Blackbird in the world, but as Alistair Thompson, The Mill Executive Vice President, International, and the guy leading the Blackbird team responsible for establishing new creative partnerships explained, “we are working now on Blackbird 2”. The plan is to look at having one Blackbird system in the USA and a second permanently located in Europe, most likely London.

Thompson points out that the whole Chevrolet GDC project opens up a wide range of potential future applications for this technology. Cars are increasingly made to order, and this technology gives consumers tremendous opportunities to see their actual configuration in real time. As part of the GDC demo, the team used a Google Tango-enabled Lenovo Phab 2 Pro to interactively view the digital car. The Mill, in cooperation with Chevrolet, built a test ‘car configurator’ program that can control the Unreal Engine wirelessly. The car configurator was written by Alan Willard.
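Neither Epic nor The Mill has published how the configurator drives the engine; conceptually, though, it only needs to send small state-change messages to the running UE4 instance over the network. A purely hypothetical sketch (the message schema, host and port are invented for illustration):

```python
import json
import socket

# Hypothetical configurator message; the real Mill/Chevrolet protocol is proprietary.
msg = {"model": "Camaro_ZL1", "paint": "red_hot", "wheels": "forged_20in"}

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# Assumed address and port of the machine running the Unreal Engine instance.
sock.sendto(json.dumps(msg).encode("utf-8"), ("192.168.1.50", 9000))
```

On the engine side, a listener would apply the change by swapping materials or meshes on the car actor, which is why an update can appear instantly on the split.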

“Find New Roads is more than just a tagline at Chevrolet. We embrace that mission in the advanced engineering of our cars, and also in the way we service and communicate with our customers,” said Sam Russell, General Director of Global Chevrolet Marketing. “’The Human Race’ brings a lot of those ideals together. The technology involved in producing this film provides a glimpse into the future of customer engagement and could play a unique role in how we showcase car model options with interactive technologies like AR and VR.”



How they did it: The CG Capture

The interactivity of configuring your own vehicle on set in 3D is the holy grail of automotive advertising. Shooting with the Blackbird, directors can view the digital car assets generated on set in real time via The Mill’s Cyclops software. This allows the client to choose the color, trim and model of the car in real time, and the DOP to frame up on the ‘digital’ car with the correct lighting and reflections visible immediately on the split.

While Cyclops is The Mill’s general purpose virtual production tool, it has been used in this GDC demo to provide the best on-set visual reference for their clients. The Mill have built an AR application in conjunction with Epic Games that transforms the Blackbird into a live and correctly lit version of the target car. The application works by tracking the movement of the Blackbird in real time and generating a high quality CG model that incorporates the actual reflections captured by the Blackbird.

A set of four 6K RED Dragon cameras is mounted on the Blackbird, each with an 8-15mm lens. The four images are stitched into a 360° view of the world using the company’s ‘Mill Stitch’ program. This worldview is transmitted live and wirelessly to the host UE4 system in the Edge camera car. Deanan DaSilva and Jim Geduldick of Wairual Labs consulted on the cinematography pipeline. The car’s worldview is a Lat-Long video stream that provides the key information the system needs to make the car look real. This image is vital, as it provides the reflection map that correctly reflects the world in the car’s paintwork and, more importantly, it is the source for the UE4 Image Based Lighting (IBL).
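To give a sense of what a Lat-Long stitch involves, here is a highly simplified sketch assuming an ideal equidistant fisheye lens model (r = f·θ); the real Mill Stitch solver handles calibrated lens distortion, camera rotation and seam blending, none of which is shown here:

```python
import numpy as np

def latlong_ray(u, v, out_w, out_h):
    """Convert a Lat-Long (equirectangular) output pixel to a unit 3D ray."""
    lon = (u / out_w) * 2.0 * np.pi - np.pi   # longitude in [-pi, pi]
    lat = np.pi / 2.0 - (v / out_h) * np.pi   # latitude in [pi/2, -pi/2]
    return np.array([np.cos(lat) * np.sin(lon),
                     np.sin(lat),
                     np.cos(lat) * np.cos(lon)])

def fisheye_pixel(ray, f, cx, cy):
    """Project a camera-space ray through an ideal equidistant fisheye (r = f * theta)."""
    theta = np.arccos(np.clip(ray[2], -1.0, 1.0))  # angle off the optical axis (+Z)
    phi = np.arctan2(ray[1], ray[0])
    r = f * theta
    return cx + r * np.cos(phi), cy + r * np.sin(phi)
```

For every output pixel, the stitcher rotates the ray into each camera’s frame, picks the camera that sees it best, samples that fisheye image at the projected position, and blends across the overlaps.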


The key to the ‘Mill Stitch’ workflow is that while it is being used at GDC to send an image to the UE4 engine, it can also be used later in post-production should additional shots be wanted. For example, the Mill Stitch software provides a source for a perfect driver POV: blended with a CG render of the interior of the car, it is possible to feel like the viewer is inside the vehicle and taking part in the shot.

Key Mill Stitch capabilities and stats on the Blackbird:

  • Live video ingest, lens de-warp and stitching across multiple cameras
  • The ability to scale to future camera and lensing configurations
  • The ability to accept external rotational and positional data to translate and composite on-set pre-visualization
  • Latency: an 8-14 frame delay from actual performance to stitched interactive output (roughly a third to just over half a second at 24fps)
  • Storage: four days of rehearsal and performance video capture and automation recorded
  • System uptime: 100% over 96 hours of runtime
  • Real-time interactive output and automation replay, which saved at the very least double each shoot day’s runtime in processing

How they did it: The UE4 & the Camaro ZL1

At GDC, when running live in front of the audience, Mill Stitch only stitches the world view and does not need to do any other immediate processing. All the rest of the processing is done on a reasonably high end Nvidia PC (2 processors, 20 cores).

The dynamic range of any stitched set of RED images will not have the full dynamic range needed for true HDR-based IBL. The team used a clever trick: they process the LDR Lat-Long image (which is already quite wide in dynamic range from the Dragon sensors) and push any clipped white values higher. Any clipped white in the Lat-Long was most probably the sun. This produced a very close approximation to a bracketed HDR (see the sketch below).

In addition to the Lat-Long IBL information, the process also needs real-time tracking. The Mill have been tracking cars successfully for many years, but for this virtual production model at GDC they needed a new real-time system that was extremely accurate and very fast. The team turned to a new startup, still in stealth mode: Arraiy. Formed by Ethan Rublee and computer vision legend Dr Gary Bradski, the startup is working on advanced tools for the visual effects industry. Part of Arraiy’s new endeavour is advanced SLAM, or Simultaneous Localization And Mapping: the computational problem of constructing a map of an unknown environment while tracking one’s position within it. The team at Arraiy were happy to help The Mill with a computer vision tracking solution that would provide both the camera position and the position of the Blackbird. Unlike other solutions, “when we tested it with a 5D mkIII we got submillimeter accuracy in real time,” explained Rublee, CEO of the Palo Alto startup.

Commenting on the real-time tracking system during testing, Kim Libreri stated, “it is the team behind OpenCV.. and the tracking: I have never seen dynamic tracking as good – loads of people talk about it but… This worked with crazy motion blur, sun highlights.. it was great”. Vince Baertsoen, Group CG Director of The Mill, agrees: “It was great working with Gary, he is a rockstar in the computer vision world”. The Mill has been working with Arraiy almost since the company started: “we have been working with them now for almost 6 months,.. and they provided this really strong monocular – single camera – solve,” he adds.
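The clipped-white trick described above can be sketched in a few lines of numpy. The threshold and boost factor here are illustrative guesses; the actual Mill/Epic processing was not published:

```python
import numpy as np

def expand_clipped_highlights(latlong, clip=0.99, boost=50.0):
    """Approximate an HDR environment map from a wide-latitude LDR stitch.

    Pixels at or near the clip point (on an exterior shoot, usually the sun)
    are pushed to a much higher radiance so that image-based lighting
    produces plausible speculars and 'pings' on the car paint.
    """
    hdr = latlong.astype(np.float64).copy()
    luma = hdr.mean(axis=-1)          # simple average as a luminance proxy
    hdr[luma >= clip] *= boost        # treat clipped whites as a very bright source
    return hdr
```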

To make sure the system would track reliably at high speed, car to car, even with motion blur and uncontrolled exterior lighting, Arraiy decided to add computer vision ‘fiducial markers’ to the Blackbird. “We wanted to make sure we could always track it; most of our work doesn’t assume special markers, but we had the chance to use them here and it made the system really reliable.” While it is known that Arraiy is using advanced deep learning neural network processes for some of their R&D work, “for this we took a more traditional computer vision approach,” Rublee said. “We were really happy to help out, it is a great project”.
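Arraiy’s solver itself is proprietary, but the general shape of marker-based pose tracking can be illustrated with OpenCV’s ArUco module (the classic opencv-contrib API is shown; the intrinsics and marker size are placeholders):

```python
import cv2
import numpy as np

# Placeholder camera intrinsics; a real setup uses calibrated values.
K = np.array([[1200.0, 0.0, 960.0],
              [0.0, 1200.0, 540.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)                  # assume negligible lens distortion
MARKER_SIZE_M = 0.20                # physical marker edge length, metres (assumed)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

frame = cv2.imread("chase_car_frame.png")        # one frame from the camera car
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
if ids is not None:
    rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
        corners, MARKER_SIZE_M, K, dist)
    # tvecs holds each marker's position in camera space; streaming poses like
    # this is what lets an engine place the virtual car relative to the lens.
```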

The data streams from the hero ARRI Alexa on the camera car into the Arraiy solver, and from there into the Unreal Engine.

The system integration and tracking at the Mill was supervised by Eric Renaud-Houde.

Epic has been building functionality into Sequencer over the last few years, turning it into a very powerful production tool, thanks to the work of Michael Gay, Max Chen and Max Preussner at Epic. This new version is now able to deal with OpenEXR files and will be released as part of a rollout of all the new Epic technology throughout the year.

The Blackbird is a VFX dream.

The film required extensive look dev, which was done at Epic by Francois Antoine, Minjie Wu and Min Oh.


The Blackbird has more capabilities than were needed for the GDC demo. For example, the Blackbird’s tailor-made camera stabilisation unit, as well as providing an adaptable head (designed to take all the leading cameras), is also built to incorporate a laser scanner. This scanner is similar to those used by autonomous, self-driving vehicles. It would allow the system to three-dimensionally record and replicate the terrain the car is driving through, which could be invaluable for re-creating complex CGI environments.

In the live-rendered short film The Human Race, the start of the film is a CG car rendered in real time over a high quality photographed plate. This footage was filmed from the camera car on an ARRI Alexa. Once the cars enter the tunnel, however, the cars and the environment are both CG, rendered in realtime by the Unreal Engine.

The lighting seen, especially in the tunnel, was handled by Andrew Harris. The rendering in the film has seen many core advances, spearheaded by Marcus Wassmer, Brian Karis and Guillaume Abadie. “Brian’s magical image based lighting code.. is very accurate: the metals look right, the illumination looks right, the diffuse looks right, and Brian made it so we could do diffuse and specular from the live sky light (IBL) Lat-Long,” comments Libreri. “You now have chromatic aberration, grain and noise working in the compositing. We have edge wrap, plus FFT blooms, so you can shoot a kernel on an SLR, say a flare, load it into the engine and it will work as a convolution kernel, so you get perfect photographic blooms,” he adds. “We also have screen space reflections between the layers, and Brian added ambient occlusion and reflection occlusion, so the techniques pioneered by the guys back at ILM are now effectively inside Unreal Engine.”
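The FFT blooms Libreri mentions are frequency-domain convolution: a photographed flare kernel is far too large to convolve directly in real time, but multiplying spectra makes it cheap. A minimal numpy version of the idea (the engine’s GPU implementation obviously differs):

```python
import numpy as np

def fft_bloom(channel, kernel):
    """Convolve one linear HDR channel with a photographed flare/bloom kernel.

    channel : (H, W) float array of linear HDR values
    kernel  : (kh, kw) float array, e.g. a flare shot on an SLR
    Zero-padding to the full linear-convolution size avoids wrap-around.
    """
    H, W = channel.shape
    kh, kw = kernel.shape
    fh, fw = H + kh - 1, W + kw - 1
    F = np.fft.rfft2(channel, s=(fh, fw))
    G = np.fft.rfft2(kernel / kernel.sum(), s=(fh, fw))  # energy-preserving kernel
    full = np.fft.irfft2(F * G, s=(fh, fw))
    # Crop back to the original frame, centred on the kernel's peak.
    return full[kh // 2: kh // 2 + H, kw // 2: kw // 2 + W]
```

In practice the convolution is applied per channel, and it is the bright expanded highlights in the HDR frame that drive the visible blooms.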

When compositing over live action plate photography the camera position is determined by the Arraiy tracking solution, but once inside the tunnel the camera is fully controlled by the artist. This means you could fly the camera in realtime around the action in the tunnel and view it from any angle, all interactively at full quality.

The film also has a new version of Epic’s particle system that will soon be added to the Unreal Engine. “We wanted to test the Niagara particle system in a real job and see how it runs, but it is not quite ready to be rolled out just yet”, commented Libreri.

How they did it: Cyclops

Mill Cyclops is a proprietary virtual production toolkit developed by The Mill that enables the company to render and integrate digital assets into film and augmented reality.

One of the key leaders of the Cyclops and Blackbird project is Boo Wong, Group Director of Emerging Technology for The Mill.

Wong has been at the forefront of The Mill’s Immersive and Interactive teams, which specialize in immersive experiences and drive The Mill to explore new technology. She is one of The Mill’s pioneering team responsible for launching Mill Lab, the area where Mill artists make, break, and experiment. Boo Wong was The Mill’s executive producer on The Human Race.

She has been key in exploring ways to deploy The Mill’s new technology, including Cyclops. Boo Wong was responsible for connecting The Mill with Arraiy and having them partner on the project.

Wong worked closely with Vince Baertsoen, Group CG Director, who has been key to the Blackbird from its inception. “In the future we hope to incorporate more data. We have the Lidar scanning working, but we are only currently using that in an offline mode. We also have GPS data, which we are currently working on for the next stage, to add a much more advanced set of data that we can send to the real time (Cyclops) solve,” Baertsoen explains. Shadow casting is a complex problem and is being further worked on by the team.

As a filmmaking tool, Cyclops blurs the lines between production and post. Directors are now able to work with finished-quality, photo-real digital assets, live on location. It will also mean that audiences in the future can effect change in films in ways previously unimagined whilst watching, giving them control over objects, characters and environments. This hybridization of film and gaming ushers in a new era of possibilities in creative storytelling – films you can play.

Cyclops has a wide range of future applications, including:

  • On-set previz
  • Real time VFX (CG and compositing)
  • Interactive films
  • Interactive VR
  • Augmented and mixed reality
  • Product visualization

Cyclops runs on the Epic Unreal Engine.

