Get to the room early – it’s a lesson you quickly learn at FMX. On Day 3, presentations by ILM and Double Negative, among many others, were full houses, often 30 minutes before they began. But lining up is worth it for exclusive info and breakdowns – and, in the case of ILM, a realtime-rendered demo of an iPad-controlled Star Wars scene. Here’s our wrap-up of Day 3 at FMX in Stuttgart.
The Force of Virtual Production at ILM
ILM Visual Effects Supervisor & Head of New Media Rob Bredow gave a very excited audience a real treat – an insider’s view on what ILM and Lucasfilm have been doing in the realtime space, including a mind-blowing Star Wars-themed realtime iPad controlled demo experience that was shown for the first time outside of Lucasfilm.
First up we saw a hint of ILM’s collaboration with Felix & Paul on the Jurassic World VR experience, plus an outline of how virtual production techniques have pervaded much of ILM’s filmmaking in recent years. This, said Bredow, has been furthered by the Advanced Development Group at Lucasfilm – who are ‘transforming entertainment with realtime graphics’.
One of their initiatives is realtime lookdev in MARI utilizing a ‘Shared Asset spec’ that allows the film, games and interactive parts of the company to create layered assets quickly and share them. The process is entirely procedural, so changes appear in realtime as an artist builds up layers. The example we saw being built was a CG speederbike.
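Conceptually, layered procedural lookdev treats a material as an ordered stack of layers blended by masks; because every layer is a procedure rather than baked pixels, re-evaluating the stack after any edit yields an immediate update. Here is a minimal toy sketch of that idea in Python – the checker pattern and two-layer setup are our own illustration, not ILM’s actual Shared Asset spec:

```python
import numpy as np

def checker(u, v, scale=8):
    # a simple procedural checker mask, values in {0, 1}
    return (np.floor(u * scale) + np.floor(v * scale)) % 2

def blend(base, layer, mask):
    # composite one layer over the stack using its mask
    return base * (1 - mask) + layer * mask

# evaluate a tiny two-layer stack over a UV grid
u, v = np.meshgrid(np.linspace(0, 1, 4), np.linspace(0, 1, 4))
base = np.full_like(u, 0.2)              # flat base coat
result = blend(base, 0.8, checker(u, v)) # brighter layer, checker-masked
```

Editing any layer (say, the checker `scale`) only requires re-running the stack, which is what makes a procedural pipeline naturally realtime-friendly.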
Next up was V-scout, ILM’s virtual set scouting tool. “What if you could scout your sets before you built them?” asked Bredow. Using a simple interface on an iPad, the tool is designed for cinematographers, art directors or anyone on a film to review locations or digital assets. It can take in models from SketchUp or other programs, as well as real locations, which are then explored via the iPad. It also works with VR glasses. On the iPad a user taps or drags to explore the sets and play with real lens settings, depth of field and so on. Bredow says V-scout was shown to a big-name director via VR goggles; at first the director was interested in the VR itself, but within 30 seconds he began discussing the asset – a ship – as if it were a real creation, without mentioning the VR at all. “It looks so much better in real life,” Bredow says the director remarked, even though he was looking at a virtual creation.
V-scout lets you bookmark sections of the virtual location, grab thumbnails, email production and spread notes around. Bredow says it is likely to help decide what to build practically versus digitally. Currently it needs a computer and wireless network to run – units have already been distributed around the world on ILM shows.
Then came the demo that wowed the crowd – a live realtime-rendered Star Wars Tatooine-esque scene featuring C-3PO, R2-D2, Stormtroopers and Boba Fett. As the scene played out, Bredow used an iPad to control views – he could pause the scene at any time and look around anywhere. He could zoom in and out as the scene played, make different camera moves and go slow-mo. He could follow, say, one Stormtrooper from any height and even enter their helmet for a POV shot – all while the scene displayed highly photorealistic (albeit not final) textures.
Remember, this was rendering in realtime. While it was not rendering directly on the iPad – a separate local PC was doing the heavy lifting and streaming to the device – Bredow says on-device rendering might be possible in the near future. The demo uses a customized version of the Frostbite game engine; ILM uses multiple engines for this kind of work.
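The split Bredow describes – heavy rendering on a local PC, lightweight interaction on the tablet – is a classic thin-client pattern: the tablet sends small camera commands, the server applies them and streams back rendered frames. A hypothetical sketch of the server side of such a control loop (the class and message names are ours; ILM’s actual protocol is not public):

```python
import json

class RenderServer:
    """Stands in for the PC doing the heavy rendering."""

    def __init__(self):
        self.camera = {"pos": [0.0, 0.0, 0.0], "speed": 1.0}

    def handle(self, message):
        # apply a camera command sent from the tablet
        cmd = json.loads(message)
        if cmd["type"] == "move":
            self.camera["pos"] = cmd["pos"]
        elif cmd["type"] == "slowmo":
            self.camera["speed"] = cmd["factor"]
        # a real server would render and stream an encoded frame here;
        # we just return the state the next frame would be rendered from
        return dict(self.camera)

server = RenderServer()
state = server.handle(json.dumps({"type": "move", "pos": [1.0, 2.0, 3.0]}))
```

Keeping the messages tiny (camera state, not pixels) is what lets the interactive feel survive the network hop while all GPU work stays on the workstation.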
As the demo continued, Bredow explored previously unseen parts of the scene, such as the action taking place inside a house with 3PO, R2 and a Princess Leia hologram. Further controls allowed Bredow to change the weather by adding a dust storm. There’s a master story built into the animation, but some tweaks can currently be made. One such tweak ILM had incorporated into the scene was the ability to enter an X-Wing cockpit and use a button as a trigger to fire lasers. Very cool.
At one point, Bredow posited, “Imagine a future where a scene cut from a film is used in this way. We could leverage some of the deleted scenes and maybe experience them in a completely new way.” One of those ways might be as a VR experience, which ILM has already tested. We saw how they hooked the scene up to VR goggles and had a user piloting a craft through the scene. “It turns out that’s super fun to do,” said Bredow. It certainly looked it.
40 Years of Industrial Light & Magic
Following Rob Bredow’s demo was a huge presentation celebrating 40 years of ILM featuring CCO and VFX supe John Knoll, former ILM Modelshop supervisor Lorne Peterson and VFX supe Richard Bluff.
It’s near-impossible to relate all the incredible clips, images and VFX breakdowns the ILM panel showed, but suffice to say it was jaw-dropping. We also got the lowdown from Knoll on why George Lucas wanted to start ILM in the first place – moving the camera to create dynamic space battles for Star Wars, something that had not really been possible with miniatures in previous films and TV shows. John Dykstra’s original computer-controlled motion control camera rig – made for a Marin County film – was shown in some footage. This is what was adapted to help make Star Wars’ groundbreaking VFX, solving movement, depth of field and model detail.
Peterson recounted his entry into ILM when he was called upon to work on the Death Star miniature and had a radical technique for speeding up production. Previously the model team was using epoxy glue that took time to dry. Peterson came into the shop with superglue instead. He put a dot of glue on a pencil and placed it on the edge of a table – where it stuck. When ILM crew members saw that result, there was an audible gasp. “That changed everything,” said Peterson.
The modelmaker was also called upon to be an extra appearing in the scout tower at the rebel base in Episode IV – a scene shot in Guatemala. “It was a bit precarious up there and I didn’t have any children so they chose me to climb in.” Peterson’s other classic Star Wars story was the search for reference for Tauntaun-like fur and skin for The Empire Strikes Back. “We tried to find the hair of unborn cat skin – some people thought we were on a demonic trip!”
Knoll related stories about ILM’s foray into computer graphics and digital compositing (“The greatest thing to happen in visual effects,” said Knoll). There were literally cheers in the audience when Knoll put up a title slide that read “Terminator 2”. This of course was a landmark film for ILM, following on the heels of The Abyss and ushering in further CG and compositing work used on Jurassic Park. Perhaps one of the most remarkable but not often discussed visual effects techniques used on T2 was camera projection, including for the shot of the T-1000 going through the metal bars. Camera projections had also been used for matte paintings in Hook, then on Mission: Impossible and in a big way in The Phantom Menace.
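Camera projection works by re-projecting a still image or matte painting through the original camera onto simple stand-in geometry, so a new camera can move around it with correct parallax. The core math per vertex is just a pinhole projection and perspective divide; a toy sketch (the camera matrix values here are illustrative, not from any production):

```python
import numpy as np

def project_uv(point, cam_matrix):
    """Project a 3D point through a 3x4 camera matrix to texture coords."""
    p = cam_matrix @ np.append(point, 1.0)  # homogeneous projection
    return p[:2] / p[2]                     # perspective divide

# a hypothetical unit-focal-length pinhole projector
K = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])

# where on the projected painting does this geometry point sample from?
uv = project_uv(np.array([2.0, 4.0, 4.0]), K)
```

Each vertex of the stand-in geometry looks up its texture at the UV the original camera would have seen, which is why the trick holds up as long as the new camera doesn’t stray too far from the projection camera.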
Bluff showed some remarkable VFX breakdowns evidencing ILM’s seamless and invisible environment work from Mission: Impossible 3, Avatar, Transformers, The Avengers and The Lone Ranger. On Avatar, Bluff sought out a new solution for digital foliage for the huge opening shot across Pandora’s trees – this was SpeedTree, previously only a gaming foliage generator. Since then, ILM has sought out innovative and manageable solutions for delivering and augmenting environments.
Interstellar – Double Negative
Academy Award-winning visual effects supervisor Paul Franklin of DNeg gave a spectacular talk on the practical/miniature/special/visual effects of Interstellar – and on their scientific and practical basis.
He began by distinguishing between special effects and visual effects:
– Special effects are made by big burly men who drink beer and like blowing stuff up!
– Visual effects equals…lattes.
Interstellar was a real exercise in ‘how much can we get in camera’, said Franklin. And of course a lot was, often then brought together in comp by DNeg and aided by wormhole/blackhole renders, digital water sims and environment augmentations.
Interestingly, director Chris Nolan wanted to go for the messiness of space. Example images of Buzz Aldrin selfies, showing blown-out backgrounds and harsh sunlight, were used directly by DNeg as reference – and by New Deal Studios in photographing its miniatures, too.
Even the practical effects preserved that gritty feel. A full-sized Ranger ship was taken to Iceland and filmed, and then when it was brought back to a studio in LA it was left deliberately dirty – in some cases you could see the boot marks from the grips.
We saw breakdowns of the water world, how TARS and CASE were puppeteered (“making a stepladder dance”) and also done in CG, and how Icelandic environments were in some cases only minimally augmented to become Mann’s planet. One of the coolest behind-the-scenes videos was the IMAX plate of storm clouds in Louisiana, shot from the nose cone of a Learjet – these clouds were featured directly in the film, augmented only with subtle dirt textures. Another, of course, was the tesseract sequence – a practical set plus a DNeg design and compositing solution that evoked slit-scan lines and Einstein’s theories of every object in the universe leaving a trail of matter.
Light Fields for Virtual Reality – Paul Debevec
VR talks continued today, and one to watch was from USC ICT’s chief visual officer Paul Debevec. Here he looked at how light fields have been used to capture real-world environments to produce photorealistic navigable VR experiences. Debevec showed OTOY’s solution, which you can see in the video below. We will certainly be showing more of this on fxguide and fxphd in the coming weeks.
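At its simplest, a light field stores many photographs taken from a grid of positions; a novel viewpoint inside the capture volume can then be approximated by blending the nearest captured views, weighted by distance. A toy sketch of that interpolation idea – this is our own simplified illustration, not OTOY’s actual pipeline:

```python
import numpy as np

def synthesize_view(positions, images, query):
    """Blend captured images by inverse distance to the query position."""
    d = np.linalg.norm(positions - query, axis=1)
    if np.any(d == 0):                       # query sits on a captured view
        return images[int(np.argmin(d))]
    w = 1.0 / d                              # nearer views count more
    w /= w.sum()
    return np.tensordot(w, images, axes=1)   # weighted sum of views

# two captured "images" (single grayscale pixels) at x=0 and x=1
positions = np.array([[0.0], [1.0]])
images = np.array([[0.0], [1.0]])
view = synthesize_view(positions, images, np.array([0.25]))
```

Real light-field renderers add per-ray reprojection and depth reasoning on top of this, which is what makes the captured environment navigable rather than a fixed panorama.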