The Terminator and Terminator 2: Judgment Day will likely always go down in filmmaking lore as pioneering a new wave of both practical and digital visual effects techniques. The newest addition to the franchise, Alan Taylor’s Terminator: Genisys, pays homage to the storylines of those two films in several clever nods. Matching the groundbreaking visuals of the earlier films was always going to be a tall order, but the task was made somewhat easier by the incredible advancements in effects technologies since the 80s and 90s, from crafting photorealistic human faces and skin, to liquid metal fluid sims and complicated compositing.
fxguide explores a sampling of the film’s practical and digital effects, overseen by visual effects supervisor Janek Sirrs, as we cover the work by MPC, Double Negative, ILM, One of Us, Method Studios and Legacy Effects.
See how MPC crafted a digital Arnie for Terminator: Genisys in this video made with our media partners at WIRED.
San Francisco suffers
On screen: The film opens as ‘Judgment Day’ nuclear missiles hit San Francisco, causing widespread destruction to the city.
VFX: ILM handled the sequence, under visual effects supervisor Grady Cofer. “It harks back to the original T1 and T2,” says Cofer. “Our sequence is kind of a re-telling of Judgment Day from T2, which is one of the great iconic scenes in VFX history. We experience that not in LA, like in the dream sequence in T2, but in San Francisco.”
The shots begin at a desert ICBM launch facility. “Janek Sirrs had shot some plates in the desert,” recounts Cofer, “and we animated the missiles in this two stage launch where it lifts up out of the silo and another thruster kicks in and they fly through the sky. We also witness the missiles from a commercial airliner and this little boy looking out the window. You start seeing these explosions out on the horizon. The idea there was we didn’t want to step on the explosion shots to come – we wanted things to build. It was like those ominous thunderstorms – when you fly coast to coast in the US you end up flying over a lot of storms, and you can look down through this layer of clouds and see all these glows.”
The action then moves to Dolores Park in San Francisco where a blast happens nearby, as parts of the city are engulfed by the explosions. “Janek shot on a beautiful clear day in San Francisco and shot some plates from Dolores Park,” says Cofer. “He also filmed another shot up Columbus Street towards the Transamerica Building, and another one in the financial district looking up California Street. The idea was that the ignition blast happens a little bit above the ground, so instead of the traditional mushroom you get this sphere explosion, then all of the force that goes down starts traveling outwards in a platter. As the platter is pushed through different streets it would turn into these fingers and start destroying buildings and flipping cars along the way.”
Those same locations were scanned with a FARO LIDAR scanner to produce around 12 spherical captures, which were combined with HDRI captures for aligned photography. ILM then modeled buildings and structures inside its Zeno platform. “We wired up everything for destruction and made sure it had interior frameworks so that there was a sense of resistance,” describes Cofer. “The ones close to the blast we almost incinerated. They were pulverized. As the blast irises outwards down the street, the destruction level falls off and we have signs and glass breaking off and roofs lifting off.”
In order to accurately portray the extent of the devastation, Sirrs directed ILM to a website called NUKEMAP. “It’s like Google Maps for nukes,” says Cofer. “You type in a location and a kiloton yield, how powerful you want the nuke to be, and it produces these concentric radii representing all the different levels of destruction around an actual nuclear blast. On the inside you have what essentially turns into a huge fireball, and then outside of that you get this pressure wave radii where things get blown from the pressure change, and then outside of that there’s heat that extends very far.”
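The concentric damage radii that tools like NUKEMAP produce follow the classic cube-root scaling law, where a damage band’s radius grows with the cube root of the yield. A minimal Python sketch illustrates the idea – the 1 kt reference radii below are assumed placeholder values for illustration, not calibrated figures:

```python
def blast_radius_km(yield_kt, radius_at_1kt_km):
    """Damage radii scale with the cube root of yield: R(W) = R1 * W**(1/3).
    radius_at_1kt_km is the radius of the same damage band for a 1 kt burst."""
    return radius_at_1kt_km * yield_kt ** (1.0 / 3.0)

# Illustrative (assumed) 1 kt reference radii for a few damage bands
bands_km = {"fireball": 0.1, "5 psi blast wave": 1.0, "thermal radiation": 1.2}

# Scale each band up to the film's 350 kt Downtown San Francisco scenario
radii_350kt = {name: blast_radius_km(350, r1) for name, r1 in bands_km.items()}
```

With this scaling, the 350 kt blast’s radii come out roughly seven times the 1 kt references – consistent with severe damage in the city proper but falling short of the Golden Gate Bridge, as Cofer describes below.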
ILM was also called upon to create Terminator vision shots for the film. “When you look back at T2 it was pretty utilitarian, militaristic – white text on red footage, a no frills representation of machine analysis,” says Cofer. “But it was very effective. The thought on this one was that we would stay in that spirit. We kept to a pretty rectangular design, not a lot of curves, and established this language of scanning and targeting that would help guide the audience’s eye to all the pertinent information in the frame. So we wanted to reference back to T2. But it was also important that we modernize certain aspects of it.”
“A lot was 2D After Effects graphic design – but they did want to update the scanning ability of the terminators. In the originals it would do a scan as a drawn outline. In our modernized version we almost did an MRI – we’d match-animate or get the animation and push a scan through the interior geometry and show a sparse set of particles as we push through it. So we get a point cloud volume representation. As it would push through that person, once it got half way through, it would ping an outline – a moment of recognition. Once it pinged there would be a call out box where it would start scanning and you would see it doing some kind of match. That was a very 3D component of a more 2D graphic design.”
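That MRI-style scan can be thought of as sweeping a plane through a point-cloud body and revealing only the points behind it, with a recognition ‘ping’ firing at the halfway mark. A toy Python sketch of the idea – function and variable names are our own, not MPC’s tooling:

```python
import numpy as np

def scan_reveal(points, t):
    """Sweep a scan plane through a point-cloud body along its depth (z) axis.
    points: (N, 3) array of positions; t in [0, 1] is scan progress.
    Returns the points revealed so far, plus whether the halfway
    'moment of recognition' ping has fired."""
    z = points[:, 2]
    z0, z1 = z.min(), z.max()
    threshold = z0 + t * (z1 - z0)   # current position of the scan plane
    revealed = points[z <= threshold]
    pinged = t >= 0.5                # outline pings once half the body is scanned
    return revealed, pinged
```

Driving `t` from 0 to 1 over the shot then gives the sparse particle build-up Cofer describes, with the call-out box triggered the frame `pinged` first goes true.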
“From that,” adds Cofer, “we measured a 350 kiloton explosion in Downtown San Francisco – it worked out that it doesn’t quite reach across the bay, so wouldn’t necessarily destroy the Golden Gate Bridge but would do a lot of damage in the city proper. What it allowed for was an additional shot just of the Golden Gate Bridge – it settles for a moment and you think all the destruction has passed, but then all of a sudden there’s an explosion right above the bay which incinerates all the water and sends a tidal wave out that destroys the bridge – that’s the finale on the Judgment Day sequence.”
The blasts themselves were realized with a combination of ILM’s proprietary Plume toolset and Houdini. “We’ve expanded Plume’s abilities and given it a lot more RAM to play with, so we can do very high resolution simulations relatively quickly,” explains Cofer. “We used it both for the nuclear explosion itself and also a lot of secondaries. We wanted a sense that all of the nuke facing surfaces were being incinerated – so anything like paint or dust immediately turns to smoke and starts wrapping around the buildings.”
ILM made reference, in particular, to the Judgment Day sequence in T2 in choreographing the blasts. “One of the things that always struck me about that sequence was the difference in speeds,” says Cofer. “In the wides you get this slow growth, and then when you cut in and they did those great table top miniature shots, when they turned on the air cannons things just blasted and you almost saw a soft body reaction to the buildings. We wanted to re-create the energy of that – there was a sense of things happening in realtime, that was kind of shocking.”
For the tidal wave shot, ILM produced an explosion above the San Francisco Bay that causes a secondary reaction of water beneath the Golden Gate, which hits the foreground cliffs and then destroys the bridge. “We did the Golden Gate sim in Zeno,” says Cofer, “where we could couple a bunch of physics sims together in one simulation. We were doing rigids, flesh sims as everything wiggled, then all of the cable sims – all of that was working together in a multi-physics way. We already had a Golden Gate Bridge model we could use and we modernized it a little bit for our pipeline and rigged it for cable destruction. It’s interesting how many times the Bridge has been destroyed – it definitely seems to be ripe for destroying in the movies!”
“When we attacked the destruction shots,” adds Cofer, “we did hope to render as much together at the same time as possible. Once we’d done all of the simulations for the buildings, the cars, the street – we brought all of that into Max and rendered it all together at one time in V-Ray with one V-Ray sun. It really unifies things – reflections and shadows – you get a lot by doing it all in one place. The trick was how much geometry we were pushing through those tools. Once we had done our destruction sims, we were approaching about a billion polys from cars and fractured buildings.”
A future war
On screen: In 2029, John Connor (Jason Clarke) and Kyle Reese (Jai Courtney) lead an assault on the location of Skynet’s time displacement device.
VFX: MPC helped create an array of Skynet robots and vehicles that resist the assault, including shots featuring Hunter-Killers, Spider-Tanks, and Endoskeletons based on reference and a practical version from Legacy Effects.
“The future battle was an epic scene,” says MPC visual effects supervisor Sheldon Stopsack. “It entailed an extensive amount of crowd work on the ground and in the sky. We had the troops invading and being fought back by Endoskeletons fighting each other, and we also introduced air to ground missiles taking things out.”
Principal photography in New Orleans featured practical sets that were extended by MPC. Legacy Effects also provided Endoskeleton reference on location; returning to the world of the Terminator was a significant moment for Legacy’s John Rosengrant, who had worked with Stan Winston on the first two films. “The production had an Endo design of where they wanted to go,” he says. “I thought it was a very nice update which stayed very faithful to it, but there were some new nuances that were clean innovations. We took those digital files and made them watertight and broke them down for making a real one. There were well over 300 parts, and we got them all ready for rapid prototyping. We printed out all the parts, cleaned them up, molded them and then made three Endoskeletons. One could be broken apart into various pieces for digital reference, and one was used for stunt-type work.”
Legacy Effects took advantage of major developments in crafting on-screen materials the studio had made from working on films such as Iron Man and its sequels. “We had materials I wish we would have had back in 1983 and 1990!” exclaims Rosengrant. “We’ve been developing different formulas of urethanes that are flexible, and paint systems we’ve been working with that replicate metal on rubber surfaces. They were mostly made of different types of urethane, with some fiberglass parts. It was a paint treatment. On the first two movies we had a vacuum metalizing process, which is a spray that goes on plastic, but once it starts to rub off you’re screwed as there’s nothing you can do about it. Then we moved into actually chrome plating things. But the downside of that is that it makes everything kind of heavy. Here we had a paint treatment that looks like chrome. It’s very durable, but if something happens to it you can touch it up.”
“The Legacy Endo,” adds Stopsack, “also held enough detail and accuracy for us to use as a guide when it came to the build of the digital version made at MPC. We took extensive photographs and approached it as if it were a classical digital character.”
To animate the digital Endos, MPC carried out a number of motion studies. “The question really becomes,” notes Stopsack, “what is it you are going for? A raw mechanical machine? Or a machine that is built to mimic human behaviour? In the end we settled on something that was in-between the two. Going fully humanoid didn’t feel natural for such a menacing creature, but going robotic and machine-like didn’t work either, as it didn’t leave us with a character people could ‘relate’ to – or I should say, could fear.”
The attacking troops face a daunting challenge against the powerful Hunter-Killers and the Spider-Tanks, which are actually dropped by the HKs. “We explored various options on how to fold the Spider-Tanks, but also fit them into the body of the HKs,” says Stopsack. “With both machines being a significant size, we wanted to avoid the sense of the HK being an empty hull once the Spider-Tank got deployed. With the design being based off the original Terminator movies, the material finish had to be rather slick. The concept stage determined a sort of brushed metal look. The challenge there is to maintain the sense of scale. A fairly uniform finish quickly turns the perception into something small. We pushed for a fairly high frequency level for the ‘brushed metal’ for a start. The difficulty there is to maintain this detail within the shot context when the HKs or Spider-Tanks are seen further away from camera. We combined the general material with a detailed level of imperfection. Mud, grime, scratches – all of these were textural features that helped to sell the scale and the menacing nature we wanted to go for.”
MPC’s in-house crowd tool, ALICE, served as the means to fill out the future war scenes on the ground and in the sky. “Crowd agents interacting on the ground level is something that ALICE was originally designed for,” states Stopsack. “Recent movies like Guardians of the Galaxy required us to prepare ALICE to deal with epic simulations in the sky. So we were more or less covered from a technological point of view. One difference is that we required a larger interaction between the two groups, which meant we had to spend a little bit of time extending the systems even further. Helicopters had to take out groups of Endo troops, while those interacted with other troops, all of this while HKs are attacking the helicopter.”
Inside the TDD
On screen: John and Kyle make it to the Time Displacement Device (TDD), but having witnessed Skynet send a T-800 back to 1984, Kyle volunteers to also travel through time to save Sarah Connor.
VFX: Double Negative crafted the TDD environment and the actual time travel effect for this future sequence, as well as incarnations of the device for 1984 and 2017. “The idea behind it is a blend between organic and inorganic,” describes DNeg visual effects supervisor Pete Bebb. “We wanted to show the power of what it takes to turn on one of these TDD devices, and bear in mind you have to keep a living entity within it without it getting completely destroyed. So we had almost a ‘birth of a star’ feel to it. A lot of arcing electricity and solar flares, but a similar palette to the original, with a cyan-blue feel.”
A practical TDD set-piece with rotating arms was enhanced and extended digitally as the machine gets up to speed. “Originally the premise was to have it incredibly violent and have huge arcs and lightning hitting off the side of the sphere,” says Bebb, “but inside it’s quite protected and serene. We needed to up it otherwise it wouldn’t feel like a painful device to go through.”
DNeg handled views of the time travel sphere arriving. For the later 2017 arrival, which occurs in the middle of a San Francisco highway, the studio augmented some plate photography. “We locked off a road at night in New Orleans,” recounts Bebb. “They built the device on a crane with huge strobe lights on it, dropped it down, hovered it roughly over the street and that gave the interactive lighting. We had a rough cutout of where the eight foot sphere would be, and then we put in our CG device.”
During Kyle’s travel through the TDD, he witnesses two timelines, one a happier journey in which he revisits alternate memories as a child, and then a more post-apocalyptic vision. One of Us treated footage to realize the disparate timeline views. “Most of the shots take place in a flashback,” says One of Us visual effects supervisor Bryan Jones. “It was warmly lit and a lot of it was shot high speed to give it a little bit more of a dreamy feel. Our brief was to not just do a color grading gag or something that could have been done in DI.”
“They wanted it analog and photographic and optical,” adds One of Us co-director Dominic Parker, “because those feel more in the realm of psychological. Otherwise it looks like you’re treating an image, rather than seeing an image through some mental process.”
To get that feeling, the studio referenced HDR landscape photography and optical bleeds and artifacts. “A lot of it was an aurora that surrounds the characters,” says Jones. “And a lot of it was displacement and roto-driven, and a lot of additive style exposure things to make it feel organic, rather than just a grade. We tried to steer away from the typical light wraps and glows. We used a lot of elements we could find for the light leaks and any sort of film gak stuff to embed in there.”
Nice night for a walk
On screen: The T-800 in the form of a 1984 Arnold Schwarzenegger arrives in Los Angeles, and the familiar scene from The Terminator starts playing out at Griffith Observatory, until an older – and re-programmed – version of the T-800 as ‘The Guardian’ suddenly interferes.
VFX: This twist in the tale required significant visual effects innovation, since Schwarzenegger – now 67 – would not have been able to play his younger self. The actor’s distinctive features were also something that could not necessarily be replicated with a body double and make-up effects. That left only a digital solution, with MPC entrusted to make a CG 1984 Schwarzenegger – itself an incredibly difficult task. “If we were creating a digi-double of an actor, today we would do a texture shoot and a reference shoot with that actor,” suggests MPC’s Stopsack. “We didn’t have that luxury, basically, because he had to be Arnie from 30 years ago. We couldn’t really plan for it in the classical sense – we had to think about it a little differently.”
“What it really came down to,” adds Stopsack, “was utilizing any reference footage or material we could get our hands on. We did extensive research on Arnold and his younger years around 1984. We looked for any video and imagery, and of course utilized mostly the original Terminator movie.”
That reference included, too, pieces that had been created by Stan Winston Studios for the 1984 film. “We had a head cast that was scanned digitally,” says Stopsack. “It gave us a starting point and something to cross-reference. But it is worth mentioning that the cast was not taken with digital scanning in mind, so the accuracy from the cast and scan was limited. It gave us an idea of proportions and how lean he was, but really it only gave us a basis.”
MPC launched into a photomodeling process based on footage and source material, limiting themselves to around 50 main key angles of the character from which to work. Interestingly, Stopsack notes that how artists remembered Arnold from 1984 was different to what he actually looked like then. “If you do a bit of research and look at the footage from 1984 you actually start realizing his appearance changed tremendously from year to year and month to month – it was very dependent on when the footage was taken. It was determined by whether he was in competition mode or working out. We had to be conscious of that and not be a slave to every picture we found of him in the 80s.”
A ZBrush sculpt served as the base build, with the mesh then retopologized. MPC approached the modeling from the inside out. “We did sculpt the model with an anatomically correct underlayer,” describes Stopsack. “It started with the skeletal base structure. Knowing that we were dealing with such an iconic figure, we paid close attention to the proportional measurements that we could find out there. We ended up with roughly one million polygons for the body, plus another 250,000 or 300,000 polygons for things like eyes and teeth and nails.”
Rigging was important too, especially to try and re-create Arnold’s specific muscle shapes. Says Stopsack: “The inside out approach was our desired methodology again, and we pushed the rigging tech to deal with volume preservation in our muscular system, but we did start deviating at some point. We pushed the approach to 80 or 90 per cent, but then came to a realization that if you do the anatomical approach – take it one step further by making everything volume preserving and start simulating skin on top of that – you actually find you can be limited. This is because you might rely too much on the simulation being 100 per cent accurate.”
“For a generic human being or human character or an ape, that would have been the right approach,” adds Stopsack. “However, with the task in mind of re-creating scenes from the original footage – which really required almost a pixel per pixel match – I think we reverted at that point. The underlying physical structure gets you that far, but maintaining a match like the 1:1 physical re-creation means you start applying correctors on top of it. So all we did was choose not to rely entirely on the physics of the actual simulations to get there – which would not have worked out at the level of accuracy we needed.”
Shots of the T-800 begin with significant close-ups, which meant MPC had to enable complex facial rigging and skin creation. “We fleshed out a spec of FACS shapes we wanted to have,” states Stopsack. “Those FACS shapes were basically selected with the target in mind and we would have Arnold himself capture these in a MOVA session for us, and be able to cross-reference them. What ended up happening is that we fleshed out these FACS shapes in our targeted base library, but we really did the first round of face shapes in a more traditional manner – we did face shape modeling for the face. Arnold wasn’t available straight away for a MOVA session, and it was also for us a way of getting a first set out quickly.”
Later, Schwarzenegger would complete a MOVA session to deliver FACS poses and dialogue from the scene. “We also had him capture more fight scene orientated facial performances – more action expressions,” says Stopsack. “These were for when he’s under more tension and stressed and needs a more ‘evil’ expression.” The poses were then re-targeted to the Young Arnold model and combined with the blendshapes.
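At its core, combining retargeted FACS poses with blendshapes is a weighted sum of per-vertex offsets applied over a neutral mesh. A minimal sketch of that idea – the data layout and shape names here are assumptions for illustration, not MPC’s actual rig:

```python
import numpy as np

def apply_blendshapes(base, shape_deltas, weights):
    """Blend a neutral face mesh with weighted FACS shape offsets.
    base: (V, 3) neutral vertex positions.
    shape_deltas: dict mapping shape name -> (V, 3) per-vertex offsets
                  (sculpted shape minus neutral).
    weights: dict mapping shape name -> activation weight for this frame."""
    mesh = base.copy()
    for name, w in weights.items():
        mesh = mesh + w * shape_deltas[name]   # shapes combine additively
    return mesh
```

Animating the per-frame weight curves (here the `weights` dict) is then what drives dialogue and the ‘action expression’ poses captured in the MOVA sessions.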
For the skin, MPC knew it could take advantage of Pixar RenderMan’s new RIS ray tracing rendering tools, developing a shader that used multi-layer scattering to represent the epidermal and subdermal layers, “discarding the more traditional dipole method we’d used in the past,” says Stopsack.
The team had also been following research done in the area of skin, particularly USC ICT’s micro-geo pore texturing. “Internally we did some data acquisition on our end with macro photography to try and extract skin detailing for various patches on the face,” recalls Stopsack, “and use that data that we acquired to make that part of our base layer. The methodology was probably more simplistic than ICT would do, but it was a good enough reference and level of detail for the character.”
A blood flow model was adopted, too, to display on the face and body. “It required a certain lag,” says Stopsack, “and that’s what we introduced into the blood flow. As the compression releases, it does take a certain amount of time, depending on how strong the compression was, to get the blood going back into the tissue. It was dynamically handled from rig to shader – it’s probably barely noticeable, but at the level we’re trying to do it was the subtlety we had to have to make it more convincing.”
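A lag of that kind can be modeled as a first-order relaxation: after compression releases, the blood level recovers toward its resting value with a time constant, so a stronger compression (a lower starting level) is visibly further from recovered at any given frame. A hedged sketch of the concept, not MPC’s actual rig-to-shader implementation:

```python
import math

def update_blood_level(current, resting, dt, tau):
    """First-order lag: after compression is released, the blood level
    relaxes back toward its resting value with time constant tau seconds."""
    return resting + (current - resting) * math.exp(-dt / tau)

# Example: skin blanched to 20% of its resting blood level by compression,
# recovering frame by frame over one second at 24 fps (tau is assumed)
level = 0.2
for _ in range(24):
    level = update_blood_level(level, 1.0, dt=1 / 24, tau=0.3)
```

Feeding `level` into the skin shader as a tint or scattering modulation then gives the subtle, delayed flush the rig drives.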
Eyes were also crafted anatomically correct. “We layered in the cornea, the iris and the connection of the meniscus and the wetness layer and how it connects to the eyelids,” explains Stopsack. “Our eye model had the refractive and caustic effect that you would see in a human eye, but that was not necessarily fully raytraced – we fell back to a more art directable approach of using a pre-computed caustic driven by the incidence of a light. There were two reasons for that – the benefit of doing a physically accurate approach was, yes, it’s got to be true because physics dictate this is what it should be. But doing it slightly the other way round, with a pre-computed map, gives you the benefit of having a little bit more artistic freedom.”
Young Arnold has 13 different hair systems – from head hair to eyebrows and peach fuzz – something achieved using MPC’s proprietary grooming tool Furtility. “The difference on this one was that with the RIS render method and a full path tracing approach,” says Stopsack, “not only does the shading method change to follow a more physical model, it also proved that with path tracing you have a lot fewer areas where you can cheat the appearance of the hair. That required us, from a grooming point of view, to be incredibly accurate – the clumping and curliness and loose and stray hairs had to be bang on. We ended up having a million individual curves and hairs scattered and distributed all the way around him. It also had to include a level of imperfection to bring the character to life.”
In fact, imperfections became a crucial addition in selling the believability of the character. “CG imagery tends to end you up in a perfect sterile world and it’s something we had to learn throughout the process,” admits Stopsack. “We had to go through the process of ‘fucking it up’ a little bit. We had to be brutal to make sure it wasn’t all perfect – the bumpiness or imperfections in the skin. That was an exercise in consciously stopping and realizing it was too perfect; then we would break it, and once it was broken we’d find a middle ground.”
During the fight, sections of the Endoskeleton on the Young Arnold T-800 are exposed. For those, MPC relied on its own Endo build and was also conscious of matching it to the practical puppet built by Legacy Effects.
Filming for the sequence took place in New Orleans mostly on a bluescreen stage. Stand-in performer Brett Azar, who had a muscular physique, acted as the Young Arnold. “It was great to have a stand-in and be able to put together an edit of the sequence they were happy with,” says Stopsack. HDRI coverage on the stage would later help with lighting, and set survey scans aided in extending the partial sets to fill out the Griffith Observatory environment. MPC also conducted surveys at the real location and then stitched imagery together for the backgrounds.
For the dramatic fight – in which the T-800 is finally disabled with a uranium bullet fired by Sarah – MPC would essentially replace Azar with their digital Young Arnold. Occasionally a digital double or face replacements for The Guardian were required, too. And Lola VFX handled several ‘youthification’ shots for Schwarzenegger.
Later, Kyle encounters the T-800 which has been brought back to life by the T-1000. He fires a weapon at the terminator, causing its skin to peel and flake off in a fiery manner. “We used an Arnold mesh to act as a guide for where and how to place flesh and skin,” explains Stopsack. “Once we were happy with the overall distribution and amount, we started simulating this geo. It was basically a cloth simulation from that point onwards. Different physics had to be applied to various parts. Small pieces were light enough to be picked up by the heat and turbulence caused by the fire. Larger pieces dropped and were more gravity influenced. This process required some creative adjustments. To avoid any trouble with the targeted PG-13 rating, we had to avoid the sense and look of raw pieces of meat coming off the Endoskeleton. We ended up going for a stronger charcoaled look and played the flakes generally a bit lighter.”
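The mass-dependent split Stopsack describes – light flakes picked up by the fire’s updraft, heavier pieces staying gravity-dominated – comes down to a simple force balance per simulated piece. A one-function sketch, with purely illustrative numbers:

```python
def flake_acceleration(mass_kg, updraft_force_n, gravity=9.81):
    """Net vertical acceleration of a skin flake (up is positive).
    The same updraft force dominates a light flake but barely affects
    a heavy chunk, which stays gravity-dominated and drops."""
    return updraft_force_n / mass_kg - gravity

light = flake_acceleration(0.001, 0.05)   # ~1 g flake: rises with the fire
heavy = flake_acceleration(0.5, 0.05)     # ~500 g piece: falls
```

In a cloth-sim setup the same idea appears as per-piece mass and field forces, so the solver naturally sorts pieces into those that swirl upward and those that drop.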
On screen: During the film, The Guardian’s various levels of combat result in progressive damage.
VFX: Legacy Effects worked on make-up designs and appliances for the stages of damage and looks that Arnold Schwarzenegger goes through in the film. “My lead designer Scott Patton was instrumental in designing the various stages of make-up for the Terminators – of which there were several,” recalls Rosengrant. “The design process was fun and unique from the standpoint that we had life casts through the years of Arnold and body scans through the years.”
The life casts were brought into the digital realm and combined with a CG model of the T-800 endoskeleton head. “That means we could actually import that digitally into the scans,” says Rosengrant, “and then peel back the layers and create the looks of a typical Terminator being destroyed – which was a far cry from how I did it on T2, which was with a pencil! We had these highly detailed rendered 3D sculpted designs of the various progressions of damage that hit the major beats of what each Terminator would go through. It was a fun way of designing all of this. We had all of the various incarnations of what we have done through the years of how we have torn him apart.”
“Part of the design process,” adds Rosengrant, “was that you had the good Terminators who were heroic, so the way you tear them apart in the face was say a little different than what we would do to the 1984 Arnold which is supposed to be bad. The good ones still have to be heroic even when they’re having their flesh torn away! Whereas with the bad one when it died, we were able to take its nose off and do some things to it that you wouldn’t do to Arnold when you’re trying to keep him heroic-looking.”
“One of the challenges was that they had stunt doubles of Arnold for say motorcycle riding or otherwise,” says Rosengrant. “We ended up making these very simple stunt masks that were based on Arnold – vacuformed – beautifully done Halloween masks of Arnold’s face for the stuntmen. There are times when it’s moving and in action in a medium or long shot it’s hard to tell the difference! Once you have the wig and mask on…the Terminator doesn’t make a heap of facial expressions! Those ended up being very helpful to both digital and stunts to have many Arnolds around on set.”
Thousands of silicone and prosthetic appliances crafted by Legacy accompanied the shoot – arranged for scenes where The Guardian would be shot or clipped. “We had a make-up team in New Orleans,” says Rosengrant. “They became the Terminator maestros. Some of the make-ups were little bits of partial make-ups with greenscreen painted in there, where the digital team would come in and put in the missing information or carve it in. This is something we started on T3 – the beauty of it is the digital hybrid of practical make-up and what you can take away digitally – it gives us the effect we were striving for in T1 and T2, but now you can do it all seamlessly.”
Legacy also built a prosthetic dummy of the 1983/84 T-800 (as used by MPC for some reference) that served as a prop for the damaged terminator wrapped in plastic held by Sarah and The Guardian. “That was an interesting process because what we did was start with a life cast of that time period and we started with a body mold that we had from T3 of Arnold,” explains Rosengrant. “Although that wasn’t Arnold in his T1 prime – which was just a few months removed from one of his last bodybuilding championships – at least it gave us the right proportions. So what we did was a clay press out of that body and we got a head sculpture based on his life cast from the time period. Then Jason Matthews, a lead artist and sculptor, sculpted on top of that body of Arnold a version of him as he looked in 1983. So we had all these pictures plastered up on the wall of Arnold’s last body building competition and whatever photos we had left – and photos shared by Matt Winston from Stan’s days. So we made a replica of Arnold from back in 1983.”
Knives, stabbing weapons
On screen: Also appearing in the alternative 1984 is a shapeshifting T-1000 (Lee Byung-hun) which attempts to re-acquire Sarah, Kyle and The Guardian.
VFX: DNeg took on the liquid metal effects required for the T-1000, which has the ability to replicate anything it touches, absorb bullet hits and produce sharp objects such as stabbing weapons – just like the character famously portrayed by Robert Patrick in T2. “The intention there was to respect the 1991 version, but back then of course they didn’t have all the amazing fluid solvers we have access to,” says Bebb. “There’s a reason it looked like what it did – because they literally sculpted everything based off of concepts approved by Jim Cameron, it was a lot of different blend shapes.”
Interestingly, DNeg did embark on fluid sim tests for the T-1000, but found that these could be difficult to art direct into the appropriate shapes the character conforms to. “You can stop a fluid sim half way through, though, and get really nice imagery or form or sculpt,” says Bebb. “The shot where he smashes through the cop car and forms on the bonnet – that’s all just pure sculpt – literally it was just like the T2 methodology for that similar helicopter shot.”
For shots in which the T-1000 was in liquid metal form, the filmmakers sought to properly represent the surrounding environment reflections. “We had a 6K RED camera with a 6mm lens on it that was shooting all of the environments,” explains Bebb. “We had 360 degrees of reflections for every shot, correct moving reflections – bespoke plates shot for just him – and also HDRIs.”
The T-1000’s blades were always computer generated, although DNeg modeled them from original Legacy Effects reference. One challenge with the blades was working out their appropriate size depending on the angle to camera. “There’s a sequence in the department store where he grows his blades directly into camera,” describes Bebb. “We must have done hundreds of versions of that because it didn’t look right for a while. Essentially what you have is foreshortening – even if you have the blades correct they just look like chicken drumsticks, they look stupid. So in that shot I think those blades look about 6 feet long!”
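The foreshortening problem Bebb describes can be illustrated with simple projection math. As a rough orthographic approximation (ignoring the perspective distortion from the blade tip being nearer the lens), the apparent length of a blade collapses with the sine of its angle to the camera axis – a hypothetical sketch of the geometry, not DNeg’s actual tooling:

```python
import math

def projected_length(true_length, angle_to_camera_deg):
    """Apparent on-screen length of a blade (orthographic approximation).

    90 degrees = blade seen side-on (full length visible);
    0 degrees  = blade pointing straight down the lens (no length visible).
    """
    return true_length * math.sin(math.radians(angle_to_camera_deg))

# A blade angled only 20 degrees off the lens axis reads at roughly a third
# of its true length - which is why geometry aimed at camera was made far
# longer than its side-on counterpart to read correctly.
```

This is why the blades in the into-camera shot could be modeled around 6 feet long and still read at a believable size on screen.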
Ultimately, Sarah and The Guardian lure the T-1000 to an acid-rigged sewer location and destroy the terminator. On set, Byung-hun mimed the disintegration, which DNeg then roto-mated, also relying on additional mocap. “It was really a ‘poor man’s mocap’,” states Bebb, “knowing that he was the T-1000 while doing it and he might be in a different state of demise, having lost a limb or a knee falling off.”
“We did his Endo arm as well for that sequence,” says Bebb, referring to the moment The Guardian pushes the T-1000 under the acid to reveal the mechanical arm. “The biggest thing for that was that the fanboys didn’t think it was big enough! But weirdly enough the Endo arm was built off a scan of his arm. It makes me think that the original Endo arm was far, far bigger. Essentially all you’re seeing is bone so it’s always going to be a little bit spindly – it’s got to fit in the hand.”
In Terminator 2, the T-1000 was destroyed in a vat of molten steel, ‘morphing’ between the various forms it had taken. For the acid bath sequence in this film, the filmmakers devised a new ‘twist’. “Yes we’ve seen it melt and blend away,” says Bebb, “but we wanted to give it a different state of material transition. This is not so much fire that’s blowing it up, it’s acid, so we wanted to get the burning in there. We looked at the way that acids burn metal and ingots of aluminum. It breaks down into liquid metal and then it starts to burn, so you get this phosphorous flame, a lot of smoke, then it almost dries it out like a pumice and becomes rigid and bloated. Then it shatters to dust. That was a massive R&D effort. At the end of that, we had to do almost complete creature sculpts because you think he’s dead, but he’s not. We looked at The Thing and the goriness of it where it goes after Sarah for the last couple of beats.”
Back to the top.
Not man, not machine: more
On screen: After time traveling to 2017 in the hope of once again resetting the future, Kyle and Sarah are shocked to see John, who ultimately is revealed as a ‘nano-fied’ terminator – a T-3000.
VFX: Realizing the ‘nano-fied’ look involved significant design, concept and R&D work from DNeg. “For want of a better word,” says Bebb, “the nano-fied version of John Connor is almost like an infection. Skynet can’t kill him so they decide the only way to do it is to control him. He’s replaced at a cellular level by nano-mites that take on his form. It looked a little bit like Jason Clarke, albeit a combat, stripped hybrid of the two of them. If you take the human form, it’s actually pretty well designed for what it does. If you were to take it up a notch to a combat level and take out all the fat, literally, then you get something which is far more streamlined and a pure thoroughbred of the two, what’s best of the organic and the inorganic side.”
The T-3000 is able to break down his mass, something achieved by controlling magnetic fields – an aspect that is revealed when he ventures too close to an MRI machine, and also used by The Guardian at the end of the film to successfully destroy him. The final nano-fied look was more matte than the metal of the other terminators. “The material we coined was a kind of ceramic carbon composite, with a slight iridescence to it,” outlines Bebb. “We gave it a slight tickle via the shader in different environments to make it look as good as it can, but essentially David and Alan wanted something that looked quite futuristic. We referenced the F-35 or the SR-71 – the more stealthy aircraft. It does have a slightly satin-like finish to it with a small specular kick.”
In its fully nano-fied state, the T-3000 was brought to life as a combination of witness camera capture of Clarke, keyframed animation and motion capture for the final fight towards the end of the film. “You’re essentially doing something which is a hybrid of Jason, so it didn’t need to exactly match him,” notes Bebb. “For the dialogue, we had a huge amount of tracking markers and witness cameras all shutter sync’d. But for the T-3000’s performance, we wanted to give it a slightly perfected look to bring in the machine side of what is the T-3000, otherwise it would just look like Jason walking. So every punch he does is a little bit harder, faster, straighter.”
Since the T-3000 had to be shown as an almost living mass of nano-mites, DNeg worked to give the terminator a constantly moving surface in Houdini. That setup had to also interact with fire sims for a shot of the T-3000 being blown up and emerging from a bunker. “The entire model is built in geometry,” says Bebb, “and then converted into hair so it could then be dynamically run. It’s about half a billion hairs. It has a pseudo heartbeat that runs through it like a pulse. There’s a different level of re-purposing mass to get that type of look when your muscles contract and expand. It’s almost a living/breathing CG thing, which you can key, but a lot was happening automatically in the background as an effects sim.”
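The ‘pseudo heartbeat’ Bebb mentions – a pulse propagating through half a billion hairs – can be sketched in miniature: a periodic, sharply decaying pulse modulates how far each hair root is displaced along its surface normal. This is a hypothetical illustration of the idea, not DNeg’s actual Houdini setup:

```python
import math

def heartbeat(t, bpm=60.0, sharpness=8.0):
    """Periodic pulse in (0, 1]: a sharp beat that decays within each cycle."""
    phase = (t * bpm / 60.0) % 1.0
    return math.exp(-sharpness * phase)

def displace_hair_roots(points, normals, t, amplitude=0.02):
    """Push each hair root along its unit surface normal, scaled by the pulse."""
    p = heartbeat(t)
    return [
        (x + nx * amplitude * p, y + ny * amplitude * p, z + nz * amplitude * p)
        for (x, y, z), (nx, ny, nz) in zip(points, normals)
    ]
```

In a production setup this kind of rule would run as one of many layered drivers over the hair system, alongside the keyed animation and the automatic background sims Bebb describes.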
“Janek said he wanted it to be ‘renderfarm busting’,” adds Bebb. “And it was! It was initially 20-odd hours to render per frame, so we had to strip it down and modify it somewhat to get it to work. On screen it’s an incredibly highly detailed model. It has blood flow, it has muscles, it breathes – and then as it becomes more combat-effective, it realizes it doesn’t need all those things anymore. We also dialed in the ability for each of these hairs to be driven by any number of different rules.”
At times the T-3000 is shown as part-nano-fied and part-Jason Clarke, including when he is affected by the MRI scanner. “He’s essentially being pulled apart and stripped of his layers,” explains Bebb. “That was designed based on sound as well. We could plug the sound design into our model and it could all react accordingly. That’s what you see as the layers are getting stripped out – every little bit being pulled and pushed and drawn out is done dynamically through sound.”
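Driving effects from sound design is conceptually simple: extract an amplitude envelope from the audio track and map it to a simulation parameter. A minimal hypothetical sketch of that mapping (the function names and threshold scheme are illustrative, not DNeg’s pipeline):

```python
import math

def amplitude_envelope(samples, window=4):
    """Crude RMS envelope: one loudness value per window of audio samples."""
    return [
        math.sqrt(sum(s * s for s in samples[i:i + window]) / window)
        for i in range(0, len(samples) - window + 1, window)
    ]

def strip_strength(envelope, layer_threshold):
    """A layer starts peeling once the sound envelope exceeds its threshold;
    louder moments pull the layer away harder."""
    return [max(0.0, e - layer_threshold) for e in envelope]
```

Giving each anatomical layer its own threshold would stagger the stripping, so the loudest hits in the sound design peel the deepest layers, in the spirit of what Bebb describes.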
DNeg was occasionally required to produce a digital double of the actor. For that (and for other digi-double performances), the studio relied on its on-set Photobooth setup. “We took the actors’ photographs on set, then did all the FACS poses, expressions as well, and took the costumes and scanned them,” describes Bebb. “This time around we did pretty much all the FACS expressions, over 140 on each – Jason in particular. And we did polarized and diffuse lighting. All the characters were then dev’d in PRMan and Houdini because we needed them in different setups.”
Several moments called for the T-3000 to merge into John Connor form, and back again. “Every time you get a re-form, he’s re-purposing material, his skin and flesh and clothes, and that’s what you see,” states Bebb. “We gave it a slight iridescence for the camouflage side of things. We looked at nature, say the cuttlefish, and how all the different pores and apertures close and they find the color, but not straight away – there’s a little bit of mimicking the environment around it that’s not quite right. So we had multiple passes that we could control to allow us to rebuild those layers instead of just a straight blend like in T2.”
Back to the top.
Golden Gate grind
On screen: Sarah, Kyle and The Guardian take to a school bus on the Golden Gate Bridge, but are pursued by the T-3000.
VFX: DNeg combined plates filmed at the real location in San Francisco and on a 500 foot bridge set in New Orleans, with digitally-produced bridge and bay environments, and helped stage scenes in which The Guardian is flung from the bus into a police car, and where the T-3000 causes the bus to flip in mid-air. “We tried to keep as much of the principal photography as we could, because they really did flip a bus,” notes Bebb. “But we also modeled it in CG.”
“For the environments,” continues Bebb, “we went to the Golden Gate and scanned it and took pictures, and built a pretty decent model of the entire bridge to use. And then we took that piece – the 500 foot section from New Orleans – and retro-fitted that into our Golden Gate Bridge. The GG sequence actually grew quite substantially, and that meant we had to do a bunch of car crash activity digitally.”
Back to the top.
Helicopter pursuit
On screen: Remanded in custody, Sarah, Kyle and The Guardian escape the police headquarters via helicopter on their way to Cyberdyne but are once again pursued by the T-3000, also in a chopper.
VFX: DNeg delivered the helicopter chase, which included several dramatic stunts – a fuel tanker explosion, The Guardian launching himself as a ‘human missile’ against the T-3000’s craft and a final crash at Cyberdyne. “We spent a month after main unit wrapped in San Francisco and shot plates for the sequence,” says Bebb. “We had a film unit for VFX plates, and then we also did the usual thing where we scanned, photographed and scissor-lifted, so we had a CG model of San Francisco. Most of our helicopter battle is street level, and maybe 10 to 12 storeys up.”
Back to the top.
A new Skynet
On screen: It transpires that a computer operating system known as ‘Genisys’ from Cyberdyne is actually the precursor to Skynet, so our heroes look to blow it up, along with the T-3000.
VFX: Skynet is represented in its earlier stages as a child hologram-like light sculpture that enjoys a rapid evolution into the form of the T-5000 (Matt Smith), who had earlier attacked John in 2029 as Kyle entered the TDD. The light sculptures were effects crafted by DNeg, as was a battle between The Guardian and the T-3000, and the destruction of Cyberdyne’s headquarters.
“Skynet starts off as a child and then you see this growth all the way through the scene in the third act,” says Bebb. “It’s not quite a hologram because it’s not really projected. It’s more like lasers that are ionized air, like a light sculpture. We made that with particles and fluid solvers. There were huge amounts of particulates that all had various dynamically driven movement, but it had to look like Matt Smith so we had to be pretty close to the skin in order to read his performance.”
Following a fight in the maze of servers at Cyberdyne, The Guardian battles the T-3000 amongst a partially finished TDD, which also produces a magnetic field. During the confrontation, The Guardian sustains significant damage to his body and face – Legacy Effects worked on prosthetics throughout the film, while DNeg also used witness cameras and tracking dots to show pieces of Endoskeleton coming through. The unstable TDD blows up the T-3000 and the Cyberdyne building (The Guardian survives by falling into a vat of mimetic polyalloy liquid and emerges with liquid metal upgrades).
Exterior views of the Cyberdyne explosion featured plates of the Oracle headquarters. “We went out early in April and did a photo scanning shoot of the Oracle offices and surrounding area,” recounts Bebb. “Main unit went in July/August for shooting and we did more scanning once it was dressed. But we knew it had to be built for destruction, so we did. The sims were so big they had a week turnaround.”
Back to the top.
All images and clips copyright 2015 Paramount Pictures.