What a year for Wētā FX – the company has nominees in three of the five films nominated for Best Visual Effects. We spoke to Wētā VFX supervisors from all three films on the morning they were nominated.
This morning in LA the nominations were announced, and fxguide would like to congratulate all the teams of artists who worked on each of these five great films, and of course the nominees themselves.
Visual Effects
All Quiet on the Western Front
Frank Petzold, Viktor Müller, Markus Frank, and Kamil Jafar
Avatar: The Way of Water
Joe Letteri, Richard Baneham, Eric Saindon, and Daniel Barrett
The Batman
Dan Lemmon, Russell Earl, Anders Langlands, and Dominic Tuohy
Black Panther: Wakanda Forever
Geoffrey Baumann, Craig Hammack, R. Christopher White, and Dan Sudick
Top Gun: Maverick
Ryan Tudhope, Seth Hill, Bryan Litson, and Scott R. Fisher
It certainly has been a good morning for Wētā in the Land of the Long White Cloud (NZ). The 95th Oscars is a particularly strong year for Wētā, with six of the 20 Visual Effects nominees coming from the company. For Avatar alone, this is Joe Letteri’s 12th Oscar nomination (!), Eric Saindon’s 3rd, and Daniel Barrett’s 4th. Also for Wētā, Chris White received his 3rd nomination for Black Panther: Wakanda Forever, and Anders Langlands received his 3rd for The Batman.
Avatar: The Way of Water – with Eric Saindon
We spoke first to Eric Saindon, nominated for Avatar: The Way of Water. Eric spoke to us from a boat, fishing with his son in Wellington harbor just after sunrise. Commenting on the company’s strong showing this year, Eric remarked that “it is pretty remarkable to be able to work at a facility that can achieve this. It seems like every year we get to work on more of these great projects… and work in such a way that we don’t just push them out, but in a way that we can take great pride in our work.”

Avatar: The Way of Water is clearly a sequel, which presents a real challenge: visually connecting with the first film while also being new and different. The first Avatar won the Oscar in 2010, so its visuals were generated with the technology of an entirely different era. Eric feels one of the big differences in the new film is how many characters give faithful and nuanced performances this time around. “In the first Avatar, we were able to put some really great performances on Zoe and Sam, the main characters. You felt for them and they had all this extra detail and character,” he explains. “But on this new film, with all the new technology, we’re able to enhance that, to get even more out of their performances, but now all the background characters also have great performances – and I think that makes a big difference.”

Eric enjoyed the Oscar Bake-off in LA, but the scope of the work on Avatar meant that it was hard to cover all of the innovations in Avatar: The Way of Water. On the day of the Bake-off presentations, Joe Letteri pointed to several, such as the new facial system (see our fxguide story), underwater performance capture, the new water system (and how it works and drips over skin), and on-set real-time performance preview with depth. “It was funny because all the different films were talking about the various different technical things they did. And I was sitting there with Joe and we were like, ‘oh yeah, we did that too. We forgot about that’.” Two examples were the use of LED volumes and the new wire/eyeline system.
Eyeline system
The new Wētā eyeline system sought to improve on the old ‘tennis ball on a stick’ trick. As Pandora’s natives are much taller than the actors portraying them, any live-action actor would need to look much higher in shot, over the head of a fellow actor, to provide a believable eyeline. To solve this, the team took the animation information of where the digital character’s head would be and, using a cable system not unlike the SpiderCam flown over live football coverage, hung a small dynamic monitor showing the performance of the second actor. In other words, the live-action actor would look up at a small floating monitor playing the other actor’s performance, and the monitor moved correctly around the set in real time thanks to an elaborate computer-controlled wire rig. “It was like the camera at an NFL game – the one that flies by wires over the top of the game,” Eric explained. “Ryan Champney (Lightstorm Entertainment (LEI)) and Casey Schatz (The Third Floor) basically built this system that would take the motion from one of the characters and plug that into the eyeline system.” The system takes the head moves from a digital character and places a little monitor hanging in the right place. “So when Jack Champion was acting on set, he was able to act to this eyeline system. It gave him something that would talk back to him, move to the right location, give him a proper sense of space and where he should be looking, plus the timing for his lines.”

“The eyeline system is a modified Superacam that carried a small screen and Bluetooth speaker,” Casey explains. “LEI’s Ryan Champney did all the engineering and software modification to make it film-set ready, including the video/audio streaming, remote triggering, and adding a servo motor to give us an additional yaw axis for the actor to look around.” The Third Floor’s Casey Schatz did the on-set integration by constraining the monitor rig to the mocap animation and exporting this to the system. “The trickiest part was finding an arrangement of the 4 pick points so that the wires did not sweep through any of the set or lighting gear.” Much of this was accomplished by repeated LIDAR scans of the set during pre-light, in close collaboration with the film’s gaffer Len Levine and key grip Jay Munro. “When it was done though, it was the next best thing to having a real Na’vi with us on location,” Casey adds. “The monitor moved in 3D space along with the audio and video from the mocap session.”
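Neither LEI nor The Third Floor has published the rig’s code, but the core constraint Casey describes (drive a wire-flown monitor to a mocap head position) reduces to simple geometry: given the four pick points, how much cable must each winch pay out per frame? Below is a minimal Python sketch; the pick-point values, function names, and the assumption of taut, straight wires (no sag, no winch dynamics) are ours, not Wētā’s.

```python
import math

# Hypothetical pick points: the four fixed corners (x, y, z in meters)
# where the wires attach above the set. On the real shoot these came
# from repeated LIDAR scans of the stage during pre-light.
PICK_POINTS = [
    (-10.0, -10.0, 8.0),
    ( 10.0, -10.0, 8.0),
    ( 10.0,  10.0, 8.0),
    (-10.0,  10.0, 8.0),
]

def cable_lengths(target):
    """Straight-line cable length from each pick point to the monitor.

    A real rig must also model cable sag and winch acceleration limits;
    this sketch assumes taut, massless wires.
    """
    return [math.dist(target, p) for p in PICK_POINTS]

def follow_head_animation(head_positions):
    """Step through mocap head positions (one per frame) and emit the
    per-winch cable lengths that keep the monitor at the eyeline."""
    for frame, pos in enumerate(head_positions):
        yield frame, cable_lengths(pos)

# Example: a digital character walking ~7 m over 3 seconds, head ~2.7 m up
# (Na'vi are roughly 3 m tall, so the eyeline sits well above the actor).
walk = [(0.1 * f, 0.0, 2.7) for f in range(72)]  # 72 frames at 24 fps
for frame, lengths in follow_head_animation(walk):
    if frame % 24 == 0:
        print(frame, [round(length, 2) for length in lengths])
```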

LED Volumes
While much of Avatar is computer generated, it also relied heavily on the live-action performance of Jack Champion as Spider. “We used the LED panels because with Spider, when he was on the surface, we wanted proper reflections in the water so we didn’t have to paint out the water, and we would be able to keep the water,” Eric explains. “We used LED panels all along the edge of the water to get the proper reflections… We also used them on the Sea Dragon sinking because it was a big wet ship. We wanted to get the proper reflections and bounce into the ship itself. They really helped to give us some nice reflections and lighting onto the characters as they walked around the environment.”
Black Panther: Wakanda Forever – with Chris White
R. Christopher White was nominated for his work on Black Panther: Wakanda Forever. Wētā contributed greatly to the dry-for-wet and underwater sequences of the film. As they presented at the Bake-off, Chris and his team relied heavily on Wētā’s spectral renderer, Manuka, to provide accurate underwater visuals. Clearly, with the work also being done on the Avatar films, Wētā has one of the most advanced underwater rendering systems in the industry. Spectral responses are particularly important in transmissive volumes: be they on Pandora or Earth, ocean waters react very differently to different wavelengths of light.
The reality of underwater photography versus providing images that would tell the story and allow the audience to see the actor was discussed from the outset of the project. In reality, if one actually films down deep, and at any distance from the subject, visibility is very, very low. “This was something that was raised very early on,” comments Chris. “With our spectral renderer we wanted to have a realistic baseline. At Wētā, we have sample readings of different oceans around the world using a measurement that came from the 1970s, called the Jerlov measurements.” Starting in World War II and continuing until the mid-1970s, the U.S. Navy extensively funded research into visibility. This effort laid the groundwork for much of ocean optics as it is used today. The Jerlov nomenclature, published in 1976 and adopted by the International Association for the Physical Sciences of the Oceans (IAPSO), characterizes visibility as the difference between target and background radiances at a given wavelength, combining the beam attenuation coefficient at that wavelength with the range from the observer to the target.
Wētā has measurements of these different water types from around the world and converted them spectrally so that they were available to use. “Part of the technique that we had with Geoff and with Marvel was to always have the actual realistic spectral baseline to refer to,” Chris outlines. “It was part of our approach that we had built-in tools that allowed us to have additional creative controls, but also have the ground truth of completely realistic water. Then we could decide how far we want to deviate from it to meet those creative goals. But we always had ground truth to go back to, so if it ever seemed to not look quite right, we’d say, let’s look at what it should be doing.”
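The ‘realistic baseline’ Chris describes comes down to wavelength-dependent Beer-Lambert attenuation: radiance at distance d falls off as L(d, λ) = L0(λ) · exp(−c(λ) · d), with the attenuation coefficient c(λ) taken from measured water types. A minimal sketch of the idea follows; the three bands and coefficient values are illustrative stand-ins, not Wētā’s Jerlov data.

```python
import math

# Illustrative (not measured) spectral attenuation coefficients c(λ), in 1/m,
# for a clear, Jerlov-Type-I-like ocean water: red is absorbed far faster
# than blue-green. Wētā's actual baseline comes from Jerlov measurements.
ATTENUATION = {
    450: 0.02,  # blue
    550: 0.07,  # green
    650: 0.35,  # red
}

def attenuate(radiance, path_length_m):
    """Beer-Lambert falloff per wavelength band:
    L(d, λ) = L0(λ) * exp(-c(λ) * d)."""
    return {
        wavelength: radiance[wavelength]
        * math.exp(-ATTENUATION[wavelength] * path_length_m)
        for wavelength in radiance
    }

# Ground-truth baseline: equal-energy white light seen through 10 m of water.
white = {450: 1.0, 550: 1.0, 650: 1.0}
print(attenuate(white, 10.0))
# -> the red band drops to ~3% while blue keeps ~82%: red surfaces read
#    as nearly black at any distance, which is exactly Talokan's problem.
```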
The Mesoamerican Mayan culture behind the kingdom of Talokan was rich in the color red, so the Wētā team faced a difficult problem. Red is the first color lost underwater due to spectral absorption, yet the filmmakers thought it important to still see some red in the underwater sequences. This was solved both by enhancing the models, intensifying their colors, and by tweaking the Manuka spectral renderer. In fact, Wētā “had to start doing our turntables underwater because if you were to look at their architecture above water in sunlight, it would look nuclear red,” Chris recalls. They also adjusted the light transport. “Sometimes we would adjust the absorption of the water so that the red was lifted, and we got so specific that sometimes we were actually even controlling it on which (ray) bounce. As the light’s coming out of a light, it doesn’t absorb red, but then after it hits a specific surface, it then starts absorbing red until it hits the camera.”
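That per-bounce trick can be expressed in the same Beer-Lambert terms: water transmittance for the red band is forced to 1.0 on the segment from the light to the first surface, then follows normal absorption on every later segment to the camera. A toy sketch (our own model and names, not Manuka’s API):

```python
import math

# Per-band water absorption in 1/m; values illustrative only.
ABSORB = {"red": 0.35, "green": 0.07, "blue": 0.02}

def transmittance(band, segment_m, bounce, red_free_until_bounce=1):
    """Beer-Lambert transmittance for one light-path segment.

    Mimicking the control Chris describes: red absorption is disabled on
    segments before `red_free_until_bounce` (light -> first surface),
    then enabled for all later segments until the camera.
    """
    if band == "red" and bounce < red_free_until_bounce:
        return 1.0  # water treated as red-transparent on this segment
    return math.exp(-ABSORB[band] * segment_m)

def path_throughput(band, segment_lengths_m):
    """Product of per-segment transmittances along a light path."""
    t = 1.0
    for bounce, seg in enumerate(segment_lengths_m):
        t *= transmittance(band, seg, bounce)
    return t

# A light 8 m from a red wall, with the camera 4 m from the wall:
path = [8.0, 4.0]
for band in ABSORB:
    print(band, round(path_throughput(band, path), 3))
# Red survives the 8 m to the wall untouched and only absorbs over the
# final 4 m, so Talokan's red reads on screen without lifting it globally.
```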
There were many other complex water effects: controlling the suspension of flotsam and jetsam in the water, light drop-off, and light motivation. What proved important was having accurate previs. Here the team used Wētā’s Gazebo renderer, which allowed for a simulation of the final lighting pipeline, so that “when the animators are setting up previs and postvis, the lighters would jump in and figure out the water settings, adjust the lights and just set it all up such that when the animators were working they were seeing what our final render should look like,” Chris comments. “Adjusting turbidity and the visibility of the water, things like that. So our previs and postvis were pretty advanced.”
The Batman – with Anders Langlands
Anders Langlands was nominated for The Batman; this is his third nomination. In The Batman, Wētā contributed heavily to the Batmobile chase, one of the most visceral and impressive sequences in the film. While the film made heavy use of LED volumes across its other visual effects sequences, the chase was a perfect marriage of practical and visual effects.
The sequence starts out primarily live-action, under the incredible cinematography of DOP Greig Fraser, but as it continues much of the visuals become fully digital. In the first part of the sequence, the complexity was tracking the live action and adding rain and reflections. “Everything was dry or nearly dry when they shot it,” Anders explains. “There was a little bit of rain in some of the plates, but it was sort of spitting rather than the sort of torrential downpour that Greig wanted.”
In these sequences, all the objects needed to be tracked to allow water reflections, highlights, and rain effects to be added. This is actually an incredibly complex tracking problem, given the DOP had used modified lenses, deliberately detuned with ARRI to “go soft at the edges,” Anders points out. “You get a sort of optical shift, really strong vignette, and really interesting defocused patterns which are completely different in near and far focus – just stuff that looks kind of crazy.” Added to this, Greig Fraser used glass filters covered in a film of silica to produce dramatic and unusual wet lens flares.
While the foreground cars in the chase were accurately modeled and then matchmoved to provide the right phantom objects for the 3D rain and reflections to be generated on, for scenes with 20 or so cars in the background this was just not viable. The team resorted to using what they called a ‘clown car’: a dynamically altered generic car that could be bent or modified to approximate the shape needed. As these were only background cars, the focus was on matching headlights and tire contact points much more than the actual car design, as sketched below.
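Wētā hasn’t detailed how the clown car was rigged, but the idea of a proxy that honors only the cues the eye tracks can be sketched: fit position, heading, and scale from two tracked headlights and the road height, and let the body in between stay approximate. Everything in this Python sketch is hypothetical.

```python
from dataclasses import dataclass
import math

@dataclass
class ClownCarProxy:
    """A generic background-car stand-in. Only the cues that sell the
    shot are honored: headlight positions and tire contact with the
    ground. The body in between is free to bend or stretch."""
    position: tuple   # body center (x, y, z)
    yaw: float        # heading in radians
    width: float      # distance between headlights
    height: float     # headlight height above the tire contact plane

def fit_proxy(left_lamp, right_lamp, ground_z):
    """Place a proxy from two tracked headlight points and road height.

    Width comes from the lamp separation, yaw from the lamp baseline
    (which is perpendicular to the heading), and height from the lamps'
    elevation over the tire contact plane.
    """
    lx, ly, lz = left_lamp
    rx, ry, rz = right_lamp
    width = math.dist((lx, ly), (rx, ry))
    yaw = math.atan2(ry - ly, rx - lx) - math.pi / 2
    center = ((lx + rx) / 2, (ly + ry) / 2, (lz + rz) / 2)
    height = center[2] - ground_z
    return ClownCarProxy(center, yaw, width, height)

# Two tracked lamps on a distant background car:
proxy = fit_proxy((4.0, 20.0, 0.7), (5.6, 20.2, 0.7), ground_z=0.0)
print(proxy)
```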
As the CGI car sequences needed to intercut perfectly with the live-action sequences, one key aspect the Wētā team focused on, according to Anders, was motion blur. The team did sub-frame tracking so the streaking, motion-blurred lights of the action sequence would stretch in the correct shape and not just as a series of straight lines. If the motion blur is computed just between two frames, it is visualized as a straight line by normal VFX software. But if the car was rotating, the motion-blurred lines would be curved, as the object moved along a circular path during the time the camera’s shutter was actually open. “We decided to track sub-frames, rather than just tracking the frames as you would normally do. We tracked sub-frames in order to get the right shape for the motion blur streaks. It is particularly important, I believe, when you’re dealing with any kind of frenetic action. It’s something that’s often ignored, I think, a lot of the time,” he explains.
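To see why sub-frame tracking matters, consider a tail light on a car turning through a corner during the 1/48 s a 180° shutter is open at 24 fps: two samples give a straight chord, while sub-frame samples recover the curved arc the light actually traces. A small illustrative sketch (all values are ours):

```python
import math

def light_position(t, radius=8.0, angular_speed=6.0):
    """World position (meters) of a tail light on a car turning in a
    circle; t in seconds, angular_speed in rad/s. Values illustrative."""
    return (radius * math.cos(angular_speed * t),
            radius * math.sin(angular_speed * t))

def streak(t_open, shutter=1.0 / 48, samples=2):
    """Sample the light across the shutter interval.

    samples=2 is the conventional frame-to-frame track: the streak is a
    straight chord. More sub-frame samples recover the curved path the
    light actually traces while the shutter is open."""
    return [light_position(t_open + shutter * i / (samples - 1))
            for i in range(samples)]

two_point = streak(0.0, samples=2)   # straight-line approximation
sub_frame = streak(0.0, samples=8)   # curved, sub-frame-tracked streak

# How far the straight chord's midpoint sits from the true curved path:
true_mid = light_position(0.5 / 48)
chord_mid = tuple((a + b) / 2 for a, b in zip(*two_point))
print(round(math.dist(true_mid, chord_mid) * 100, 2), "cm")
```

Even in this gentle turn, the two-sample chord lands centimeters off the true arc; over bright streaking headlights, that error is exactly the ‘straight line’ artifact the sub-frame tracking was built to avoid.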


The Oscars will be awarded on Sunday, March 12, 2023, at the Dolby Theatre in Hollywood. The ceremony will be televised live on ABC in the USA and in more than 200 territories worldwide.
This year Jimmy Kimmel will host; he previously hosted the Oscars in 2017 and 2018.