Joining a strong trend amongst the world’s leading animation and visual effects companies, Animal Logic has developed its own in-house renderer, Glimpse. The name comes from its origins as the first tool for lighting leads and TDs to “have a glimpse” of what something might look like. The project grew from this early limited application to a full production renderer.
Glimpse is a unidirectional path tracer (with some specialist quasi-bidirectional aspects – see below). It is now the in-house renderer used throughout Animal Logic, in both Australia and Canada, having grown over the last three years into a full production renderer for both animated features and live action VFX.
Guy Griffiths, Director of R&D at Animal Logic, is now keen to see the team get some well deserved credit, and also to highlight the technical innovation inside the studio. This comes after years of the company being known primarily for its artistic achievements rather than its technical innovation. Animal Logic has grown from a TVC effects house to an animation vendor and now a major animation studio, with such hits as The LEGO Movie and a real agenda of producing – not just servicing – the feature film market. The group still has a strong VFX component, with projects such as the effects in the new Allegiant film, and this is important as Glimpse is now used by the entire company.
Glimpse aims to produce the highest quality images, as does any production renderer, but it is also about helping artists make those images and be creative. For the team, Glimpse has to be workable and help produce art, so ideally it is the one tool that is the same everywhere in the pipeline. “What we want is every artist sitting in front of a screen with the same embedded path tracer,” points out Griffiths. “We are pulling the renderer as close as we can into every viewport, of every artist in the company.”
The new LEGO Batman movie – rendered entirely in Glimpse.
History
2011
On the 5th of January 2011, the day after returning from Christmas holidays, Max Liani, then a Lighting Supervisor, started work on a physically based shading system named PHX. “I wanted to give the lighters some edge. I wanted to push the boundaries of the quality we could deliver within the budget we had. In my own time I began developing an experimental interactive path tracer under the name of Glimpse,” explains Liani.
2012
“PHX was the initial dry run of how we would do advanced light transport and ray tracing with physically based shading,” comments AL core developer Luke Emrose, looking back on one branch of the project’s origins. At this point PHX was just a PRMan shading extension which was not intended to be the basis for an entirely new production renderer. It was used on both Walking with Dinosaurs and The Great Gatsby. At this time, RenderMan was not shipping with a physically based shading system so such an extension made sense.
Liani was working to give his team on Walking with Dinosaurs faster lighting so they could get shots done as quickly as possible. The initial version of Glimpse could only do grey shaded models with area lights and full ray tracing, but it did allow the lighting artists to very quickly glimpse what the final lighting would look like once all the texturing and other elements were finally added. It was used as a preparation tool, but the full lighting and rendering was still done in PRMan.
In late 2012, a new version of Glimpse was finished and it started to be used to help deliver actual shots, but not as the final renderer. It was being used later in the production pipeline, but it was a long way off replacing Animal Logic’s use of PRMan.
2013
In early 2013 the team started working on The LEGO Movie and the need for a more complex rendering tool became apparent. The LEGO Movie may appear a simple film to render, but it was actually complex. For example, the plastic bricks are rendered with sub surface scattering (SSS) and there were a LOT of bricks in the film. As the vast numbers of bricks are mostly highly reflective, a robust but very fast fully ray traced solution was required. Tests indicated that AL’s existing pipeline would not be fast enough and the team started exploring every option for a new rendering pipeline.

In the end they decided on ray acceleration in PRMan with a Glimpse plugin, in a relatively similar approach to the hybrid ray tracing of BMRT and RenderMan. They named this version: FrankenGlimpse! This hybrid solution loaded PRMan and Glimpse in the same process and then used an elaborate mechanism for sharing data between them. This allowed the team to write shading operators which could call ray tracing operations in Glimpse and return results fast enough to deliver the render-time acceleration the team needed.
This evolved into a complete replacement for the ray tracing sub-system in RenderMan. It was initially simple and did not deal with texturing, using, for example, single colors for the bounce lighting. When the team first started seeing the colored frames from Glimpse, the new system was adopted for turntables and intermediate renders. This proved that ray tracing, even with noise and other restrictions, was a faster way to check animation than their OpenGL render approaches. “In many cases we could turn around shots in Glimpse, with ray tracing, faster than we could in OpenGL from a pipeline perspective,” remarks Luke Emrose. In animation, the speed was helped by a reduced quality threshold and simplified textures, “all of which produces fast convergent results that animators could still more easily work with,” he adds. Glimpse was fast to set up and intuitive to use from a lighting perspective.
By late 2013, Glimpse had increased in complexity to a point that it could be used for final renders. In fact, it was only weeks before final delivery of LEGO that the last of this stage was completed and things such as textures were added; as a result, only a handful of shots in The LEGO Movie are fully rendered in Glimpse. “One of the things from my perspective looking at the development,” says Griffiths, “was that the team was incrementally adding features as they could to speed things up, and in fact, they were still working on things two weeks from the end of the film, adding support for faster SSS for example – which sped things up a lot. So innovation continued right throughout the film.”
In many respects, RenderMan would go on to solve many of the same problems and offer other users these exact benefits, but in 2013 Animal Logic was effectively ahead of what had been released to the general public, and thus Glimpse was used on LEGO. PRMan 15 did have many powerful tools that AL used on LEGO, especially instancing, which AL was one of the first studios to use in production – and used very aggressively. “Given LEGO is perhaps the perfect film for instancing,” jokes Daniel Heckenberg, another core developer of Glimpse.
2014
By the time LEGO finished, the team had a production renderer – sort of. Following the film’s wrap there was a period when AL wanted to pause and define a more structured road map moving forward. In 2014 the team started a program of consultation and planning with the supervisors and key department heads to define the roadmap. From these weekly meetings on future direction, the team decided to commit to a full production renderer and decided the place to start was with the shader sub-system. Initially, this was to be based on the Open Shading Language (OSL), but it has since grown beyond this. For example, the team decided to develop ‘Ash’, their graphical nodal shading construction system. Ash has a lot of heritage from PHX, but Animal Logic implemented it completely differently: PHX informed Ash rather than becoming it. “Ash lives essentially as a combination of Glimpse-implemented BRDF/BXDF substances and OSL shader pattern networks,” explains Heckenberg. Ash takes advantage of AL controlling both the shader components and the render components, in a way PHX could not.
“By the end of the film we were confident that we had something pretty special,” comments Griffiths. “There were a lot of challenges with LEGO, but it worked both technically and commercially which we were very proud of. And as a result we knew we had sequels and that would mean a bucket load of bricks!”
The period of consultation, design exploration and innovation following LEGO, as AL planned their future, was particularly rewarding professionally for the team. “During 2014 we had a period of time while the next films were in pre-production that gave us time to plan development. There was an eight week ‘production bedding down process’ at the end of that and I have to say it was one of the most exciting periods in my career,” Griffiths proudly explains.
As mentioned, SSS was key from the first LEGO movie. The original SSS in Glimpse was a standard dipole model (2013); later, in the 2014 revision, when the team decided they needed a higher fidelity SSS, they added a Jensen single scatter approach. They explored directional dipole models but decided on a single scatter solution, as it was easier to get a more accurate single scatter component. The current system is a Jensen single scatter and dipole model, and as actual LEGO pieces surprisingly display strong SSS (especially in closeups), this is a key point for the team. This approach has meant that they have gone down a path more similar to the Arnold renderer (Solid Angle): a cache-free, ray sampled approach – away from any cache based approach. This system scales well for vast numbers of LEGO bricks and for character work generally.
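For context, the “standard dipole” the team started with is the diffusion profile published by Jensen et al. in 2001. Below is a minimal sketch of that textbook profile – this is the published formulation, not Animal Logic’s code, and the material values in the example call are made up purely for illustration.

```python
import math

def dipole_Rd(r, sigma_s_prime, sigma_a, eta):
    """Classic dipole diffuse reflectance R_d(r) (Jensen et al. 2001).

    r             -- surface distance between entry and exit points
    sigma_s_prime -- reduced scattering coefficient
    sigma_a       -- absorption coefficient
    eta           -- relative index of refraction
    """
    sigma_t_prime = sigma_s_prime + sigma_a          # reduced extinction
    alpha_prime = sigma_s_prime / sigma_t_prime      # reduced albedo
    sigma_tr = math.sqrt(3.0 * sigma_a * sigma_t_prime)

    # Diffuse Fresnel reflectance approximation and boundary term.
    Fdr = -1.440 / (eta * eta) + 0.710 / eta + 0.668 + 0.0636 * eta
    A = (1.0 + Fdr) / (1.0 - Fdr)

    # Depths of the real and virtual (mirrored) dipole sources.
    zr = 1.0 / sigma_t_prime
    zv = zr * (1.0 + 4.0 * A / 3.0)
    dr = math.sqrt(r * r + zr * zr)
    dv = math.sqrt(r * r + zv * zv)

    return (alpha_prime / (4.0 * math.pi)) * (
        zr * (1.0 + sigma_tr * dr) * math.exp(-sigma_tr * dr) / dr ** 3
        + zv * (1.0 + sigma_tr * dv) * math.exp(-sigma_tr * dv) / dv ** 3
    )

# Illustrative, made-up material values (roughly "dense plastic"-like):
print(dipole_Rd(r=0.1, sigma_s_prime=20.0, sigma_a=0.5, eta=1.5))
```

In a renderer this profile is typically importance sampled across nearby surface points rather than evaluated on a grid, which is what a cache-free, ray sampled approach implies.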
By mid 2014, the system was in use not only on feature projects but was being used for images on the front cover of Vogue Magazine. By the end of 2014 the team delivered their first tests on a forthcoming LEGO film.
Once the team had committed to turning Glimpse into a full production renderer, they set about filling out the specification: following the shading system, they addressed volumetric shading, curve rendering (used for hair and fur) and other more general extensions.
2015
In January 2015 the team showed the results of the LEGO test to the rest of AL and “they were blown away,” recalls Griffiths. While that test is not public, it involved “a lot of organics – it’s not just bricks!”
After the start of 2015, the team moved to a concept of Render Everywhere, with the goal of making Glimpse the one unified renderer to be used everywhere inside the company.
The team are quick to give credit to Matt Pharr and Greg Humphreys for their PBRT book, as well as to Eric Veach for his inspirational work on the path-integral formulation of light transport and MIS (see below). They also consider open source initiatives such as OSL and Embree important inspiration.
The team also thank the commercial render team at Pixar and Solid Angle who publish and openly discuss their research. “It is a really exciting time to be working in rendering,” says Emrose, “with everyone having a pretty open approach to sharing. For what are basically a set of competitors, there is a huge amount of sharing between various houses and successful renderers, and that is moving all the commercial products and renderers like Glimpse forward. Animal Logic is really keen to contribute back in any way we can to that.”
Under the Hood
Deep Comp: MIA
Deep compositing is a surprisingly controversial point inside Animal Logic. Glimpse currently does not support deep data. Fxguide’s first ever story on deep compositing was with staff then working at Animal Logic, and the sci-tech awards honored ex-Animal Logic (and Weta Digital) staff, so why then would Animal Logic’s own Glimpse renderer not support deep data and the company not embrace a deep compositing pipeline, especially as they have used it in production in the past?
“Glimpse does not yet support deep output and that is partly due to the experience we have had with deep over the years,” explains Heckenberg. “There are the normally noted concerns such as the amount of data, and the expense it adds to both rendering and compositing – by that massive data magnification – but one of the things that really came out of our internal meetings was just how much cleanup of things that deep should have handled was actually a problem on The LEGO Movie.”
Ironically, deep comp, while promising control to comp, actually gave the compositors additional “deep cleanup” work on operations such as defocus, to remove unwanted artifacts, according to Matthew Reid, core developer at Animal Logic. “On The LEGO Movie all the lens depth of field effects were done in comp. The comp pipeline on LEGO reconstructed scene visibility from the deep data, which in turn produced a lot of artifacts that the compers needed to deal with.”
This labor intensive cleanup caused the team to avoid deep comp lens effects post-LEGO. Instead those lens effects are currently baked into the render, and deep compositing is not currently supported from Glimpse. Similarly, extreme lens distortion is being handled in the render, both for image quality (avoiding post-processing softening) and for render time.
Glimpse has a parametric lens distortion built into its camera model, which is significant in the context of Glimpse’s fundamentally unidirectional ray tracing approach. The ray differentials are distorted in the renderer, which means the team can sample all their textures at the rendered resolution.
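Glimpse’s actual distortion model is not described in detail; purely as an illustration of the idea, here is a minimal sketch that applies a simple polynomial radial distortion (the k1/k2 coefficients and the pinhole setup are assumptions for the example) to normalized sensor coordinates before generating the primary ray. In a production renderer the ray differentials would be pushed through the same mapping.

```python
import math

def distort_sensor_coords(x, y, k1, k2):
    """Apply a simple radial polynomial distortion to normalized sensor
    coordinates (x, y) centred on the optical axis. k1 and k2 are
    illustrative coefficients; a production model would be fitted to the
    plate's lens."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

def primary_ray(x, y, focal_length, k1=0.05, k2=0.01):
    """Generate a pinhole primary ray through the distorted sensor position.
    Distorting rays at generation time avoids softening the image with a
    post-process warp."""
    xd, yd = distort_sensor_coords(x, y, k1, k2)
    direction = (xd, yd, -focal_length)
    length = math.sqrt(sum(c * c for c in direction))
    return (0.0, 0.0, 0.0), tuple(c / length for c in direction)

origin, direction = primary_ray(0.3, -0.2, focal_length=1.0)
```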
One of the things one seeks from a correctly defocused highlight is a much bigger bokeh highlight: in effect, a pin dot of bright light needs to become a bright disc in the render when that pin dot is way out of focus. This gives the classic beauty shot with a shimmering set of highlights behind the subject, so common in long lens night shots. Given Animal Logic’s approach of not using deep comp defocus, this has to be solved at render time. The implementation builds on the parametric lens distortion: the team distorts all the primary camera rays, which are fired unidirectionally. When the renderer discovers highlights in the scene, this triggers an energy redistribution model to pass those highlights back to the lens. In a sense, the renderer is a unidirectional path tracer, but highlights – once discovered – are fed back in more of a bidirectional style to allow for a bright bokeh ping. The energy redistribution is used to sample and complete the render, since energy conservation is central to physically plausible models. It also solves the noise issue: without the energy redistribution there would be classic undersampling, which manifests as noise or fireflies around the highlights.
“Glimpse is a unidirectional path tracer with one special case of tracing in the opposite direction to improve sampling of lens defocus. We use a technique derived from ERPT as described by [Cline et al 2005] specifically for efficient sampling of out of focus areas which allows the tracing and shading expense of path samples to be amortised by projection over the defocused region of the sensor,” explains Heckenberg.
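The quote above names the actual technique (ERPT, Cline et al. 2005). As a much simplified illustration of the underlying intuition only – spreading a discovered highlight’s energy over its circle of confusion on the sensor, rather than waiting for forward rays to stumble onto it – consider the sketch below. The thin-lens maths, buffer layout and numbers are assumptions for the example, not Animal Logic’s implementation.

```python
import math
import numpy as np

def circle_of_confusion(depth, focus_distance, aperture, focal_length):
    """Thin-lens circle-of-confusion diameter (in sensor units) for a point
    at `depth`, given focus distance, aperture diameter and focal length."""
    return abs(aperture * focal_length * (depth - focus_distance)
               / (depth * (focus_distance - focal_length)))

def splat_highlight(image, px, py, energy, coc_radius_pixels):
    """Distribute a highlight's energy uniformly over a disc of
    `coc_radius_pixels` around (px, py), conserving total energy."""
    h, w, _ = image.shape
    r = max(int(math.ceil(coc_radius_pixels)), 1)
    ys, xs = np.mgrid[-r:r + 1, -r:r + 1]
    disc = (xs * xs + ys * ys) <= max(coc_radius_pixels, 1.0) ** 2
    weight = energy / max(disc.sum(), 1)          # conserve total energy
    for dy, dx in zip(*np.nonzero(disc)):
        y, x = py + dy - r, px + dx - r
        if 0 <= y < h and 0 <= x < w:
            image[y, x] += weight

# Illustrative usage with made-up camera values:
img = np.zeros((270, 480, 3))
coc = circle_of_confusion(depth=12.0, focus_distance=2.0,
                          aperture=0.05, focal_length=0.035) * 480
splat_highlight(img, px=240, py=135, energy=50.0, coc_radius_pixels=coc)
```

Because the total energy is preserved while being spread over many pixels, the bright disc appears immediately rather than emerging slowly out of firefly noise.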
Cameras
Not only is Glimpse able to handle matching normal production cameras but it has been built to do spherical or non-planar images. This is often used in VR work and immersive 360 environments (see this fxguide story). “We do all sorts of more general camera models,” says Heckenberg.
Sampling
Central to the success of Glimpse at Animal Logic is that the only difference between preview and final rendering in Glimpse is the number of samples used and the order in which they fire. This brings the final look into the production pipeline as early as possible.
Multiple importance sampling (MIS) is used throughout Glimpse. “We use it everywhere, it is the workhorse trick that makes path tracing a practical technique. It allows incremental research to be incorporated into an existing path tracer in a very elegant fashion,” says Heckenberg. In many respects Glimpse is another renderer underlining how significant the work of Eric Veach was in his 1997 PhD. The team give him full credit for one of the “key pieces of literature in this crazy world of ray tracing,” jokes Emrose. “And it will remain a huge contribution for many years to come!”
“We use MIS wherever possible as a variance reduction technique,” adds Heckenberg. “Multiple-sample MIS for light and BRDF sampling is the most widely discussed case and very important for glossy surfaces, but we also use MIS elsewhere, for example one-sample MIS for spectrally-varying sub-surface scattering.”
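For readers unfamiliar with the technique, the standard weighting scheme here is Veach’s power heuristic. The sketch below is the textbook multiple-sample combination of one light sample and one BSDF sample – generic, not Glimpse’s code; the sample tuple layout is an assumption for the example.

```python
def power_heuristic(n_f, pdf_f, n_g, pdf_g, beta=2.0):
    """Veach's power heuristic (beta = 2) for weighting a sample drawn from
    strategy f against an alternative strategy g."""
    f = (n_f * pdf_f) ** beta
    g = (n_g * pdf_g) ** beta
    return f / (f + g) if (f + g) > 0.0 else 0.0

def direct_lighting_mis(light_sample, bsdf_sample):
    """Multiple-sample MIS for direct lighting: one sample from the light,
    one from the BSDF, each weighted by the power heuristic.
    Each sample is assumed to be (contribution, pdf, pdf_of_other_strategy)."""
    total = 0.0
    c, pdf, other = light_sample
    if pdf > 0.0:
        total += c * power_heuristic(1, pdf, 1, other) / pdf
    c, pdf, other = bsdf_sample
    if pdf > 0.0:
        total += c * power_heuristic(1, pdf, 1, other) / pdf
    return total

# e.g. light sample: contribution 0.8, light pdf 2.0, BSDF pdf for that direction 0.3
estimate = direct_lighting_mis((0.8, 2.0, 0.3), (0.5, 1.5, 0.4))
```

One-sample MIS, mentioned in the quote for spectrally-varying SSS, uses the same weights but picks a single strategy stochastically instead of evaluating both.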
Animal Logic has full Maya and Houdini integration for Glimpse, and this has been used on both fully animated films and upcoming live action VFX films such as Allegiant, for which Animal Logic is the principal effects vendor. (More in a forthcoming fxguide article.)
Volume sampling: the volumetric model uses OpenVDB and a distance sampling algorithm that works well for smoke and clouds, but this is not an advanced area of research for the team. Regarding emissive lighting from volumes, work is being done on more advanced importance sampling for an upcoming VFX project that has a lot of fire and smoke. Volumes do already provide lighting in the scene, but this new project requires a much more complex solution and the team is hard at work coding it; the new solution will use MIS and thus converge faster with relatively lower noise.
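The article does not spell out which distance sampling scheme Glimpse uses. As a generic illustration only, the classic free-flight distance sampling for a homogeneous medium looks like the sketch below; heterogeneous OpenVDB volumes would typically need something like delta or ratio tracking instead, and the coefficients here are made up.

```python
import math
import random

def sample_distance(sigma_t, rng=random):
    """Sample a free-flight distance through a homogeneous medium with
    extinction coefficient sigma_t; pdf(t) = sigma_t * exp(-sigma_t * t)."""
    return -math.log(1.0 - rng.random()) / sigma_t

def transmittance(sigma_t, t):
    """Beer-Lambert transmittance over distance t."""
    return math.exp(-sigma_t * t)

# Decide whether a ray segment of length t_max scatters inside the medium.
sigma_t = 0.8
t_max = 5.0
t = sample_distance(sigma_t)
if t < t_max:
    print(f"medium interaction at t = {t:.3f}")
else:
    print(f"ray exits the medium, transmittance = {transmittance(sigma_t, t_max):.3f}")
```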
Open Source
Modern development teams are faced with an enormous range of very valuable and powerful open source options that offer both standardization and a shortcut to accepted production solutions: things such as Alembic, USD (Universal Scene Description), OpenIO, OpenSubdiv, OpenVDB and more.
The team has critically explored most of the open source render tools, and has embraced several of them extensively while rejecting others. In addition to OSL, mentioned above, AL uses OpenVDB for its volumetric support and Alembic is used as well, and the team has looked extensively at USD (Universal Scene Description), but here the story becomes nuanced. AL deploys something they call CSD: Common Scene Description, which is not the same as USD but has related goals.
Inside Animal Logic, CSD is effectively the scene description that feeds into the renderer. By contrast, USD is assumed to always be interpreted or processed one more time before it is rendered; USD is not a direct path into any renderer. AL’s CSD is also facility wide: it sits at the base of their scene descriptions. A Glimpse scene is more like a game environment, where the scene exists and the renderer is just one client of it – it is live and can still be updated, versus a more traditional RIB-style read-only scene file. “Glimpse is very fast to start up, very low cost to use – able to be used anywhere it is needed in the pipeline,” explains the team.
Glimpse, and thus CSD, is seen as spanning the render requirements of AL, from lighting and turntables to look dev and final rendering. There is no other ‘GPU’ renderer or fast animation preview renderer or interactive lighting tool…it is all just Glimpse. “We don’t need to switch out Glimpse and use something like a GPU renderer to preview something and then guess what the final will look like; we are always getting the same fidelity,” says Emrose. This process of wide scale adoption was completed in the first half of 2015. Following that, the team added more tools to handle the requirements of the photoreal effects needed in their work for Allegiant.
A great example of VFX Glimpse work in this exclusive before and after clip from Allegiant
Performance
“The renderer is the thing that produces the images that we give to our clients – having control over that gives a huge amount of flexibility – which we are capitalising on.”
Glimpse is key to production success; there is no other agenda: Animal Logic wants control over Glimpse so their films can be better. To do that, the team needs Glimpse to be fast. Glimpse is fast, really fast, and extremely scalable. For example, in a stress test, a scene effectively containing 55 billion LEGO bricks renders interactively (12fps) with an ambient occlusion integrator at 1440×810. Multi-level instancing is used to keep a compact memory footprint.
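A figure like 55 billion bricks is only feasible because geometry is shared: one brick mesh is stored once and referenced by many lightweight instances, and whole groups of instances can themselves be instanced. A toy sketch of that multi-level structure is below – the class names and counts are purely illustrative, not Glimpse’s data model.

```python
from dataclasses import dataclass, field

@dataclass
class Prototype:
    """Real geometry, stored exactly once (e.g. one 2x4 brick mesh)."""
    name: str
    triangle_count: int

@dataclass
class Instance:
    """A lightweight reference: prototype index plus a transform."""
    prototype: int
    transform: tuple          # flattened 4x4 matrix

@dataclass
class Assembly:
    """A group of brick instances (e.g. one assembled model); the assembly
    itself can be placed many times across the scene."""
    instances: list = field(default_factory=list)

@dataclass
class Scene:
    prototypes: list = field(default_factory=list)
    assemblies: list = field(default_factory=list)
    placements: list = field(default_factory=list)   # (assembly index, transform)

    def virtual_brick_count(self):
        # Billions of "virtual" bricks from a tiny amount of stored data.
        return sum(len(self.assemblies[i].instances) for i, _ in self.placements)

scene = Scene(prototypes=[Prototype("brick_2x4", 500)])
model = Assembly(instances=[Instance(0, (1.0,) * 16) for _ in range(10_000)])
scene.assemblies.append(model)
scene.placements = [(0, (1.0,) * 16)] * 1_000
print(scene.virtual_brick_count())   # 10,000,000 bricks, one mesh in memory
```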
Glimpse interactive rendering does not render low res/pixelated images to reach interactive frame rate. “I find it very distracting when renderers switch to pixelated, low resolution and/or cancel a frame render iteration when changes are made to the scene (like starting over to render from the center or the top instead of completing the frame first). I didn’t want that behavior in Glimpse,” comments Liani.
Glimpse renders at full resolution all the time and it is fast enough to do that. For example, with production assets and complex shaders, artists typically get a frame iteration (1 sample per pixel refinement) every 1-2 seconds. With well crafted shaders (without excessive layers of textures or noise) and few lights, Glimpse can easily output 5-10 frames per second on production scenes. For look development on a single asset an artist can get over 20 frames per second at 2K at their desk. A simple classic test scene of 10,000 mesh spheres over a plane with ambient occlusion, rendered at 1K resolution, runs at 70 to 100 frames per second.
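Conceptually, this kind of progressive refinement keeps a running mean over full-resolution sample passes and redisplays it after every pass, rather than dropping to a pixelated preview. A minimal sketch of that loop, with a noise stand-in where the actual renderer would be, is below.

```python
import numpy as np

def render_one_spp(width, height, rng):
    """Stand-in for the renderer: returns one new sample per pixel.
    Here it is just random noise for illustration."""
    return rng.random((height, width, 3))

def progressive_render(width, height, iterations, display):
    """Accumulate full-resolution passes and show the running mean each time."""
    rng = np.random.default_rng(0)
    accum = np.zeros((height, width, 3))
    for n in range(1, iterations + 1):
        accum += render_one_spp(width, height, rng)
        display(accum / n)        # always the full frame, never a low-res proxy

progressive_render(64, 36, iterations=8, display=lambda img: None)
```

The same loop covers preview and final rendering; only the number of iterations (and the sample ordering) changes, which is exactly the property described above.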
Looking forward
Noise
With Allegiant the team brute-force rendered the final images to convergence, but an option seen recently in renderers such as Hyperion from Disney is intelligent noise reduction that uses the various outputs of a renderer to do much better than a direct image-processing denoise of ray traced images. Animal Logic is planning on doing something similar this month with Glimpse. With regards to adaptive sampling, “that is currently the most important technique we use to effectively target our rendering time towards removing visible noise,” says Heckenberg. “First implemented in the PRMan-based PHX physically-based shading system and then re-implemented for Glimpse, it provides a very simple and intuitive control for achieving consistent perceived noise levels.”
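As noted later in this piece, the artist-facing control for this is a single maxVariance setting. Below is a generic per-pixel sketch of that kind of adaptive loop; the estimator and stopping rule are standard textbook choices, not Glimpse’s actual heuristics.

```python
import numpy as np

def adaptive_pixel(sample_fn, max_variance, min_samples=16, max_samples=4096):
    """Keep drawing samples for one pixel until the estimated variance of the
    mean falls below max_variance (or a hard sample cap is reached)."""
    total = 0.0
    total_sq = 0.0
    n = 0
    while n < max_samples:
        s = sample_fn()
        total += s
        total_sq += s * s
        n += 1
        if n >= min_samples:
            mean = total / n
            var = max(total_sq / n - mean * mean, 0.0)   # per-sample variance
            if var / n <= max_variance:                  # variance of the mean
                break
    return total / n, n

# Noisy pixels keep receiving samples; smooth ones stop early.
rng = np.random.default_rng(1)
value, samples_used = adaptive_pixel(lambda: rng.exponential(0.5), max_variance=1e-4)
```

In practice a production renderer works on tiles and perceptual error metrics rather than raw variance, but the single-threshold idea is the same.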


Checkpoint
Glimpse runs on Animal Logic’s renderfarm, which means it needs to run very efficiently, since in 2016 Animal Logic will have production rendering being done for at least two major animated LEGO films, in addition to any VFX work they do. The scheduling software is also an in-house tool, and the team has already released a second version of this, with a third build planned to support the stop-and-start “checkpoint” feature that path tracers can offer. Given the nature of path tracing, it is possible to run a render for a certain amount of time, pause it, review, and then perhaps continue. This is a feature of, say, RenderMan, and one the Glimpse team aim to implement this year. Farm resources management is a key component of the technology agenda at Animal Logic, according to Griffiths.
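Checkpointing a progressive path trace is conceptually straightforward: persist the accumulation buffer and sample count, then resume by adding more full-resolution passes. A hedged sketch of that idea is below – the file format, function names and the render_one_spp stand-in are made up for the example, not a description of Glimpse or its farm tools.

```python
import numpy as np

def save_checkpoint(path, accum, sample_count):
    """Persist the running radiance sums and the sample count so the render
    can be paused on the farm and resumed later (path should end in .npz)."""
    np.savez(path, accum=accum, sample_count=sample_count)

def load_checkpoint(path):
    data = np.load(path)
    return data["accum"], int(data["sample_count"])

def resume_render(path, render_one_spp, extra_iterations):
    """Add more full-resolution passes on top of a checkpointed render.
    render_one_spp(width, height) stands in for the renderer itself."""
    accum, count = load_checkpoint(path)
    height, width = accum.shape[0], accum.shape[1]
    for _ in range(extra_iterations):
        accum += render_one_spp(width, height)
        count += 1
    save_checkpoint(path, accum, count)
    return accum / count          # current image estimate
```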
Colour Space
Animal Logic now uses an ACES-centric pipeline across all shows; this actually started with the original LEGO Movie. Currently, the surfacing and rendering pipeline uses a linear P3-D60 working space, which mixes the P3 colour primaries from the DCI spec with the D60 white point of ACES. Glimpse and all other image viewing applications use the ACES display transforms for viewing images, and ACES input transforms for interpreting live action material. As the renders flow down the pipe they are often “stepped up to ACEScg to hit notes that simply aren’t addressable using the more limited P3 display gamut, before being finished in ACEScc in DI,” commented the team at Animal Logic. Moving to ACEScg further up the pipe is the plan going forward.
Switching to the high dynamic range capable ACES display transforms years ago has meant that adapting existing content and workflows to modern HDR displays has been an easy move, and is an area that holds great promise.
Spectral Rendering
Animal Logic did look at – and rejected – the idea of building Glimpse as a spectral renderer. Unlike the makers of Maxwell and Weta’s Manuka, the team at Animal Logic did not feel that their production problems would benefit from wavelength specific sampling. Spectral rendering is not currently on their agenda (Weta’s Manuka was designed to be completely spectrally based).
Part of the logic of Glimpse is to build a fast renderer without bells and whistles – without the ability to do ‘anything’ or support everything. Unlike a renderer that services many productions worldwide, Animal Logic needs a very high quality renderer that quickly serves Animal Logic. This more stripped-down, fast, general approach may lack some flexibility, but it is also easier to learn, faster to run and better on memory consumption than many other renderers, according to Liani, who first proposed this ‘lean and fast’ model for Glimpse.
And the other LEGO Batman trailer (the even funnier one, with brilliant Glimpse lighting).
Conclusion
If you ask Max Liani what he is most proud of looking back over the development of Glimpse, it seems to be the journey itself. After all, Liani is not a PhD researcher – he has only a high school education – yet his work encompasses maths beyond most Masters students, and programming that is both effective and efficient, as it has to be to work in the trenches of hard core animation feature film rendering.
“Glimpse is fundamentally a tool designed for artists by an artist and that makes me very proud. It is very easy to use, it is fast and it scales exceptionally well to extreme complexity. It may not have all the features of some commercial products, but at the end of the day the three points I made here are what make the big difference,” he explains.
His advice or lessons learnt from his involvement with the project are:
“One cannot overstate the importance of simplicity! At first, artists were disarmed by the simplicity when they were trying to figure out how to get things done.”
For example, here are some real questions from artists learning Glimpse:
- How do I tune/optimize the number of samples on objects/lights/shaders?
- Answer: select “adaptive rendering” and the system will figure out where the noise is and how to get rid of it (this is how PHX already worked in 2011). Other renderers have a big flowchart on how to troubleshoot noise; in Glimpse you have one control for the whole scene, “maxVariance”.
- How do I get a custom matte out?
- Answer: add the name of the custom data you want to the comma-separated list of channels (AOVs). Done.
- How do I get a refracted matte of my custom matte?
- Answer: just add “refracted_” in front of the channel name.
Animal Logic has worked hard to be seen as a creative powerhouse and not just a ‘gun for hire’; it has made the transition that so many have tried and few have succeeded at: moving from vendor to author. But along the way it has also built an impressive team of researchers and developers. In theory one can produce art with off the shelf tools, but Animal Logic’s experience underlines the very nature of our business – the need to blend art and science. It is not just about intent, it is also about providing the tools to realize those intentions.