RenderMan version 24 delivers a suite of exceptional tools for advancing look-development workflows in feature film animation and VFX: new interactive rendering capability with the first release of RenderMan XPU, and new picture-making capabilities with RenderMan’s Stylized Looks. We sat down with the senior RenderMan team to discuss the state of RenderMan’s newest advances and the roadmap looking forward.
RenderMan is an Academy Award-winning rendering solution, excelling at producing a stunning range of professional imagery for feature film animation, VFX, and related projects. RenderMan has a long history of industry-leading, best-of-class innovation, but the move to GPU is a major step for the RenderMan team.
XPU is Pixar’s next-generation hybrid CPU + GPU rendering engine, rewritten for speed and efficiency on film production assets. This first phase of XPU is focused on accelerating look development for shading artists.
Until last week XPU was not included in Non-Commercial RenderMan, but for the next month you can download a version that includes a limited-duration free evaluation license. Unfortunately, this does not run on macOS; the GPU side of XPU supports only NVIDIA graphics cards on Linux and Windows. Users must have a Maxwell-class card or above and be using a recent driver. Given the memory constraints, it is best to use an NVIDIA card with as much memory as possible. Unofficially, a sensibly sized GPU for XPU in production would be a 24-gigabyte card.
Artists and TDs can shade objects in the context of their scenes, get approval, and send to RenderMan RIS for a perceptually identical final frame. Physically-based materials aid accurate lighting, especially in combination with the new Color Management support for ACES in look development workflows.
Why GPU now?
David Laur, Director of Product Management, believes there are several reasons why only now has RenderMan made the move to release XPU and embrace the GPU. The first he cites is a change in the programming landscape. “The general programmability of the GPU has come so far, the general CUDA environment makes all the difference in my opinion. And it really does let us do a lot more general-purpose things with the hardware than we ever could before.” The RenderMan team has traditionally been able to innovate much faster on the CPU side of the equation, largely because of the programming environment.
XPU is not just a GPU version of the CPU renderer; it is an integrated hybrid CPU + GPU rendering engine. As such, Pixar wants a general-purpose solution and one that addresses the needs of its users. That need has often been memory-limited; memory has traditionally been a gating factor for RenderMan. “For years it was that the scene wouldn’t fit in memory. And not only that, the production shaders at Pixar run thousands and thousands of lines long. They were an order of magnitude bigger than what you’d want. They could not even hold the shader cache, and yet the great strides that have been made by hardware vendors on those fronts are amazing.”
Mark Manca, RenderMan product manager, points out that from its earliest days RenderMan was designed with a focus on addressing and supporting scene complexity: ”RenderMan from its earliest days back in the eighties, was about scene complexity and being able to scale up to infinite levels of geometric and shader complexity on limited machines.” While the parameters of the problem have changed with the advent of ray tracing and modern shaders, there are still definite constraints on solving general rendering on GPUs. ”It’s not the same as in the REYES days, but if you look at the memory footprints of a modern Pixar film, … the average is starting to fall into line with the very high end of memory that’s available these days,” he adds. Thus, the issue has been less about whether Pixar’s RenderMan team could produce film-quality output and more about waiting until GPUs could support the scale of shot typically required in high-end production. “It’s tantalizing right now,” comments Manca. “In the past, if you had a two-gigabyte card, you are not going to be able to do very much in terms of what we would consider production-level complexity, but now the new offerings mean you could put a substantial number of Coco or Toy Story 4 shots onto a GPU.”
Coupled with this are the economics of building GPU farms and, more recently, supply-side issues in obtaining large quantities of chips. This is why the current iteration of XPU for lookdev makes sense: lookdev artists traditionally have the more expensive workstations with the latest NVIDIA cards. The characterization of XPU as an ideal lookdev tool is born from Pixar’s own experiences. Of course, anyone can use XPU, but Pixar found that the lookdev team at Pixar could take real advantage of XPU and gain real productivity. For example, while dealing with large numbers of lights is on the roadmap, it is not fully implemented yet, “but lookdev doesn’t suffer from those issues, so that team is a good match for the current technical specification,” adds Laur. But lookdev does need lighting context. He believes that light transport needs to inform a lot of departments. He points to a film noir scene where the character has to act in a sharply defined shadow. Here the light “is part of the performance of that scene. The animator needs to know where that shadow falls. We wanted to bring light transport to places where the GI (Global Illumination) world doesn’t provide it,” he states. With XPU, lookdev can take advantage of accurate interactive lights. Animators can judge sheen on the skin of a character or the material of a car. “That’s at the beginning of that idea of lighting context coming to those departments to let people do their artwork in the context of lights,” he explains. “Luca is a good example where they really have these particular materials they need to hit, and they need to see them interactively, not just a proxy. They need to see the materials executing for real, and that’s our motivation.”
The first phase of XPU delivers a complete feature set for look development. Some features, however, are not supported yet and will come as XPU matures to replace RIS. For example, XPU does not yet support Lama, the new modular layered material system for physically-based look development, which came about from a close collaboration with Industrial Light & Magic. ILM uses the advanced Lama system for its final material creation, and as such it has been battle-tested in some of the most ambitious feature film VFX. Pixar does expect to support Lama in XPU soon. Other major restrictions in this current XPU version are the lack of AOVs, LPEs, and Deep Output.
Pixar showed XPU strongly at the RenderMan user group (Science Fair) in LA at SIGGRAPH 2019, so some expected it to be released last year. The delay was due both to some unexpected software development issues and to production pressures: “some of it was just production requirements. Pixar film production ramped up rapidly (Disney+) and we ended up working on some new RenderMan RIS developments that just required the same development people.”
XPU supports custom OSL patterns in the same way that RIS does. In fact, most of the patterns that Pixar ships in RenderMan 24 have been converted from C++ to OSL so that the same code runs in both RIS and XPU. XPU does not currently support customization beyond OSL patterns.
GPUs are often thought to be a good fit for ray tracing, as the problem of solving the render equation with rays seems at first glance to be an inherently parallelizable problem well suited to a GPU architecture. While ray casting is a good problem for the parallel processing of a GPU, “global illumination with less and less coherent bounces, the deeper you go, is not actually as parallelizable as you might want,” points out Laur. “The shading coherence actually falls off rapidly.” Rather than RenderMan’s move to ray tracing somehow enabling the use of the GPU, Laur believes one could argue the old RenderMan REYES was more amenable to massively parallel programming than a modern full ray tracer.
Machine Learning Denoising
In most modern renderers, noise reduction and other important improvements have become possible with the inclusion of various deep learning algorithms. “Machine learning is definitely a big topic that we discuss internally. De-noising is the biggest topic by far,” explains Oliver Meiseberg, Vice President of RenderMan. The RenderMan team is currently working to productize the impressive Disney denoiser that Pixar’s production team uses internally at the studio. This will eventually become a part of the RenderMan package. “This has definitely become an interesting topic, but it’s more for final images, and the ability to run footage through a de-noise; it’s not so focused on interactivity at this point.” For general workflow, RenderMan still has a standard denoiser, but in the Pixar research group the company is exploring more and more ways to leverage machine learning.
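As a point of reference for what any denoiser is doing, the sketch below implements the simplest possible spatial filter in Python: averaging each pixel with its neighbors. Production denoisers, including the learned Disney denoiser discussed above, are vastly more sophisticated (feature-guided and temporally stable); this is purely an illustration of the basic trade of high-frequency noise for neighborhood averages, and all names are illustrative rather than RenderMan API.

```python
def box_denoise(img, radius=1):
    """Naive box-filter denoise for a 2D list of grayscale values.

    Each output pixel is the mean of the input pixels within `radius`
    of it (clamped at the image borders). Real production denoisers
    use learned, feature-guided kernels instead of a fixed box.
    """
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += img[ny][nx]
                        count += 1
            out[y][x] = total / count
    return out
```

On a noisy checkerboard of 0s and 1s this pulls every pixel toward 0.5, which is exactly the blur-for-noise trade-off that smarter denoisers try to avoid paying.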
Manca points out that, fundamentally, XPU is the same renderer, but now “you’re using it in an interactive context and in the final frame…XPU is about not compromising on the richness of the feature set that path tracing offers.” This means that lookdev work done in XPU never needs to be redone. The team aimed to make sure that nothing needs to be modeled in a game engine and then redone again later. While they seek to make XPU as fast and as interactive as possible, the team is not targeting the interactive XPU “to ever be real-time as an actual goal,” states Laur. Yet real-time (30fps) sync-locked rendering is an important production consideration. Disney sister company ILM is at the forefront of LED stages with StageCraft. Bridging accurately from RenderMan to StageCraft is therefore still an issue, since the LED stages are not planned ever to run RenderMan XPU. One step towards this is robust support in RenderMan 24 for an OpenColorIO ACES workflow. While real-time engines often need to compromise on model and texture complexity to maintain perfect real-time performance, XPU sacrifices 30fps for the exact final production models and shaders: interactive, but not sync-locked real-time.
For its whole history, RenderMan has reflected its developers’ views of where the hardware industry is placing its R&D effort. As much as the renderer is an encapsulation of the algorithmic rendering zeitgeist, it has been moulded by perceived trends in hardware. These included the move to 64-bit computing, multiple CPUs in the same machine, faster memory, networks, and storage, and now the move to both the GPU and the cloud. “Starting four to five years ago, it was obvious that there were fears around Moore’s law flattening,” comments Laur. “It became obvious to us that the multi-CPU machines and general-purpose GPU type solutions were going to be where the hardware R&D dollars were being spent by the hardware vendors that supply our industry.” The solution inside the Pixar team was to make sure that their architecture could leverage multiple kinds of computing hardware, be agnostic, and not limit RenderMan. “And that idea easily translates to the cloud and I think that’s where the hardware resources of the next three or five or 10 years are going to be located, and we need to be pursuing it,” he adds.
Laur believes they are currently in the “fourth evolutionary era of the RenderMan code base, and there is zero expectation that it is the last one.” XPU on the GPU is neither the only future for RenderMan, nor is it a side project or add-on. It is easy to judge XPU by the massive speed improvement in its current solution space, but the RenderMan team is also committed to the CPU. Laur points out that the CPU-only RIS architecture “is really fast now compared to even what it was years ago. And so we’re not interested in just making RenderMan go faster. We’re looking forward to architecture around the platform with an ability to encompass multiple devices in the computation of one frame.”
An important and creative Look Development addition is Stylized Looks, which allows artists to move beyond physically based shading into a variety of stylized looks with camera- and lighting-controlled styles. In RenderMan 24 these are included as integrated, non-destructive tools to control outlines, create sketch patterns, and develop a wide range of unique looks, including anime and hand-drawn pen effects. Importantly, these are much more than either a pattern on the 3D element or a complex image-processing kernel applied as a post-process. Artists can render images that look like cartoons and illustrations; such effects are otherwise known as non-photorealistic rendering, or NPR.
Far more than just producing a toon look or hand-drawn illustration impression, Stylized Looks allows an artist to draw attention to different aspects of the story, in much the same way artists have long done by controlling depth of field and lighting. In fact, central to the power of Stylized Looks is the ability to control the look using lights. This produces a much richer visual language than just adding a drawn-type texture or post-processing the scene with image filters.
Christos Obretenov is the founder and shading architect at Lollipop Shaders, part of Lumnance Software, which helped develop and integrate the new tools with the RenderMan team. As both a designer and developer of 3D shading plugins, as well as a rendering consultant, Obretenov has worked with both the Pixar team and other high-end feature film rendering teams. For example, he was previously commissioned as principal custom shader writer of non-photoreal rendering tools in OSL for Arnold and Nuke on the feature film The Mitchells vs. the Machines at Sony Pictures Imageworks. Obretenov is a long-time friend of fxguide and a trainer at fxphd.com, so we sat down with him to discuss the new highly controllable and configurable Stylized Looks that are fully integrated into RenderMan version 24. Obretenov points to the way the film Spider-Man: Into the Spider-Verse creatively generated a lot of interest in exploring very stylized 3D animation rendering in high-end feature films.
This newer creative interpretation extends beyond materials to the light sources in the scene: “One of the things that is really cool is that you can attach one of these nodes, such as the hatching effect, to a specific light source,” he explains. This allows an artist to push a specific Stylized Looks node through a specific light, and not just the light as a whole but any aspect of the light. This means the effect can be derived from the direct diffuse, the specular component, the refraction, the subsurface, or whatever. “It’s really robust in that way, which is exciting, because there are just so many directions an artist can go.”
The scene can be lit in a physically based way, with a normal set of lights in RenderMan 24’s regular RIS path tracer, “but then the artist can say, this light and this light only I’m going to push as stylized hatching, but here I will use a light mask switch, so it is only affecting where this light is directly shining,” Obretenov explains. “And then one can basically fade into the regular physically-based render in the rest of the scene, which is really cool because as you move that light around you can really mix and match.” This gives the cinematographer or lighting artist the power to not just adjust a material property but change the render output based on how light falls and interacts with different things in the scene. “Artists can get super creative mixing and matching between physically-based rendering and original stylized looks, all in RenderMan. Old-school renderers did basic toon rendering, but this is a lot more sophisticated. It’s based on full 3D path tracing, but with mixing and matching. So now you can affect maybe something that is the refraction of a window.”
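Conceptually, the fade Obretenov describes behaves like a per-pixel blend between the physically based render and the stylized pass, weighted by a mask that is 1.0 where the chosen light directly shines and falls to 0.0 elsewhere. The sketch below is only an illustration of that mixing idea, not RenderMan’s actual API; the function and parameter names are invented for this example.

```python
def mix_stylized(physical, stylized, light_mask):
    """Blend a stylized pass into a physically based render.

    All three arguments are flat lists of per-pixel values. Where
    light_mask is 1.0 the stylized pass dominates; where it is 0.0
    the result fades back to the physically based render.
    (Illustrative only; names are not RenderMan API.)
    """
    return [p * (1.0 - m) + s * m
            for p, s, m in zip(physical, stylized, light_mask)]
```

Moving the light moves the mask, so the stylized hatching follows the light’s direct illumination while the rest of the frame remains a conventional path-traced render.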
Dylan Sisson is an artist and designer at Pixar, known worldwide for his advocacy of RenderMan in his role as Marketing Manager, and for his VFX lectures, creative direction, and software consultation. During COVID, Sisson produced a set of animated shorts and product demos using Stylized Looks, while in near-constant contact with Obretenov. This proved a highly valuable collaboration, as Sisson pushed the creative boundaries and Obretenov worked to keep up with new artist-led innovations. The result is that Stylized Looks is highly integrated, extremely flexible, and easy to use.
Other new features for lookdev include:
● Dispersion — The new layered materials system supports a sophisticated prismatic fringing effect for refractive objects.
● USD — hdPrman includes dynamic rendering of LPEs and AOVs in all compatible Hydra viewers, such as Houdini’s Solaris and USDView.
● Bump Roughness — An innovative system developed at Pixar Animation for rendering micro details such as scratches efficiently.
● Bloom — Add gleams and blooms directly to your live renders with this physically-based tool for RenderMan’s Image Tool (IT).
● Live Statistics — Look development is now complemented by real-time statistics, for keeping scenes efficient.
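The dispersion feature listed above relies on the index of refraction varying with wavelength. A standard empirical model for this from optics is Cauchy’s equation, n(λ) = A + B/λ². The Python sketch below is generic physics, not RenderMan’s implementation; the coefficients are approximate values for a BK7-like glass, and sampling the equation at nominal red, green, and blue wavelengths gives the per-channel IORs that split refracted light into prismatic fringes.

```python
def cauchy_ior(wavelength_um, a=1.5046, b=0.00420):
    """Cauchy's empirical equation n = A + B / lambda^2.

    wavelength_um is the wavelength in micrometers; the default
    coefficients approximate BK7 glass. Generic optics sketch, not
    RenderMan's dispersion implementation.
    """
    return a + b / (wavelength_um ** 2)

# Shorter wavelengths see a higher IOR and so bend more, which is
# what separates a refracted highlight into colored fringes.
ior_r = cauchy_ior(0.650)  # nominal red
ior_g = cauchy_ior(0.550)  # nominal green
ior_b = cauchy_ior(0.450)  # nominal blue
```

Rendering the red, green, and blue channels with these three different IORs (rather than one shared value) is the essence of a prismatic fringing effect.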
Pixar provides artist-friendly, deeply integrated RenderMan interface plug-ins for Maya, KATANA, Houdini, and Blender. Cinema 4D and 3ds Max are not yet supported as integrated DCCs. The full RenderMan with XPU costs $595 per floating license, providing access to either the artist interface or the batch renderer.