While not limited to games, VR has moved back into the news this week with GDC underway, bringing a sweep of technology, gear and project announcements. From game engines to new headgear, and coming off the strong content discussions at the VR special event at Sundance last month, VR remains the hot topic for gamers, geeks and VC funding.

"VR - it feels like it is 1993 again - it is like ILM has just done Jurassic Park and the whole world  is ready for a new way of seeing things, and I think VR and interactivity is bringing that new way to story tellers and everyone is again trying to work out the new problems and seeing what can be done," EPIC Games CTO and former ILM VFX supervisor Kim Libreri told fxguide. At GDC this week Libreri is joining Alasdair Coull, Head of Research & Development at Weta Digital in showing one of the most impressive technology demos of the show - a VR immersive experience in Smaug's chamber from The Hobbit. It uses the full power of the Weta digital assets, the brand new (unveiled today) Nvidia Titan graphics card and the latest EPIC Unreal Engine (see content section below).

The Viewer

Oculus VR is showing its new "Crescent Bay" headset at GDC. This is the third major developer rig, following on the heels of the DK1 and DK2 Oculus Rift kits. It was announced last year and shown earlier at CES in January.

Oculus Crescent Bay.

The related Samsung Gear VR headset is "powered by" the same VR technology used by Oculus VR. The rig is driven by a Samsung Galaxy Note 4 handset slotted into the headset, and a revised version is coming that will work with the Samsung Galaxy S6 (Samsung has packed 77 per cent more pixels into the S6 than the Galaxy S5: 1440 x 2560, 577 PPI).

The big difference between using, say, the DK2 and the Samsung Gear VR is head movement vs. resolution. The DK2 is really not high enough resolution in our view; it looks pixelated and subtle detail is lost far too easily. The fanboys love it, but for anyone from the high end effects business it is poor on image quality. However, it has positional head tracking, which is incredible. While both will follow your head as you turn and look up or around, the DK2's sensor (sitting, say, on top of your laptop or desktop screen) means the Rift will also allow you to lean around objects.

Samsung Gear VR.

We tried an unreleased demo of a new VR experience which did involve traditional edits. At one point you cut to a position behind cover (with someone out there pointing a gun at you), and of course you lean around to sneak a peek. Having the VR experience allow that poking or leaning motion is a huge improvement over just standing in a spot and looking around. In short, if your material is, say, filmed or pre-rendered 360 degree imagery that is gorgeous, you will want higher resolution than the DK2 offers, and the Samsung is certainly very popular for exactly that. If you want action and movement, you'll lean towards the dedicated head gear with positional tracking.

In both cases, to work well and provide immersion in your virtual reality world you are going to want a high frame rate (at least 95 fps) as well as low latency (plus a pixel persistence lower than 3 ms to avoid nausea when moving the head around). This is where the newly announced Sony Morpheus comes into play. Not expected until Q2 2016, the Morpheus has a new OLED screen that can update at 120 frames per second, making for images that are smoother than anything seen thus far. It also has a wider 100-degree field of view, thanks to the device's 5.7-inch screen. The OLED display is 1920 x 1080, with very low persistence (much better than the motion blurring that often happens with older LCD screens). And the new design includes nine LED trackers to provide 360 degree tracking, according to Sony. But the important aspect of the Morpheus is that it is a console gaming VR rig. The DK2 is a PC tool right now and the Samsung is a mobile solution.

The Sony specs are impressive. By comparison, the Oculus Rift DK2 has a single 1080p panel that can refresh only at 75 fps, and the Samsung Gear VR has a 2560 x 1440 display but can only go as fast as 60 fps. Then again, you can get your hands on a DK2 or Gear VR right now; the Sony is still just a promise and a prototype.

Sony Morpheus.

Also entering the fray is the Valve/HTC solution, announced this week. The device is called the HTC Vive, and it combines HTC's hardware with Valve's Steam VR technology. Valve is a powerful force in the non-console gaming industry, so while the device looks similar to other VR headsets, it is aimed at PC gaming. The Vive is dotted with sensors - there are over 70, according to the initial announcement - which allows for full "3D room tracking", or what HTC calls a "Full Room Scale 360 Degree Solution with Tracked Controllers."

HTC CEO Peter Chou announcing the Vive on stage.

It is believed that the HTC unit will run at 90 frames per second, but with very good sound. The Morpheus, by contrast, has no built-in headphones. The Vive headset deploys two 1200 x 1080 displays. The HTC team will also be selling "wireless VR controllers" along with the headset. The company has said publicly that it is partnering with Google, HBO and others to make content for the device. "A Developer Edition will be available in the spring, with a Consumer Edition coming by the end of 2015," according to Ars Technica.

HTC Vive.

The gear is critical, but less often mentioned is what it is attached to - or not attached to, in the case of the Samsung. There are clearly three choices: a PC (which allows for huge processing and display grunt with the right graphics card), a console with the appropriate game engine, and finally the standalone processing of a smartphone. The last of these options might sound the least attractive, but mobile is huge, and the vast production scale of smartphones makes their manufacturing costs incredibly impressive.

As of February this year at VRLA in Santa Monica, it was estimated that there were a million headsets in the field. With only a million headsets, generating a business from content is hard, even with the reduced costs of the Samsung option. But even here there is one brilliant innovation: a cardboard headset. For some time now Google has been supporting and selling a cardboard rig that can house an iPhone or Google phone (with much less content available for the iPhone). This cardboard rig is able to use the graphics of, say, the iPhone's retina display to produce remarkably good VR experiences. Sure, you still need to plug in your earphones and actually hold the rig to your head - there is no strap or support. But with so little content to warrant the average consumer buying a dedicated rig, a cardboard rig might just be the facilitator that makes the whole market explode. At VRLA one developer described having huge success with a free game with hundreds of thousands of downloads - so they decided to sell their next title for $1. They grossed a total of $700...

Content

If selling games or content is not financially supporting content generation, what is?

One of the most successful companies in VR is Framestore - originally from the UK but now a global, Oscar-winning force with a dedicated VR team. Framestore will be the first to admit the market is in its infancy and that these really are very, very early days, but they have had huge success with specialist projects such as their Game of Thrones experience. Framestore worked with Relevent and HBO to create a breakthrough immersive VR experience. Utilizing Oculus Rift VR headsets and the Unity game engine, along with some wind machines and rumble packs, a user could enter the elevator in Castle Black and then ascend "The Wall. All 700ft of it. Then, at the top, you get to walk the wall and see The North - with a twist," according to a Framestore release at the time.

They also built a 360 degree immersive Interstellar experience. Framestore's digital team recreated a large chunk of the Endurance spaceship from Christopher Nolan's film. This time it was built entirely in Unreal 4, and experienced with the Oculus Rift DK2.

Both of these projects were non-download projects, since the graphics hardware needed to run them was off the shelf but very high end - using graphics cards far more powerful than an average user would typically have, according to Framestore's Mike Woods. From their New York office, Woods has been the driving force behind much of Framestore's impressive work in this space. As founder of the Digital Department and their dedicated VR Studio, he has also run their successful real time interactive ads program, with clients such as Coca-Cola, GEICO, Penguin and Beats by Dre. But Woods hopes that VR will be used for more than just advertising or trailers. He believes we are yet to see the 'killer' VR piece - a definitive experience demo or breakout narrative film. You can hear more from Mike Woods in next week's fxpodcast, where he sits down with fxguide's Mike Seymour to discuss the industry as a whole and Framestore's work.

Narrative short films are possible, and New Deal Studios have produced an incredible war VR adventure called The Mission, filmed as live action in partnership with Jaunt VR. We covered this great live action piece on our fxphd blog. But this project was only possible due to the support of Jaunt VR who are developing an entire production pipeline from camera to VR projection for live action production.

These two projects flag a possibly massive market for VR: immersive trailers. Given the need for high quality assets and production values, but the lack of viewing revenue, one can only imagine that any major Marvel or Disney franchise could be exploring VR as a great new form of downloadable immersive trailer, or rather environment. The format works: most current VR experiences are short format, so the file sizes remain downloadable; there are clearly budgets for trailers; and a VR/film promotion could be highly engaging and popular.

For the film The Wild, a VR experience was created showing British Columbia's raw wilderness: viewers can visit the remote coast and lush mountains of BC's Great Bear Rainforest. The project pairs stunning natural scenery with the Oculus Rift to create an interactive and immersive travel piece, while also promoting the film.


For many working in this space the key would not be to re-edit or just reuse the actual trailer but extend the narrative and allow for a trans-media experience where the VR informs the fan in ways beyond the basic trailer. VR requires its own set of rules and narrative tools so just showing the same footage in the round is not using this new medium to its fullest.

This is the logic Weta Digital and Epic Games applied to their Smaug experience, "A Thief in the Shadows". One does not see the scene as it was made for the film; instead you experience being discovered, nearly crushed and finally burnt to death in Smaug's presence. The original Smaug assets were accessed and the vast complexity of the treasure room full of gold exploited, but you can now experience it for yourself, rather than just watching Bilbo Baggins have all the fun.

A frame from A Thief in the Shadows... (right before you die!)

The Epic/Weta Hobbit experience is designed to take full advantage of the new Oculus Rift "Crescent Bay" headset, but also of all the Hobbit's great assets, including complex sound design. Interestingly, when Park Road Post was mixing the VR experience they were faced with something of a sound challenge: the character is very much with you and right in front of you, not up on a distant cinema screen, and roaring not from one spot but from a huge, wide (giant) dragon mouth. The sound team had to place the various frequencies of that roar in different audio spatial regions relative to the user. Smaug's dialogue is re-cut and re-imagined from the original recordings - so the voice is real, the music is also from the original and the experience is incredible. The demo's spatial audio effects, which the Epic team integrated into the experience, were driven by the latest Oculus Audio SDK.

As the experience starts, Smaug can't see you - you are very much in the shadows - but as he moves around he knocks over a vast pillar which narrowly misses you, only to then confront you and finally burn you to a crisp! In the VR head space Smaug's giant face comes up very close to you, until he realizes you're after the Arkenstone or Heart of the Mountain, a wondrous stone he is not best pleased to discover you are there to steal! The massive gold treasure hall, the atmospherics and fire, and of course the amazing character animation of the Weta Digital team all posed their own challenges.

The Unreal Engine 4 was running on a PC with a next generation, previously unannounced Nvidia Titan X card, which gave the team a performance boost compared to a regular GTX 980. Titan X is built on Nvidia's Maxwell architecture, has eight billion transistors and a 12GB framebuffer, and took thousands of engineer-years to build, according to its press release this morning at GDC. The new card provided more RAM, more cores and overall more performance, along with improved stereo driver software that allows such an incredible VR render quality. "I think on this demo compared to our last we saw a 30% performance increase," says Libreri. "So much has changed so it is not just the card, but the card is great - it has boatloads of texture memory, so you can store some pretty amazing assets, and then Alasdair and the team came up with a very clever way of encoding the face deformations which would take advantage of that memory in a way that you would not do on a normal console - but because we had these amazing cards from Nvidia - we could."

There were some custom modifications to Unreal Engine 4, "to be able to support the quality of this performance," adds Libreri. "He has his facial animation from the movie; Weta have this amazing flesh and simulation pipeline for creatures, which they use to author content and which we then basically stream live in our engine."

Weta's Alasdair Coull explained that "we tried to use our pipeline as much as we can, but we have built a lot of stuff over the last year or so for post-production and for getting large numbers of real time assets on the MoCap stage (at Weta) - and we now have a lot of tools for building that content. We repurposed a lot of those tools for this VR project; it was an exploration for us to see what we could do with a lot of the real time technology we had developed."

Weta was keen to work with Epic, and Coull noted that the facial animation was a particularly complex solve, "as we used the technology we developed for large crowds of characters, so we could take the hero Smaug facial animation and bake it down in a way that makes it run in real time. We leveraged our technology and then wrote some plugins for the Unreal Engine to see how that would work and look."

In their film pipeline Weta uses their Gazebo renderer for fast rendering and, of course, Manuka for the final renders, but in moving to Unreal, Coull pointed out that the "shader model in Unreal is nicely similar to where we are coming from, so it was a pretty nice fit to try these things out." As Weta already has assets designed for real time preview, "it made it a natural fit," adds Libreri, to move to building a real time VR experience using a premium game engine such as Unreal.

Of course, theory is nice, but the team were still trying to render high end assets from one of the most sophisticated visual effects companies in the world - in high resolution stereo, in real time. "We only have 11 milliseconds to render a stereo pair; it's a tough world getting something with 400,000 skin verts performing through this scene with lights, shadows, particles, coins, fire and all sorts of crazy things!" comments Libreri.
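That 11 millisecond figure follows directly from the target refresh rate: the headset's refresh rate sets the time in which both eye views must be rendered. As a quick back-of-envelope sketch (plain arithmetic using the refresh rates quoted in this article, not vendor-measured figures):

```python
# Per-frame render budgets implied by the refresh rates mentioned in this article.
# Simple arithmetic only - not measured numbers from any headset vendor.
refresh_rates_hz = {
    "Samsung Gear VR": 60,
    "Oculus Rift DK2": 75,
    "HTC Vive (reported)": 90,
    "Sony Morpheus": 120,
}

for headset, hz in refresh_rates_hz.items():
    budget_ms = 1000.0 / hz  # time available to render BOTH eye views of one frame
    print(f"{headset:>20}: {budget_ms:5.2f} ms per stereo frame")

# A 90 Hz target gives 1000 / 90 = 11.1 ms, which is Libreri's 11 millisecond figure.
```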

The lighting was an interesting challenge as most game engines do not have high end area lights of the style Weta use in their film pipeline. The effects are also complex, the coin sims and the fire sims in particular. "You can't in real time do full volume fire simulations," explains Libreri, "or rigid body dynamics like you can in an offline feature pipeline."

The solutions they came up with were not unlike old school VFX from earlier in both Weta's and Libreri's careers. "Like old school visual effects from 10 years ago," says Libreri, "before we had fancy fluid solvers and volumetrics, we used to cheat a lot in the film business - to the average viewer I don't think they could tell - maybe a high end VFX person can." It certainly sounded like Weta and Epic enjoyed dusting off some old techniques.

For example, the lack of area lights was solved with a cascading set of point source lights that are timed to Smaug's movement. "We had this almost theatrical lighting solution," describes Libreri, "and you know theater is not a bad analogy for VR - as we don't have cuts and it all has to work for the audience... it all has to be one continuous performance, and everything had to be re-done as one big thing that you could look at from any angle." Coull agrees: "That was one of the most difficult things - to capture the moody lighting of the film."
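The article does not spell out the exact rig, but the general trick of faking a travelling soft light with a chain of point lights is easy to sketch. In the hypothetical snippet below (illustrative values only, not Epic or Weta's code), each point light placed along the character's path ramps up and down as the character passes it, so the overlapping pools of light read as one soft, moving source:

```python
# Hypothetical sketch of a "cascading" point-light rig: point lights spaced
# along the character's path fade in and out with the character's position,
# so together they approximate a large, moving soft light source.
def cascade_intensities(light_positions, character_pos, radius=4.0, peak=2000.0):
    """light_positions: positions (in metres) of the lights along the path;
    character_pos: current character position along the same path;
    radius: distance over which a light falls off to zero;
    peak: intensity of a light when the character is directly on top of it."""
    intensities = []
    for p in light_positions:
        # Linear falloff with distance from the character, clamped at zero.
        weight = max(0.0, 1.0 - abs(p - character_pos) / radius)
        intensities.append(peak * weight)
    return intensities

# Example: ten lights spaced 2 m apart, with the character at the 7 m mark.
lights = [i * 2.0 for i in range(10)]
print(cascade_intensities(lights, character_pos=7.0))
```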

The character itself was animated with Weta's traditional pipeline which included the skin and muscle simulations and then "what we took was some of our facial animation technology and found a way to reduce the complexity, while still capturing the same surface complexity," says Coull. "In a way that could be compressed down to a hero bake that would run on the GPUs."

Clearly, a regular games company would just not be able to produce this level of real time character; this demo is, rightly, very special. Still, it is impressive that Unreal 4 can scale to handle such large data streams and complex rendering - the game engine is punching well above the weight of any normal AAA console game to achieve this demo.

The fire was rendered onto cards using some deep comp Weta shader tricks, and combined with actual live action fire. At its peak, as seen from the viewer's position, the fire is six layers deep in the demo space. Interestingly, while the Epic team used their latest subsurface scattering (SSS) algorithm (from Unreal 4.6) for the project, Smaug himself does not have much SSS, so this presented no major challenge, but it was key to getting Smaug's eyes and teeth correct.

You can currently only see Smaug in VR at GDC on the Crescent Bay.

The team built this with the Oculus Rift, and while this was possible due to their co-operation, the technology and the Unreal Engine are relevant to any VR head gear (so long as it has a kick-arse high end PC with the world's newest top end graphics card attached to it!).

Perhaps the most remarkable aspect of the demo is that this only started in mid-December after the last Hobbit film shipped and only really got fully under way in January (after changing the whole creative concept on Jan 8th). So the whole project was done in less than two months!

The Tools

New Deal Studios shot with a circular 360 degree Jaunt VR camera rig on set of The Mission.

Many key visual effects companies such as Weta, Framestore, New Deal Studios and others are exploring this space. Digital Domain, for example, produced a complex narrative scene with visual effects supervisor Janelle Croshaw that exploits the DD pipeline from effects to complex skin shaders, and the tools are starting to appear to help these effects companies extend their expertise. Croshaw, who has been at DD since 1999 and worked on films such as Her and The Curious Case of Benjamin Button, as well as the groundbreaking and award-winning Tupac hologram at Coachella, used a primarily feature film pipeline.

New Deal Studios director Matthew Gratzner talking at Digital VR - LAVR last month about The Mission VR.

Digital Domain's spectacular film was shown at Sundance, as many traditional content and visual effects teams connected with Hollywood and independent filmmakers alike.

DD's beautiful imagery for Chris Milk's Evolution of Verse shown at Sundance.

Chris Milk directed Evolution of Verse; his previous virtual reality film, Sound and Vision, played at the 2014 Sundance Film Festival. This time he worked with Digital Domain, film production company Annapurna Pictures and VR production company VRSE.works to create a photo-realistic, CGI-rendered 3D virtual reality film, with the digital animation and compositing done in NUKE by Zachary Cole and Vinh Nguyen using DD's NUKE tools.

Companies such as The Foundry are working closely with an array of their existing clients to address the VR market. The Foundry R&D team has been exploring VR for some time, having briefly shown their work at SIGGRAPH last year, and with NUKE's strong, flexible pipeline - including the OCULA disparity engine and stereo tools - it would seem VR tools in NUKE would be a natural extension.


When VR footage is filmed on set with a rig, the current solution is effectively a cluster of cameras pointing in every direction. From this it is fairly easy to imagine software stitching together a spherical map of the set. It is a bit more difficult than that, as the cameras are not nodal rotations around a point (as one does with fisheye HDR captures) but rather a ring of cameras with a radius set by the physical size of the cameras themselves. Even a GoPro rig has 6" of offset from one side of a cluster of GoPros to the other. What is less easy to imagine is how that circular reconstructed or stitched image will be resolved into a stereo pair for each of your eyes in the head gear. In effect, to get a stereo pair you have to invent a view that no camera ever actually saw. It is stereo reconstruction at an order of difficulty above just matching two misaligned cameras in a stereo rig. This is why the ability of OCULA to produce disparity maps is such a vital technology. The Foundry will be featured in an upcoming article here at fxguide, but this work on 360 degree capture extends even further than VR for the team at The Foundry - into virtual set reconstruction and on set digital interactive pre-visualisation.
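To make the view-invention problem concrete, here is a heavily simplified sketch (NumPy, and emphatically not The Foundry's code) of what a per-pixel disparity map buys you: given one camera's image, you can forward-warp its pixels to approximate a view from a position no physical camera occupied. Real tools then have to fill the occlusion holes this leaves and cope with vertical parallax, which is where most of the hard work lies.

```python
# Simplified novel-view synthesis from one image plus a disparity map.
# Illustrative only - production tools handle occlusions, filtering and
# vertical parallax far more carefully than this.
import numpy as np

def synthesize_view(image, disparity, alpha=0.5):
    """image: (H, W, 3) array from the left camera.
    disparity: (H, W) horizontal pixel offsets between the two cameras.
    alpha: 0.0 returns the left view, 1.0 approximates the right view,
    values in between invent a viewpoint no camera actually captured."""
    h, w, _ = image.shape
    out = np.zeros_like(image)
    filled = np.zeros((h, w), dtype=bool)
    xs = np.arange(w)
    for y in range(h):
        # Shift each pixel horizontally by a fraction of its disparity.
        target_x = np.round(xs - alpha * disparity[y]).astype(int)
        valid = (target_x >= 0) & (target_x < w)
        out[y, target_x[valid]] = image[y, xs[valid]]
        filled[y, target_x[valid]] = True
    # Pixels never written (filled == False) are occlusion holes to inpaint.
    return out, filled
```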


Another company keen to make sure its tools can work in the VR environment is the Chaos Group. They recently invested several million dollars in Nurulize Inc., a virtual reality software developer. Founder Vladimir 'Vlado' Koylazov discussed this with fxguide two weeks ago, saying:

"We find this whole thing very interesting," says Vlado. "We have always been curious where the whole VR thing will play out. We have all watched The Matrix and thought about being in an environment and moving around, so when this opportunity came up we thought it would be really great, plus we want to make V-Ray suitable for this type of application."

Nurulize are at GDC showing off their amazing real-time 4K demo of the RISE VR experience on the Oculus DK2.  (They plan to make it a free download very soon).

Side Effects Software has been working hard to further serve the gaming/VR space as well, with this week's announcement of the public beta release of Houdini Engine for UE4. The plug-in will be made available for free in early April, providing game makers with the power of Houdini's procedural technology working inside the Unreal Editor. This latest Houdini Engine plug-in joins a growing list of supported apps, currently including Unity and Autodesk Maya (with 3DS Max and Cinema 4D coming down the road).

“Many developers have already leveraged procedural techniques in building their content, and tools like Houdini will enable even more possibilities,” says Ray Davis, Unreal Engine General Manager, Epic Games. “By combining the power of UE4 with Houdini technology, developers will be able to create richer experiences - quicker than ever.” Houdini Engine for Unreal Engine is, of course, being shown at the Side Effects booth [#224] at GDC this week, and is expected to ship in April.

There is almost no serious effects vendor who is not at least exploring both games and the VR space.

Capture

While projects such as A Thief in the Shadows use 3D animation, there is also the option of filming: rigs range from basic six-camera GoPro setups to multiple Red EPIC cameras and Lightfield technology.

Filming panoramic, high-volume environments is not new. For example, Greg Downing of xRez Studio, formerly of Rhythm & Hues and Sony Imageworks, has been doing dense environment capture for some time, and not just for the film industry. While Greg and his business partner Eric Hanson (formerly of Digital Domain) have 'destroyed' New York over six times in their VFX careers, today they are building on their vast environment work to make incredibly impressive VR experiences.


Their skills in this area were developed on massive projects outside the film industry, for example the 2008 Yosemite Extreme Panoramic Imaging Project, a partnership between geologist Greg Stock, Ph.D. of the National Park Service and xRez Studio. The project created an unprecedented documentation of Yosemite Valley's granite walls by shooting 10,000 images concurrently over sixteen miles of the valley walls. The base technology included gigapixel panoramic photography, LIDAR-based digital terrain modeling and three-dimensional computer rendering. The result was both visual and scientific: by capturing the environment before and after a rock slide, they can calculate the volume of rock that fell - an almost impossible task to do by other means after a fall.
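Once you have registered surface data from before and after a slide, the volume estimate itself is conceptually simple: difference the two elevation models and integrate the loss over the grid. A minimal illustrative sketch (not xRez's actual pipeline, and assuming both surveys are resampled onto the same grid):

```python
# Estimate the volume of rock lost in a slide from two gridded elevation
# models (before and after) sampled on the same grid. Illustrative only.
import numpy as np

def rockfall_volume(dem_before, dem_after, cell_size_m):
    """dem_before, dem_after: 2D arrays of surface height in metres,
    registered to a common grid; cell_size_m: ground size of one cell.
    Returns the volume of material removed, in cubic metres."""
    diff = dem_before - dem_after       # positive where the surface dropped
    loss = np.clip(diff, 0.0, None)     # ignore deposition and noise below zero
    return float(loss.sum() * cell_size_m ** 2)

# Tiny synthetic example: a 2 m deep, 3 x 3 cell notch on a 1 m grid -> 18 m^3.
before = np.full((10, 10), 100.0)
after = before.copy()
after[4:7, 4:7] -= 2.0
print(rockfall_volume(before, after, cell_size_m=1.0))
```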

This then extended to work with drones and glaciers, mapping year-over-year ice volume reduction due to global warming. Away from science, they also started doing 180 degree dome experiences at many of the country's modified planetariums. It is a short jump from massive environments and dome experiences to a Samsung Gear VR headset and an incredibly impressive VR experience.

Greg Downing with fxguide's John Montgomery (L) and VR "Futurist at 20th Century Fox" Ted Schilowitz watching Greg's VR reel.

You can download the xRez VR demo reel here and see their work for yourself, or see a web-based interactive preview if you don't have any headgear.

xRez Studio has done amazing things with tiny portable GoPro rigs, while at the other end of the camera spectrum NextVR's multiple EPIC rig shoots 6K resolution stereoscopic footage at 80 frames per second, capturing not just the visuals but the 3D geometry of a location (the shape, size and distance of all the objects in the captured scene, as well as the size of the environment). At GDC they have extended this to include Lightfield technology.

NextVR rig.

This technology breakthrough promises the viewer a heightened sense of immersion never before possible in live-action VR. For example, when attending a live-action basketball game in virtual reality, the viewer should be able to look around a referee who may be standing in the way; at a rock concert in VR - such as one by Coldplay, a band the company has worked with - a user could look behind the lead singer to see the drummer.

fxguide highlighted Lightfield technology from another company as the most significant new technology at SIGGRAPH last year. In that case the scene was rendered. NextVR has invested over three years of R&D into incorporating light field technology into live action, and has several patents in the area of light field capture and StereoVR.

Note the overlay of the geometry and the live action it was derived from.

“This innovation is a required next step toward creating the ‘holodeck’,” said Dave Cole, co-founder, NextVR. “By incorporating dynamically-generated 3D geometry with ultra-high resolution stereoscopic video, we’ve created the most vivid and life-like VR experience currently possible.”

Dave Cole is speaking this week at the GDC ITA panel, "Where Will Immersive Tech be in 2 Years?" - and where the industry is going is a great question. Most of the companies above are discussing VR at GDC, but one of the great advances ready to break is AR, or Augmented Reality. Here, either via an external camera or transparent screens in the head gear, you can see the room you are in, and friends can react and see each other while also seeing a virtual character or event. For some, AR holds the key to the missing social aspect that puts people off VR. As Mike Woods from Framestore points out, it is possible to fly to Thailand and have a wonderful holiday seeing temples by yourself, but "it is just that you will have more fun if you do it with someone else."

While VR cocoons one in an isolated VR bubble, AR holds the promise of VR + Social Media with certainly less risk of tripping over your own furniture.

 



  • Steven Antturi

    Great article. One name missing from the article is Immersive Media, who is the inventor of 360 video (live action VR) and has been working with many of the companies mentioned to create this new format.

  • Kim Baumann Larsen

    Excellent overview of the State of VR! At the SouthWest VR conference in Bristol two weeks ago I saw a VR tech demo from the Brussels-based VFX studio Nozon that blew my mind. What they offer is true interactive parallax within a confined spatial envelope in pre-rendered 360° movies. This would allow you to literally be inside a scene from a film such as Avatar or Guardians of the Galaxy with full visual fidelity. Their tech is patent pending and to my knowledge it is currently compatible with Maya using the Arnold rendering engine.
    http://nozon.com/PresenZ-VR-Movies