Omniverse streaming into the Apple Vision Pro with NVIDIA's new AR/XR stack

During today’s keynote, NVIDIA’s CEO Jensen Huang joked that the Apple Vision Pro car configurator application developed by CGI studio Katana was so realistic that he himself stepped around the virtual car door to avoid hitting it.

The ‘COATcreate’ demo, which is also being shown on the GTC show floor, featured a designer wearing the Apple Vision Pro and using a car configurator application that streams content from the NVIDIA Omniverse platform directly into the headset. In the presentation, the designer toggles through paint and trim options and even enters the vehicle, leveraging the power of spatial computing by blending photorealistic 3D environments with the physical world.

This is possible thanks to a new software framework built on Omniverse Cloud APIs that lets developers easily send their Universal Scene Description (OpenUSD) industrial scenes from their content creation applications to the NVIDIA Graphics Delivery Network (GDN), a global network of graphics-ready data centers that can stream advanced 3D experiences to the Apple Vision Pro. Developers can build fully interactive experiences in a single application using Apple's native SwiftUI and RealityKit, with the Omniverse RTX Renderer streaming from GDN.
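
NVIDIA has not yet published the client-side API for this framework, but the shape of such an app on visionOS is straightforward to sketch: a SwiftUI scene hosts a RealityView, and the cloud-rendered stream is composited into the local RealityKit content. In the sketch below, GDNStreamClient and sceneURL are hypothetical placeholders for whatever the toolkit eventually ships; everything else is standard visionOS API.

```swift
import SwiftUI
import RealityKit

// Minimal visionOS sketch: a SwiftUI app hosting a RealityKit scene into
// which the cloud-rendered stream would be composited. `GDNStreamClient`
// and `sceneURL` are hypothetical placeholders for the unreleased toolkit;
// everything else is standard visionOS API.
@main
struct ConfiguratorApp: App {
    var body: some Scene {
        ImmersiveSpace(id: "configurator") {
            ConfiguratorView()
        }
    }
}

struct ConfiguratorView: View {
    var body: some View {
        RealityView { content in
            // Local geometry rendered on-device by RealityKit; the
            // cloud-rendered car would be streamed in alongside it.
            let anchor = AnchorEntity(world: [0, 0, -2])
            anchor.addChild(ModelEntity(mesh: .generateBox(size: 0.2)))
            content.add(anchor)

            // Hypothetical: attach the incoming GDN stream to the scene.
            // if let stream = try? await GDNStreamClient.connect(to: sceneURL) {
            //     content.add(stream.entity)
            // }
        }
    }
}
```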

We spoke to Damian Fulmer, Global Director of Workflow and Technology at Katana, minutes after the keynote finished. “We were working with NVIDIA for about two months, and they had an initial version of the car configurator running within about a week. We saw a preliminary version, pretty much, right about a week after they got the data.” Katana Studio provides creative services spanning virtual reality, augmented reality, animation, VFX, real-time, and WebGL work, as well as advertising and design-craft visual content.

Katana developed an application that takes CAD files from Nissan, converts them into USD, and then uses Omniverse to stream the result to their cloud application. Initially, the tool worked just through a website, as it was built for content creators producing print ads, commercials, and similar work. “NVIDIA took our car data and adapted that into their system,” Fulmer outlines. “And the way it's being streamed to the Apple Vision Pro is the same kind of streaming we're doing to a web app.” All the rendering happens in the cloud, with the result streamed to the headset and the headset sending its tracking data back to the cloud. “That's what enables the full ray tracing, and that's what enables the full CAD data to be used,” he explains. “What you saw on stage is actually the full CAD data that Nissan delivers to its partners to render and use for other content. We're not re-tessellating or otherwise baking down geometry for real-time. It's the full set. You can go in and sit inside the car, and all the knobs have high detail – everything's there.”
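
That round trip of frames down and tracking data up is the heart of the pipeline. Here is a minimal sketch of the headset side of the loop, using visionOS's real ARKitSession and WorldTrackingProvider APIs to query the device pose and push it upstream. The WebSocket endpoint and the JSON wire format are assumptions, as NVIDIA's actual streaming protocol has not been published.

```swift
import ARKit
import Foundation
import QuartzCore

// Sketch of the headset-to-cloud tracking path: query the device pose each
// frame and send it upstream so the cloud renderer can ray trace from the
// viewer's point of view. ARKitSession and WorldTrackingProvider are real
// visionOS APIs; the WebSocket endpoint and JSON wire format are assumed.
func streamHeadPose(to endpoint: URL) async throws {
    let session = ARKitSession()
    let worldTracking = WorldTrackingProvider()
    try await session.run([worldTracking])

    let socket = URLSession.shared.webSocketTask(with: endpoint)
    socket.resume()

    while !Task.isCancelled {
        // DeviceAnchor exposes the headset's 6-DoF pose as a 4x4 transform.
        if let device = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) {
            let m = device.originFromAnchorTransform
            // Flatten the column-major matrix into 16 floats for the wire.
            let pose = (0..<4).flatMap { col in (0..<4).map { row in m[col][row] } }
            try await socket.send(.data(JSONEncoder().encode(pose)))
        }
        // Pace the upload roughly to the 60 fps stream rate quoted below.
        try await Task.sleep(nanoseconds: 16_666_667)
    }
}
```

On the other side of the connection, GDN runs the Omniverse RTX Renderer against the received pose and streams the ray-traced frames back to the headset.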

The system works so well due to the combination of the original high-resolution CAD data being used directly and the extremely high-resolution Apple Vision Pro display, with the stream running at 60 fps. “When you're sitting inside and the steering wheel is in front of you, it's perfectly smooth. You can look at the seats and see all the stitching,” he explains. “You can go and look at the headlights and see all the lenses and how they're refracting, because it's actually ray tracing in real time and working the way it really would in real life.”

Spatial computing has emerged as a powerful technology for delivering immersive experiences and seamless interactions between people, products, processes, and physical spaces. Industrial enterprise use cases require incredibly high-resolution displays and powerful sensors operating at high frame rates to make manufacturing experiences true to reality.

This new Omniverse-based workflow combines the Apple Vision Pro's groundbreaking high-resolution displays with NVIDIA's powerful RTX cloud rendering to deliver spatial computing experiences with just the device and an internet connection. This cloud-based approach allows real-time, physically based renderings to be streamed seamlessly to the Apple Vision Pro, delivering high-fidelity visuals without compromising the detail of these massive, engineering-fidelity datasets.

NVIDIA has updated its AR/XR stack, and this is the first public use of the new stack. While this demo can't be publicly released because it uses proprietary CAD data, NVIDIA plans to release the toolkit in the next few months so that developers can build their own applications.

Fulmer comments that “you have 60 frames a second and you have great fidelity, plus the Apple tracking is great. So everything's locked down and it's so integrated into the scene that's around you that you forget that it's actually a CG representation.”


“The breakthrough ultra-high-resolution displays of Apple Vision Pro, combined with photorealistic rendering of OpenUSD content streamed from NVIDIA accelerated computing, unlocks an incredible opportunity for the advancement of immersive experiences,” said Mike Rockwell, vice president of the Vision Products Group at Apple.

The Katana ‘COATcreate’ demo was pitched and built on the idea that the same dataset can be used for both offline rendering and real-time. “That's the idea behind digital twins, and in this market, approving the data once and using it everywhere is a huge time saver,” Fulmer explains. “When you actually do the demonstration yourself, you forget that the car is not real. As Jensen said in the presentation, and as I experienced last night with a bunch of people here at GTC, literally every person that went in the car tried to touch the car on their way in and tried to step over things on their way out!”

Spatial computing is set to redefine how designers and developers build engaging digital content, driving a new era of creativity and engagement. “Apple Vision Pro is the first untethered device which allows for enterprise customers to realize their work without compromise,” said Rev Lebaredian, vice president of simulation at NVIDIA. “We look forward to our customers having access to these amazing tools.” The workflow also introduces hybrid rendering, a groundbreaking technique that combines local and remote rendering on the device.
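
NVIDIA hasn't detailed the mechanics of that hybrid split, but one plausible approach on visionOS is to render local geometry with RealityKit while copying each decoded cloud frame into a texture that the scene samples. The sketch below uses RealityKit's TextureResource.DrawableQueue, a real API for exactly this kind of per-frame texture update; the frame decoder and hand-off are assumptions, as the toolkit is not yet public.

```swift
import RealityKit
import CoreGraphics
import Metal

// A plausible sketch of the hybrid split: RealityKit renders local geometry
// on-device, while each cloud-rendered frame is blitted into a texture the
// scene samples. TextureResource.DrawableQueue and replace(withDrawables:)
// are real RealityKit APIs; how NVIDIA's toolkit actually hands frames over
// is not yet public.
final class RemoteFrameSurface {
    let entity: ModelEntity
    private let queue: TextureResource.DrawableQueue

    init(width: Int = 1920, height: Int = 1080) throws {
        let q = try TextureResource.DrawableQueue(.init(
            pixelFormat: .bgra8Unorm, width: width, height: height,
            usage: [.shaderRead, .renderTarget], mipmapsMode: .none))
        queue = q

        // Seed a 1x1 placeholder texture, then redirect it to the queue so
        // RealityKit samples whatever frame was presented most recently.
        let ctx = CGContext(data: nil, width: 1, height: 1,
                            bitsPerComponent: 8, bytesPerRow: 4,
                            space: CGColorSpaceCreateDeviceRGB(),
                            bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)!
        let texture = try TextureResource.generate(from: ctx.makeImage()!,
                                                   options: .init(semantic: .color))
        texture.replace(withDrawables: q)

        var material = UnlitMaterial()
        material.color = .init(texture: .init(texture))
        entity = ModelEntity(mesh: .generatePlane(width: 1.6, height: 0.9),
                             materials: [material])
    }

    // Copy one decoded remote frame into the queue. `frame` must match the
    // queue's size and pixel format; the decoder producing it is assumed.
    func present(frame: MTLTexture, commandBuffer: MTLCommandBuffer) throws {
        let drawable = try queue.nextDrawable()
        if let blit = commandBuffer.makeBlitCommandEncoder() {
            blit.copy(from: frame, to: drawable.texture)
            blit.endEncoding()
        }
        commandBuffer.commit()
        drawable.present()
    }
}
```

A production client would also need to reproject each remote frame to account for the latency between the pose that was uploaded and the pose at display time, which is presumably part of what NVIDIA's updated stack handles.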

We will be publishing a lot more on the Apple Vision Pro in the next few weeks.