The future of 3D creation: Clara.io on the web?

When web-based 3D content creation tool Clara.io was demonstrated at SIGGRAPH in Anaheim earlier this year, it certainly caused a buzz. Since then, more than 14,000 people have signed up for the product’s closed beta. To some, this might not come as much of a surprise – Clara.io is made by Exocortex, already well-known for creating fluid sim tools, and its developers also contributed to the VFX software tools Deadline and Krakatoa. Clara.io is now heading into an open beta phase. We talk to President and Founder of Exocortex, Ben Houston, about Clara.io’s WebGL development, rendering in the cloud with V-Ray, where he sees the software being most at home, and how artists can collaborate and share their web-based 3D scenes.

Above: A live WebGL scene made using Clara.io. Click and drag to rotate, or click the blue “Edit” button to start editing the raw scene.

fxg: What was the core reason behind developing Clara.io?

Houston: I think it’s always been a pain to get people to deploy software. It’s incredibly hard to install software and set up these types of pipelines in visual effects and animation. For example, 3ds Max these days is a 3 gigabyte install. You have to download that, spend 50 minutes installing it, set up licenses, et cetera. And if you’re trying to collaborate, you have to chunk up the work very discretely so that no two people are working on the same thing, which is always a challenge.

Then, even at the low end, many of these products cost roughly $200 a month. Because of the expense, there’s a lot of piracy that goes on – for example, we know that about half of the search hits for Exocortex contain the word ‘crack’. Also, doing reviews from a remote location can be very costly if you have to transfer data.

Above: an introduction to polygon modeling in Clara.io.

And in terms of capital expenditure, many studios have to ramp up both their staff and network resources for large productions. These days they have to cluster their staff around their file server and renderfarm. If the staff are specialized you have to fly them in and keep them on from project to project. But not everyone can do that.

With Clara.io we’re trying not only to make the software easy to install and deploy, but also to support virtual teams that are not in the same room, and to render things remotely.

fxg: Clara.io runs in the browser, which as you say allows for virtual work, but what limitations does that also bring for the user?

Houston: You are somewhat limited by your bandwidth, but we’ve tried to design it in such a way that most of the heavy data stays on the server, so we can send reduced representations. Also, if you do any high-quality rendering, all those high-quality textures stay on the server and we only send you, say, low-resolution JPEGs.
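The server-side approach Houston describes – heavy data stays on the server, the browser receives a reduced representation – can be sketched in a few lines of TypeScript. The names and bandwidth thresholds here (`AssetVariant`, `chooseRepresentation`) are illustrative assumptions, not Clara.io’s actual API:

```typescript
// Hypothetical sketch: the server keeps full-resolution assets and selects a
// reduced representation to stream to the browser, based on the client's
// estimated bandwidth. Thresholds and names are invented for illustration.

interface AssetVariant {
  maxTextureSize: number;   // longest texture edge sent to the client, in pixels
  format: "jpeg" | "png";   // compressed preview formats; full sources stay server-side
}

// Pick a variant given the client's estimated bandwidth in MB/s.
function chooseRepresentation(bandwidthMBps: number): AssetVariant {
  if (bandwidthMBps < 1) return { maxTextureSize: 512, format: "jpeg" };
  if (bandwidthMBps < 5) return { maxTextureSize: 1024, format: "jpeg" };
  // Even on fast links the full-quality textures remain on the server for
  // rendering; the browser still only receives a compressed preview.
  return { maxTextureSize: 2048, format: "jpeg" };
}
```

The key design point is that the full-quality data never leaves the server, so a slow connection degrades only the preview, not the render.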

Browsers are limited to 32-bit. Both Chrome and Firefox are 32-bit only, which means we’ve got about a gigabyte to play with. I think that’s going to change over time. Also, everything has to be written in JavaScript, which is quite different to C++.


fxg: What are some of the main features of Clara.io in terms of the 3D tools?

Houston: We have a great poly-editing workflow, with true poly-meshes supporting any number of UV channels. Around that we’ve built a keyframe animation system. You can modify your keyframes and it plays back in real time for the most part. We’ve got a skinning and bones system, and we’re looking forward to creating a rigging system.

We’ve got V-Ray rendering integration that does V-Ray rendering in the cloud and streams it back to you progressively. We don’t yet have a simulation framework, but in part we designed this program because it would enable very high-quality simulations to be done in the cloud, and no one would have to set up another renderfarm to do that. Remote rendering is a great thing, but one of the problems is that you still have your data locally and there are these complex processes to try and sync it online, which can take a bit of time, whereas we just keep the data on the servers all the time so there’s no syncing.

fxg: Can you talk more about how rendering works with the software?

Houston: Using V-Ray through Clara.io is similar to turning one of the viewports in Softimage into a render region – a live render region. You can set up all your shaders interactively there while seeing nearly instantaneous previews. We also support a pass-based multi-render system, so you can set up multiple V-Ray renders, assign them to different passes, and give each pass different lights or settings, like separating specular from diffuse.

Above: V-Ray integration demo.
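The pass-based setup Houston describes – multiple renders with per-pass lights and components – can be sketched as a small data structure. The `RenderPass` shape below is an illustrative assumption, not Clara.io’s actual schema:

```typescript
// Hypothetical sketch of a pass-based multi-render configuration: each pass
// carries its own light assignments and shading components, e.g. separating
// specular from diffuse so they can be recombined in compositing.

interface RenderPass {
  name: string;
  lights: string[];                              // scene lights assigned to this pass
  components: ("diffuse" | "specular" | "gi")[]; // shading components this pass renders
}

// Build a minimal two-pass split of the same lights: diffuse+GI vs. specular.
function buildPasses(lights: string[]): RenderPass[] {
  return [
    { name: "diffuse",  lights, components: ["diffuse", "gi"] },
    { name: "specular", lights, components: ["specular"] },
  ];
}
```

Splitting components this way lets an artist rebalance specular against diffuse in the composite without re-rendering.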

fxg: Who do you think are the users who will get the most benefit from Clara.io?

Houston: Because it’s new software, we don’t yet have the breadth of features of many DCC tools out there. Therefore, it will be on the lower end of usage, I think. Say, mobile games and Unity games. Maybe not the next-gen consoles but for the broader market where there are a lot of indie users. I don’t see this being used by ILM or Pixar on their next film, but it could be used by students at home who might like to create something that approaches that quality – because we can do that high quality rendering. It could be done by people who previously couldn’t afford the infrastructure to collaborate – say 10 people working from home and rendering in the cloud.

fxg: What have people been using Clara.io for so far?

Houston: The most common use case right now is poly editing. We’ve seen a lot of people make different shapes. Catherine Leung made a really nice Zelda sword and was sharing it around. And we’ve got a really nice Luxo Jr. animation that we’re going to be releasing this week.


fxg: Have you looked to your previous software experience at Exocortex in adding functionality and similar tools to future versions of Clara.io?

Houston: Yes, something like a Bullet framework for physics is low-hanging fruit. We already have plans for CSG (Constructive Solid Geometry) tools within the next month. And then a fluid simulation framework is a longer-term plan too. We’re probably going to start with something like a liquid simulator.

fxg: What price can people expect Clara.io to be available for?

Houston: We’re trying to keep the rate very low. We’re looking at a base subscription rate of about $10–15 a month. And then there may be a pro account offering more features. If you’re going to do a lot of network rendering, which requires more cloud computation time, we do have to charge more because that’s one of our main costs. But if you’re just using the tool for editing, keyframing, modeling and playback, then the price point is much lower.

fxg: You’ve mentioned that Clara.io is intended for lower-level usage, but do you think it might also deal with some of the challenges the VFX industry is facing at the moment?

Houston: I think it can solve some of those issues, particularly the high costs for studios starting up, and also for digital nomads – people who move from one studio to the next. You can really pull together a virtual studio using Clara.io. But we do have the limitation of 32-bit and JavaScript for high-end work, so it may be best for pulling in your modeling assets or your animation rather than your core simulation or high-end TD work.

Above: find out about scene sharing and collaboration in Clara.io.

fxg: Where did the name Clara.io come from?

Houston: We were looking at a lot of names – one of them was ‘Exocortex Studio’, but it doesn’t quite roll off the tongue. Then we noticed that Maya was a really nice name, and it’s a girl’s name. I recently had a daughter named Clara, and someone in the company suggested it, so we named the software after her.

fxg: Finally, what are some of the things you’re most proud of with the tool?

Houston: We have a really good scene publishing system. If you make any animations or models you can take those and embed them into websites. And you can embed them with licenses set so that other people can edit them if they want. One thing about Clara.io I’m most proud of is that you can keep working in it and never have it pause on you. That really makes it competitive with desktop applications.
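The publishing workflow Houston describes – embedding a scene into a website with a flag for whether viewers may edit it – could take the shape of a simple iframe snippet generator. Everything below (the URL pattern, the `edit` query parameter, the `embedSnippet` function) is an invented illustration, not Clara.io’s actual embed format:

```typescript
// Hypothetical sketch of scene embedding: build an iframe snippet for a
// published scene, with a flag controlling whether visitors can open the
// scene for editing. The host and query parameter are placeholders.

function embedSnippet(sceneId: string, allowEdit: boolean): string {
  const base = `https://example.com/scenes/${sceneId}/embed`; // placeholder host
  const url = allowEdit ? `${base}?edit=1` : base;
  return `<iframe src="${url}" width="640" height="480"></iframe>`;
}
```

Because the scene lives on the server, the embedded copy and any edits made through it stay in sync with the original – there is no exported file to go stale.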

Right now creating 3D graphics is a very isolating activity, and I think if you make it a more engaging, interactive thing like this, it’d just be great for everybody.
