The goals for this WebVR project are to
- Build a WebVR app
- Make insertion of WebVR scenes DOM-friendly, meaning that you can embed a WebVR scene into a standard HTML/CSS layout, without grabbing the whole window. The latter is typical of most WebVR implementations to date.
- Explore the capabilities and limits of WebVR for creating useful and non-game VR experiences
Right now, the project can create simple VR scenes that run on your smartphone (except iOS, of course!) as well as high-end headsets like the HTC Vive.
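The “DOM-friendly” goal above boils down to mounting the WebGL canvas inside an ordinary page element and sizing it to that element rather than to the window. A minimal sketch of the idea (the `mountVRScene` helper and the element id are my own illustrations, not part of any library):

```javascript
// Sketch of DOM-friendly embedding: size the WebGL canvas to a container
// element, not to window.innerWidth/innerHeight, so the VR scene sits
// inside a normal HTML/CSS layout instead of grabbing the whole window.
// `mountVRScene` is a hypothetical helper; in a THREE.js app, `canvas`
// would be renderer.domElement.
function mountVRScene(container, canvas) {
  canvas.width = container.clientWidth;   // follow the container's CSS size
  canvas.height = container.clientHeight;
  container.appendChild(canvas);          // scene lives inside the layout
  return canvas;
}

// In a page: mountVRScene(document.getElementById('vr-scene'), renderer.domElement);
```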
In doing so, I reinforced my initial impression of virtual reality as an energy hog. Since the new WebVR API will allow existing web designers, UXers, and developers to participate in the medium, sustainability issues will become important. The following analysis demonstrates just how large VR’s energy problem will be in a streaming virtual world. It builds off some earlier posts like this one and this one. You can also check out my presentation to the Los Angeles UX Group.
What is Virtual Reality?
First, let’s define the virtual reality space. This actually consists of a suite of technologies which, to varying degrees, mix computer generated sights, sounds, and other senses with our normal sensory perception:
Virtual Reality – all of our vision and hearing is controlled by the computer. If haptics are provided, some of our sense of touch is added as well. This is the most popular view of VR, and it is currently heavily associated with “gamers” interested in extending shooters and porn into 3D space.
To run a Head-Mounted Display (HMD), a.k.a. headset, you need a high-performance computer. Macs, even high-end ones, don’t have video cards powerful enough to do so (though HTC is experimenting with Mac support for the Vive), and even Windows systems have to be in the $1500 range to provide smooth (read “vomit-free”) VR experiences.
This background has meant that sustainability – whether in energy and resource use or in design – has been lacking in the area. Mostly, the field has catered to adolescent, male-centric pleasures of uninhibited excess.
Augmented Reality – the computer provides an interface which “overlays” information on the real world. Typically this is done on a standard computer display by adding computer-generated material to webcam data. Audio is less common, but more likely in the future. Unlike VR, AR apps have existed for several years, and there is a good business case for many applications. Since they aren’t trying to create a complete artificial world, the performance requirements are lower.
Apple has set its sights on this area, partly because the near-term business case is better, and partly because Macs aren’t powerful enough to run HMDs like the Rift. It recently introduced ARKit to help iOS developers create better AR experiences.
Mixed Reality – the two categories above are just extremes: “pure” virtual reality is one endpoint, while a basic streamed webcam image is the other. “Mixed” reality was defined to span both, since real-life experiences will likely mix varying degrees of computer-generated and “real” data at different times. Microsoft’s HoloLens is a good example of a headset targeting this space. One can imagine haptics, real sensory data, and virtual information HUDs all operating at the same time. A great concept, but until low-cost mixed reality headsets come out later in 2017, few people will have experienced it.
There are other definitions as well, which mix theater-like or movie settings with VR. A great example is Birdman director Alejandro González Iñárritu’s virtual reality installation piece Carne y Arena. The project mixes something like dinner theater (rooms with props) with VR experiences, and users walk between the alternate environments. A good example of the “extra” is wind – real wind from fans blowing over users while they watch a helicopter descend in their headsets.
These environments, due to their complexity and cost, are currently dominated by filmmakers rather than software or game designers and developers. At the low end, we have 360 video, which simply wraps a non-interactive video experience around the viewer. This has been popular with filmmakers, since they don’t have to change much of what they do (yet).
Oddly, Apple had a product that did exactly this 15 years ago – QuickTime VR. I bet they’re kicking themselves for dropping it!
What is WebVR?
WebVR is an API (Application Programming Interface) and W3C spec that allows a standard web browser to generate stereo images and surround audio and, more importantly, route them either to a hardware Head-Mounted Display (HMD) like the HTC Vive or to a mobile screen for Google Daydream and Samsung Gear VR. Using the WebGL API, which has been in browsers for several years, designers and developers can create animated 3D environments or import 360 video, then combine them with user position and orientation (from the HMD or mobile device) to create a VR, AR, or mixed reality experience.
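Detecting that routing target is the first step in any WebVR app. A sketch of feature detection under the WebVR 1.1 spec, whose `navigator.getVRDisplays()` returns a Promise of connected `VRDisplay` objects (passing `nav` in as a parameter rather than using the global `navigator` is my own convention, for testability, not part of the spec):

```javascript
// Sketch of WebVR 1.1 feature detection: resolve to the first connected
// VRDisplay (an HMD or a mobile device's synthetic display), or null if
// the browser lacks WebVR support or no display is attached.
function findVRDisplay(nav) {
  if (!nav || typeof nav.getVRDisplays !== 'function') {
    return Promise.resolve(null);             // browser has no WebVR support
  }
  return nav.getVRDisplays().then(displays =>
    displays.length > 0 ? displays[0] : null  // first connected display, if any
  );
}

// In a real page, presentation would then start with something like:
// findVRDisplay(navigator).then(d => d && d.requestPresent([{ source: canvas }]));
```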
Building a WebVR app has much in common with creating a media player, and in fact there are HTML5 media players like krpano that support WebVR. Game engines like Babylon.js or general WebGL 3D apps like THREE.js also support WebVR.
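Whatever the engine, the stereo images mentioned above come from rendering the scene twice per frame, once per eye, typically into the left and right halves of the canvas. A minimal sketch of the per-eye viewport split (the `eyeViewports` helper is my own illustration; a real WebVR render loop would also apply the per-eye view and projection matrices supplied by the API):

```javascript
// Sketch: WebVR stereo rendering draws the left eye into the left half of
// the canvas and the right eye into the right half. This helper computes
// the pixel rectangle for each eye from the full canvas size.
function eyeViewports(canvasWidth, canvasHeight) {
  const half = canvasWidth / 2;
  return {
    left:  { x: 0,    y: 0, width: half, height: canvasHeight },
    right: { x: half, y: 0, width: half, height: canvasHeight },
  };
}
```

This per-frame double rendering is one concrete reason VR is such an energy hog: the GPU does roughly twice the rasterization work of an equivalent flat 3D scene, at a higher required frame rate.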
In Part II, I’ll consider which features of VR, AR, and mixed reality lead to sustainability issues. Energy is one consideration, but the UX of the designed experiences and their inclusiveness also form a big part of VR sustainability.