
Understanding the differences between virtual reality, augmented reality and mixed reality

By Loudon Blair, Senior Director, Corporate Strategy Office, Ciena | Aug. 11, 2016
There are many exciting use cases emerging, but network infrastructure will need to evolve

Although vendor-written, this contributed piece does not promote a product or service and has been edited and approved by editors.

Virtual reality is hot, and enterprise- and consumer-facing organizations are eager to figure out how they can take advantage of the new medium, whether it be for entertainment, productivity, sales, or a myriad of other potential uses.

However, sometimes lost in all this excitement is the difference between virtual reality platforms and whether the required technical underpinnings are in place to deliver a satisfying user experience. It’s important to understand what virtual reality, augmented reality, and mixed reality are in relation to each other, as well as the technical considerations that those hoping to create experiences for these platforms need to keep in mind.

* Virtual Reality defined. Virtual reality, or VR, is often used as a blanket term for all digital-reality variations, but in practice it is a specific kind of experience. While AR and MR incorporate some aspect of the real environment around the user, VR refers to a 100% virtual, simulated experience. VR headsets cover the user's field of vision, respond to eye and head movements, and shift what the screen displays accordingly, creating the illusion that the viewer is actually inside another location or world.

Virtual reality is exceptionally sensitive to lag and slowdown: lag is the delay between when an input is made and when the system reacts to it, and slowdown is a noticeable disruption in the consistent stream of data being delivered. A significant part of VR's value proposition is the sense of actually being transported somewhere, so a frozen screen or a patch of pixelated haze shatters that illusion quickly, ruining the experience for many and, in some cases, causing motion sickness.
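The slowdown problem above can be made concrete with frame timing. A minimal sketch, assuming a 90 Hz headset refresh rate (a common figure, not one from the article) and hypothetical frame render times:

```python
# Sketch of a frame-pacing check: a VR renderer must finish each frame
# within a fixed budget set by the headset's refresh rate, or the user
# perceives stutter. The 90 Hz rate and the sample frame times below are
# illustrative assumptions, not measurements from the article.

REFRESH_HZ = 90
FRAME_BUDGET_MS = 1000 / REFRESH_HZ  # ~11.1 ms per frame at 90 Hz

def dropped_frames(frame_times_ms):
    """Count frames whose render time exceeded the per-frame budget."""
    return sum(1 for t in frame_times_ms if t > FRAME_BUDGET_MS)

# Hypothetical per-frame render times in milliseconds
samples = [9.8, 10.5, 12.3, 11.0, 15.2]
print(dropped_frames(samples))  # → 2 (12.3 ms and 15.2 ms miss the budget)
```

Every missed deadline is a visible hitch, which is why VR pipelines budget per-frame rather than averaging throughput over seconds.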

When an event is broadcast in VR—a concert, sporting event, or ceremony, for example—camera rigs that capture 360-degree (or 180-degree, depending on the event) panoramic views are needed so the viewer can look in every direction. This requires a number of lenses, and thus multiple video streams transmitted in parallel. Moving this information takes copious amounts of bandwidth—up to 4 to 5 times as much for 360-degree video as for regular video, according to YouTube's Anjali Wheeler.
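The 4-to-5x multiplier cited above translates directly into a bandwidth estimate. A minimal sketch, using an assumed ~5 Mbps baseline for a flat HD stream (an illustrative figure, not one from the article):

```python
# Rough bandwidth estimate for a 360-degree VR stream, applying the
# 4-5x multiplier cited for 360-degree vs. regular video. The 5 Mbps
# baseline for a flat HD stream is an illustrative assumption.

def vr_bandwidth_mbps(base_stream_mbps: float, multiplier: float) -> float:
    """Scale a flat-video bitrate by the 360-degree overhead multiplier."""
    return base_stream_mbps * multiplier

low = vr_bandwidth_mbps(5.0, 4.0)   # lower bound of the cited range
high = vr_bandwidth_mbps(5.0, 5.0)  # upper bound of the cited range
print(f"Estimated 360-degree bandwidth: {low:.0f}-{high:.0f} Mbps")  # → 20-25 Mbps
```

Multiply that again for higher resolutions or stereoscopic capture, and the strain on network infrastructure becomes clear.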

Further complicating bandwidth requirements is whether or not the content being streamed to the VR device is “live,” in real-time. If so, the bandwidth requirements are significantly higher.

Live VR can take two forms: live as in watching an event as it occurs, and "live" as in interacting with others within a virtual environment. The former is like watching an extremely immersive movie: the headset passively accepts the data stream from the network, which requires a low-latency, high-bandwidth connection to sustain high video throughput. The latter, which enables interaction between the VR source and multiple users, also requires latency so low that it causes no noticeable delay, even as data moves back and forth between the connected VR units and servers.
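For the interactive case, the network round trip is only one piece of the end-to-end delay. A minimal sketch of that budget, where every number (the 50 ms target and each stage's cost) is a hypothetical value for illustration, not a figure from the article:

```python
# Illustrative end-to-end delay budget for interactive ("live") VR.
# All values below are assumptions chosen for illustration: the point is
# that capture, encode, network, decode, and render delays all add up,
# and the total must stay under whatever threshold users notice.

def total_delay_ms(capture_ms, encode_ms, network_rtt_ms,
                   decode_ms, render_ms):
    """Sum the per-stage delays into one end-to-end figure."""
    return capture_ms + encode_ms + network_rtt_ms + decode_ms + render_ms

BUDGET_MS = 50  # hypothetical "no noticeable delay" target

delay = total_delay_ms(capture_ms=5, encode_ms=10, network_rtt_ms=20,
                       decode_ms=8, render_ms=5)
print(delay, delay <= BUDGET_MS)  # → 48 True
```

Note that the network round trip consumes a large share of the budget, which is why interactive VR is far more demanding on the network than one-way broadcast.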

 
