This updated article now lives on Medium: “Does AR give us a sixth sense?” - https://link.medium.com/upEkfPnRST
Augmented Reality (AR) blends the real world with virtual worlds so that each informs the other. The medium affords a deeper connection with the real world, giving context without breaking its laws. What makes AR a powerful medium is that it goes beyond our visual sense: touch, hearing, and even smell can technically be augmented as well.
To focus this post on a philosophical conversation about visual AR, I’d like to pose a thought experiment: human interaction with virtual objects is a new sense that we didn’t have before.
We know that when designed right, AR can break the laws of the real world. Truly immersive AR provides blended perspectives that aren’t possible anywhere else.
This digital age has gone through an evolution: inputs for interacting with bits have spanned punching cards, striking keys on a keyboard, clicking with a mouse, and touching and gesturing with our fingers.
Mobile AR blends touching and gesturing on our devices. This works well, but it is a bridge to head-mounted wearables, which introduce new natural interaction input types:
Using physical objects to interact with virtual objects;
“Graspable” user interfaces in the foreground using hand-tracking and advances in occlusion;
Ambient user interfaces in the global space such as menus.
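To make the “graspable” idea concrete, here’s a minimal sketch of mapping a hand-tracked pinch gesture to grabbing and dragging a virtual object. All of the types, names, and thresholds here are hypothetical for illustration; they are not taken from any real headset SDK.

```typescript
// Minimal sketch of hand-tracked "grab and drag" for a virtual object.
// All types and thresholds are hypothetical, not tied to any real SDK.

type Vec3 = { x: number; y: number; z: number };

interface HandFrame {
  thumbTip: Vec3;
  indexTip: Vec3;
}

interface VirtualObject {
  position: Vec3;
  grabbed: boolean;
}

const PINCH_THRESHOLD = 0.02; // meters between thumb tip and index tip
const GRAB_RADIUS = 0.1; // how close the pinch must be to grab the object

function distance(a: Vec3, b: Vec3): number {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}

function midpoint(a: Vec3, b: Vec3): Vec3 {
  return { x: (a.x + b.x) / 2, y: (a.y + b.y) / 2, z: (a.z + b.z) / 2 };
}

// Called once per tracking frame: pinching near the object grabs it;
// while pinched, the object follows the pinch point; opening the hand
// releases it in place.
function update(obj: VirtualObject, hand: HandFrame): VirtualObject {
  const pinching = distance(hand.thumbTip, hand.indexTip) < PINCH_THRESHOLD;
  const pinchPoint = midpoint(hand.thumbTip, hand.indexTip);

  if (!pinching) return { ...obj, grabbed: false };
  if (obj.grabbed || distance(pinchPoint, obj.position) < GRAB_RADIUS) {
    return { position: pinchPoint, grabbed: true };
  }
  return obj;
}
```

The per-frame loop is the key design choice: gesture state lives in the object itself, so a grab persists across frames until the hand opens, which is what makes the interaction feel continuous rather than like repeated taps.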
Take, for example, a popular AR experience from IKEA. The IKEA Place app today allows you to place virtual furniture in your space, saving you the pain of guessing how a piece of furniture would fit and look there. Interacting with that furniture on a HoloLens or Magic Leap, you are not using touch; you’re now a Jedi using the “force” to right-size a virtual couch or move it into the corner of your room.
This “force” is a superpower we did not previously have.
The Federal Communications Commission in the U.S. recently approved radar-based gesture control technology for mobile and wearable devices.
This gesture control tech, dubbed Project Soli, was announced at Google I/O in 2015 and was approved by the FCC for use in the 57–64 GHz frequency band. It allows the wearer of a compatible device to interact without touch.
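Soli’s real signal-processing pipeline isn’t described here, but the core idea of touchless radar gestures can be illustrated with a toy sketch: classifying a hand motion from successive range readings. Everything below — the function, the sample values, the threshold — is purely illustrative and not Soli’s actual API.

```typescript
// Toy illustration of touchless gesture detection from radar-style
// range readings. Purely illustrative; not Project Soli's real pipeline.

type Gesture = "approach" | "retreat" | "none";

// Classify a hand motion from successive distance samples (in meters):
// a net decrease in range reads as the hand approaching the sensor,
// a net increase as retreating; small jitters below minTravel are ignored.
function classify(rangeSamples: number[], minTravel = 0.05): Gesture {
  if (rangeSamples.length < 2) return "none";
  const travel = rangeSamples[rangeSamples.length - 1] - rangeSamples[0];
  if (travel <= -minTravel) return "approach";
  if (travel >= minTravel) return "retreat";
  return "none";
}
```

Even this crude version captures why the approach is compelling for wearables: the sensor only needs relative motion over time, not a camera view of the hand, so it works through fabric and in the dark.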
This technology is being incubated in Google’s Advanced Technology and Projects group. ATAP is a skunkworks group within Google founded by the former head of DARPA, Regina Dugan.
Now let’s think about our five senses: the faculties of sight, smell, hearing, taste, and touch. Notice something missing? The “force”. Yes, folks, humans have evolved through technology right before our eyes.