Augmented reality (AR), wearable technology, and the Internet of Things (IoT) are all really about human augmentation. They are coming together to create a new reality that will forever change the way we experience the world. As these technologies emerge, we must place the focus on serving human needs.
The Internet of Things and Humans
Tim O’Reilly suggested the word “Humans” be appended to the term IoT. “This is a powerful way to think about the Internet of Things because it focuses the mind on the human experience of it, not just the things themselves,” wrote O’Reilly. “My point is that when you think about the Internet of Things, you should be thinking about the complex system of interaction between humans and things, and asking yourself how sensors, cloud intelligence, and actuators (which may be other humans for now) make it possible to do things differently.”
I share O’Reilly’s vision for the IoTH and propose we extend this perspective and apply it to the new AR that is emerging: let’s take the focus away from the technology and instead emphasize the human experience.
The definition of AR we have come to understand is a digital layer of information (including images, text, video, and 3D animations) viewed on top of the physical world through a smartphone, tablet, or eyewear. This definition of AR is expanding to include things like wearable technology, sensors, and artificial intelligence (AI) to interpret your surroundings and deliver a contextual experience that is meaningful and unique to you. It’s about a new sensory awareness, deeper intelligence, and heightened interaction with our world and each other.
Seeing the world in new ways
We are seeing AR pop up in all facets of life, from health and gaming to communication and travel. Most AR applications today can be found in your pocket on mobile devices, enabling you to explore the physical world around you, untethered from your desktop computer. One of the earliest AR applications to deliver a genuinely helpful experience was Word Lens. The application allows you to point your smartphone at printed text in a foreign language, such as a road sign or a menu, and translate it on the fly into the language of your choice. Suddenly, you are more deeply immersed and engaged with your surroundings via a newfound contextual understanding assisted by technology.
Word Lens solves a human need. What if this same concept of using technology to augment your experience was extended to include other types of sensors, data, and networks? We are beginning to see examples of this, particularly in health care and wearable tech, with a higher goal of applying technology to help people live better lives. Among the thought leaders exploring this new frontier is Rajiv Mongia, director of the Intel RealSense Interaction Design Group. Mongia and his team have developed a wearable prototype to help people with low or no vision gain a better sense of their surroundings. Combining a camera, computer vision, and sensors worn on the human body, the prototype is able to “see” objects within a few yards of you and tell you approximately where an object is located: high, low, left, or right, and whether the object is moving away or getting closer.
This is all communicated to you through vibration motors embedded into the wearable. The tactile feedback you experience is comparable to the vibration mode on your mobile phone, with the intensity corresponding to how close an object is to you. For example, if a wall or person is near you, the vibration is stronger, and if it’s farther away, it’s less intense. Mongia said that people who’ve tried the prototype say it has promise, that it augments their senses and helps them to “feel” the environment around them.
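The distance-to-vibration mapping described above can be sketched in a few lines of code. The ranges, motor layout, and function names below are illustrative assumptions for the sake of the sketch, not details of the Intel prototype:

```python
def haptic_intensity(distance_m, max_range_m=3.0):
    """Map an object's distance to a vibration strength in [0, 1].

    Closer objects produce stronger vibration, mirroring how the
    wearable's feedback intensifies as a wall or person approaches.
    (max_range_m is an assumed sensing range of a few yards.)
    """
    if distance_m >= max_range_m:
        return 0.0  # out of range: no vibration
    return 1.0 - (distance_m / max_range_m)


def choose_motor(azimuth_deg, elevation_deg):
    """Pick which of four assumed motors (high/low/left/right) to
    drive, based on where the object sits relative to the wearer."""
    if abs(elevation_deg) > abs(azimuth_deg):
        return "high" if elevation_deg > 0 else "low"
    return "right" if azimuth_deg > 0 else "left"
```

In this sketch, an object half a yard away would drive its motor far harder than one near the edge of the sensing range, letting the wearer “feel” proximity much as the prototype's testers described.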
Advancing augmented reality for humanity
The Intel prototype is an example of empowering humans through technology. In developing the wearable system, Mongia asked, “If we can bring vision to PCs and tablets, why not use that same technology to help people see?” This question exemplifies the spirit of the Internet of Things and Humans by giving people greater access to computer intelligence and emphasizing the human experience.
This greater goal will require seeing beyond just the technology and looking at systems of interaction to better enable and serve human needs. Tim O’Reilly has described Uber as an early IoT company. “Most people would say it is not; it’s just a pair of smartphone apps connecting a passenger and driver. But imagine for a moment the consumer end of the Uber app as it is today, and on the other end, a self-driving car. You would immediately see that as IoT.” Uber is a company that is built around location awareness. O’Reilly explained, “An Uber driver is an augmented taxi driver, with real-time location awareness. An Uber passenger is an augmented passenger, who knows when the cab will show up.”
While Uber strives to provide its users with an experience of convenience and visibility, there are other smartphone applications available today that use the reach of mobile and the power of social networking to truly help people. Be My Eyes, for example, is a mobile app that connects a person who is visually impaired with a sighted person to provide assistance. Using a live video connection, a sighted helper is able to see and describe vocally what a visually impaired person is facing. Since January 2015, more than 113,000 volunteers have signed up to help close to 10,000 visually impaired people around the world in 80 languages.
Be My Eyes is an early AR application in the same way O’Reilly described Uber as an early IoT company. Just as Uber would more readily be identified as IoT if a self-driving car were involved, Be My Eyes would more readily be considered AR if a computer were using AI, rather than a human volunteer, to identify what you were looking at. Apps like Be My Eyes are significant because they point the way to a new altruistic augmentation of reality, one that builds on the growth of the sharing economy, the power of our devices, and humans working together with computers to advance AR for humanity.