Here’s Why I Wrote Augmented Human

Excerpts from Augmented Human: How Technology Is Shaping the New Reality:

Why I Wrote This Book

Twelve years ago, I caught my first glimpse of the power of Augmented Reality (AR) as a new communication medium. It was pure magic: a virtual 3-D cube appeared in my physical surroundings and I was awestruck. The augmented cube demo wasn’t interactive at the time (it did nothing other than appear), but it ignited my imagination for how AR could grow and evolve. At that moment, I dedicated my creative work, research, and public speaking to the new experiences AR made possible.

I wrote this book because I began to witness a much needed shift from a focus on the technology alone to a push toward creating compelling content and meaningful experiences in AR. This book is about exploring those big ideas and the extraordinary new reality AR affords. Now is the time to dream, design, and build our wondrous future.

As AR advances, we must ask: How can we design AR experiences that make users’ lives easier and better? MIT Media Lab founder Nicholas Negroponte said, “Computing is not about computers anymore. It is about living.” AR is no longer just about the technology; it’s about living in the real world, and creating magical and meaningful experiences that are human-centred. This book is about how AR will enrich our daily lives and extend humanity in unprecedented ways.

Who Should Read This Book

It’s not often that an entirely new medium emerges. You should read this book if you’re a maker, a doer, and an explorer who is excited by creating a path where there is no trail and wants to contribute to this rapidly growing industry. You should also read this book as an informed consumer for a peek at the new experiences that will change the way we live, work, and play.

You are a designer, a developer, an entrepreneur, a student, an educator, a business leader, an artist, or a technology enthusiast curious about and excited by the possibilities AR presents. You are committed to designing and supporting AR experiences grounded in the deepest human values, experiences with a profound impact on bettering humanity.


Purchase Augmented Human from:
Amazon (USA)
Amazon (Canada)
Amazon (UK)
Indigo (Canada)
Barnes and Noble (USA)
Book Depository (Worldwide)
iBooks
Google Play

Augmented Reality Top 8 of 2018

*UPDATE: VRScout just published this article here. Special thanks to Jonathan Nafarrete, Editor in Chief of VRScout.

AUGMENTED REALITY TOP 8 OF 2018 BY DR. HELEN PAPAGIANNIS

As a new communications medium, Augmented Reality (AR) is changing the way we consume and share knowledge and how we express ourselves creatively. This year AR included a range of experiences from contemporary art to currency to closed captioning. As a researcher and designer who has worked with AR for the past 13 years, I’ve gathered my picks for the Top 8 of 2018 (in no particular order).

These AR projects, apps, products, and tools are helping to rewrite reality and the way we engage with our immediate surroundings and world at large. Each example highlights one of three things I believe AR does really well: Visualization, Annotation, and Storytelling.

VISUALIZATION:

AR is a powerful visualization tool. It allows you to bring into reality an object or concept that is otherwise imagined, inaccessible, or difficult to grasp, and it can even help make the invisible visible.

1. Big Ben Snapchat AR Lens

Snapchat is using visualization to temporarily reveal something that is currently unavailable in reality: Big Ben. The popular landmark in London is under repair until 2021 and is currently covered with scaffolding. AR is used to remove the scaffolding, giving Snapchat users a peek at the restored icon.

If you look closely at the video, you’ll see the clouds are replicated and added behind the virtual tower to mask the physical scaffolding and heighten the illusion. Industry veterans like Matt Miesnieks, CEO of 6D.ai, noticed these technical details, commenting on a post I shared: “The way the sky and cloud is backfilled in where there is physical scaffolding is really impressive,” said Miesnieks. “Partial diminished reality.”

AR helps bring Big Ben back to life temporarily for the holidays with both sight and sound. A nice feature in the Snapchat Lens is the accurate time being shown on the Great Clock’s face, with bell chimes every 15 minutes and on the hour. Big Ben has been silent during the repairs and the clock hands have been removed. As one of London’s most recognizable and photographed landmarks, the restoration and maintenance have left many tourists disappointed. Now with AR you won’t have to wait until 2021 to snap that selfie: the front-facing camera also adds a virtual hat to users. Tourists, rejoice!

2. Notable Women, by former Treasurer of the United States Rosie Rios, Google Creative Lab, and Nexus Studios

The Notable Women AR app helps you discover the accomplishments of activists, artists, scientists, business leaders, writers, civic leaders and more—right on the money in your wallet. Visualization is used to see 100 historic American women where they’ve historically been left out: on U.S. currency. AR is used to “swap out the faces we know for the faces we all should,” as the Notable Women website reads. 

One of these women is Sojourner Truth who campaigned for abolition and women’s rights, becoming famous for her “Ain’t I a Woman” speech at a women’s rights convention in 1851. Named Isabella when she was born into slavery, she changed her name after gaining her freedom.

The 100 historic women were selected from the Teachers Righting History database, a collection of women whom the American people recommended to appear on actual U.S. currency during Rosie Rios’s time at the U.S. Department of the Treasury. While the app is designed with teachers and their students in mind (lesson plans are available here), the content is for everyone.

3. Step Inside the Thai Cave in Augmented Reality, The New York Times


This example from The New York Times proves AR doesn’t have to be overly complicated to be effective — the use of AR was extremely powerful in its simplicity. The AR story visualized to scale the size of the claustrophobic cave openings rescuers traversed. Part of the brilliance of the visualization experience was that you could see these tunnels in reference to the size of your own body: this immediately made it personal.

“The NYT AR piece on the cave rescue was an excellent application of the medium,” said Jon Wiley, Director, Google AR/VR. “My son, who is about the same age as the kids that were trapped, had been hearing about this story and we talked about the challenges. The AR experience, where we could see the exact scale of the cave passage, was fascinating, allowing us to see for ourselves just what a terrifying challenge it was to rescue those kids.”

Graham Roberts is the director of immersive platforms storytelling at NYT. “We add value with the AR moments by keeping them extremely simple and allowing them to do one thing well,” Roberts said. “In this case, being able to present cave slices from a survey in real scale projected into the context of your immediate environment. This is something that could not be done without this technology, and gives a more intuitive understanding of an inherently spatial story.”

ANNOTATION:

Annotation with AR can guide you through completing a task, help you navigate a new environment, or even provide real-time descriptions of what’s happening around you.

4. Smart Caption Glasses, National Theatre, Accenture, Epson

London’s National Theatre is using AR to help make its performances more accessible for people who are deaf and hard of hearing. When wearing a pair of glasses (designed and manufactured by Epson), users see a transcript of the dialogue and descriptions of the sound from a performance displayed on the lenses in real time.

Audience members with hearing loss previously relied on dialogue screens at the side of the auditorium (only available for a handful of performances) where you had to switch your attention from the stage to the captions on the screen. A member of the audience named Deepa who tried the smart caption glasses at the National Theatre commented, “I thought that the freedom to be able to read the captions and see the actors’ face was a big improvement. I’ve never felt that I was able to do that until I wore the glasses.”

The glasses are currently available for the productions “War Horse” and “Hadestown” and from January 2019 they will be available for booking on most new shows in the venue’s three theatres. Details on how to book the glasses are available here.

5. Bose AR


Bose also wants to annotate the world around you using a pair of AR glasses (called Frames), but they want to do that with audio instead of visuals. “Bose AR represents a new kind of augmented reality — one that’s made for anyone and every day,” said John Gordon, vice president of the Consumer Electronics Division at Bose. “It places audio in your surroundings, not digital images, so you can focus on the amazing world around you — rather than a tiny display. It knows which way you’re facing, and can instantly connect that place and time with endless possibilities for travel, learning, music and more.”

This week the company shared more details on Frames and the Bose AR SDK with developers in a webinar led by Chuck Freedman, Developer Relations Lead at Bose. Three key areas were highlighted: travel, fitness, and gaming. 

Freedman noted how Frames let you stay heads-up and hands-free when travelling, absorbing the sights and sounds of the city while the glasses tell you what you’re looking at, point out the significance of buildings, and perhaps even have a local virtually accompany you (in April 2018 Bose acquired Andrew Mason’s walking-tour startup Detour, a company featured in my book “Augmented Human“). Bose is also working with companies on fitness-coaching experiences, unlocking data not available before, such as how many reps a person does and how long a pose is held, to help you meet your workout goals. Freedman hinted at an upcoming partnership with a prominent indie game developer and also referenced a pitch competition launching in January.

STORYTELLING:

AR makes possible new modes of storytelling and creative expression with experiences unfolding in both our homes and public spaces — it changes the way we tell, share, and even remember stories.

6. ReBlink at the Art Gallery of Ontario (AGO) by Impossible Things 


The AR exhibition ReBlink was so good it was extended into 2018. Originally scheduled to end in December 2017, the exhibit was such a hit that it ran through April 2018 (making its way onto this list). Apple CEO Tim Cook enjoyed a private tour of the show in January this year while visiting Toronto.

ReBlink blends traditional art with AR to remix classic paintings in the Art Gallery of Ontario’s permanent collection. Using a custom app for smartphones and tablets, visitors use their device’s camera to unlock artist Alex Mayhew’s modern twists on historical works of art. One of the things I particularly loved about ReBlink was the artful conversation between the old and the new and the delightful details hidden in the reimagining of the works of art. I was surprised and impressed by the three-dimensional quality of the AR work; despite the original paintings being flat, the AR content had an incredible amount of depth, as though you were peering into a diorama. The paintings came alive with AR.

Artists have the unique ability to take the ordinary and transform it into something extraordinary, and to show us the world in a completely new way. AR does too, so artists and AR are a perfect match, as ReBlink exemplifies.

7. Enter The Room, The International Committee of the Red Cross (ICRC) in partnership with Nedd

Enter The Room sparks a dialogue about war; it’s an app that hopes to build empathy. Enter The Room was featured by Apple’s editorial team on the US App Store highlighting how the app “illustrates the devastating effects of war without depicting soldiers, weapons, or even the victims of violence.” The story is instead told through the room of a child and unfolds over the course of four years.

Using your iOS device, you walk through a virtual doorway that appears in your space. You find yourself in a child’s bedroom where war wages just outside the window. With AR, this room was in my home now too. I was there. And because I didn’t have the ability to control anything in the room like I might in a game, I felt helpless and afraid. I empathized with the effects of war on that child and family. 

In February I wrote an article on “Augmented Reality Storytelling: The Body and Memory Making.” I described how the emotions from an AR story stay with you long after you put your device down, or take the headset off, transforming into memories that are virtually scribed onto your environment, like an augmented palimpsest. We’re no longer just designing stories; we’re now designing memories with AR. AR stories leave a virtual imprint in our physical spaces, and this is part of Enter The Room’s power.

8. Project Aero, Adobe

Project Aero is an AR authoring tool from Adobe that makes it easier for designers to create immersive content using popular tools they are already deeply familiar with, such as Adobe Photoshop CC and Dimension CC. Project Aero is currently in private beta (you can request early access here). I’m thrilled to be among the first group to create AR experiences with Project Aero (you can see some of my work in my Instagram posts and stories). I love how fast and easy it is to bring ideas from my imagination to life.

My friend Ori Inbar believes 2019 will be the year of AR creators and I agree. Tools like Project Aero are a big step in helping to democratize AR as a new creative medium.

Real or AR and Augmented Human in 2018

Personal AR victories for 2018 included starting my game “Real or AR” on Instagram, which I play with audiences from around the world in my Instagram story and on global stages at my keynotes (hello to my new friends in Mexico, Finland, Japan, England, Brazil, Belgium, Portugal, Canada, and the USA!). Wagner James Au wrote an excellent piece about the game here. Follow @AugmentedHuman on Instagram and play along to help strengthen your virtual muscle for 2019 and beyond. 

Also in 2018 my book “Augmented Human” (published by O’Reilly) was translated into Korean and traditional Chinese, and was just released in India. Mike McCready, President of the Alberta Chapter of the VR/AR Association, has started a VR Book Club, and “Augmented Human” will be the first book the club meets to discuss in VR in January 2019. Join people from 35 cities and 5 continents by RSVP’ing here.

Congrats to the AR teams listed above in the Top 8 of 2018 and to the community and industry working hard to elevate AR as a medium that extends, enriches, and enhances the world around us. 

Need help making your AR work the best of 2019? I consult with individuals and organizations at all project stages as an AR subject matter expert. Get in touch. Hello [at] augmentedstories [dot] com

Helen Papagiannis

Augmented Reality Storytelling: The Body and Memory Making

When designing AR experiences and AR stories, we too often forget something very important: the human body.

Augmented content is not two-dimensional or flat; it unfolds in our physical space, in our personal surroundings. We’re walking around it and crouching on the floor, exploring it from different angles and heights. AR stories are about our body in relation to the virtual constructions inhabiting our space as much as they are about the content presented.

Earlier this month, The New York Times debuted their first AR-enabled article within their iOS app, a preview piece for the Winter Olympics in Pyeongchang, South Korea. Using an iPhone or iPad, readers can meet Olympic athletes in AR—figure skater Nathan Chen, big air snowboarder Anna Gasser, short track speed skater J.R. Celski, and hockey goalie Alex Rigsby—as if they were paused mid-performance.

John Branch, the author of the NYT AR article, observes how, despite all of the camera angles, watching sports on television creates an experience where you are “passively cocooned on the couch as a mere spectator to miniaturized athletes squeezed through a two-dimensional plane.” The NYT has readers moving and actively engaging with the content in AR, walking around the room where you’re reading the story. Far from being “cocooned on the couch”, you’re crawling on the floor to look speed skater J.R. Celski in the eye. And the 3-D visualizations are not miniaturized like on your TV; they’re sized true to life, and to scale. For example, you’re looking up at figure skater Nathan Chen as he appears 20 inches off the ground in your room, the height he would be mid quadruple jump.

The NYT gets this aspect of AR storytelling right: it’s not just about the athletes’ bodies and their form, it’s also about the way you’re maneuvering your body in conversation with the story and your space as you experience the content.

But there’s something else happening here that we also need to take into account when designing AR experiences.

I was recently chatting with Theresa Poulson about AR storytelling and my new book Augmented Human (Theresa is developing an incubator for creators to advance emerging forms of non-fiction storytelling at Video Lab West). We were discussing the NYT AR experience and Theresa said something I found very intriguing and something that I believe is often overlooked in AR. She mentioned how she walks past the place in her office daily where she encountered the 3D Olympic athlete and she remembers it as though it really happened there in that room, which it in fact did.

As I shared in my AR keynote at FutureX Live 2017 in Atlanta, we’re no longer just designing stories, we’re now designing memories with AR.

In 2016, the first experiences for the Microsoft HoloLens developer edition were introduced, including a game called Fragments, a crime drama that plays out in your physical environment and has you searching for clues in your space to solve the mystery. Kudo Tsunoda, CVP Next Gen Experiences, Windows and Devices Group, Microsoft, said, “Trust me, the first time one of our Fragments characters comes in to your home, sits down on your sofa, and strikes up a conversation with you it is an unforgettable experience.” It really is, and I especially remember the virtual rats.

“Fragments blurs the line between the digital world and the real world more than any other experience we built,” said Tsunoda. “When your living room has been used as the set for a story, it generates memories for you of what digitally happened in your space like it was real. It is an experience that bridges the uncanny valley of your mind and delivers a new form of storytelling like never before.”

There’s a higher level of emotional engagement with experiences like Fragments because the story is unique to your space, the position of your body, and your gaze. There is a direct contextual relationship with content responding to you and your environment. The way you experience Fragments in your home will be different from the way I experience it in my home. Spatial mapping and custom artificial intelligence allow a room’s layout to influence the placement of virtual content in the game, such as a piece of evidence hidden behind your furniture.
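That layout-aware placement can be sketched in a few lines. This is a hypothetical illustration, not Fragments’ actual algorithm: it assumes the game has already run a line-of-sight test against the room’s spatial mesh for each candidate spawn point, and all names here are my own.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class SpawnPoint:
    name: str
    occluded_from_player: bool  # result of a line-of-sight test against the room mesh
    distance_m: float           # distance from the player's current position

def choose_hiding_spot(candidates: List[SpawnPoint]) -> Optional[SpawnPoint]:
    """Pick the farthest spawn point the player cannot currently see,
    so a piece of virtual evidence ends up 'hidden behind your furniture'."""
    hidden = [c for c in candidates if c.occluded_from_player]
    if not hidden:
        return None  # nothing in this room to hide behind
    return max(hidden, key=lambda c: c.distance_m)
```

Because every living room produces a different mesh, the same logic yields a different hiding spot in your home than in mine, which is exactly why the experience feels unique to your space.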

The emotions from the game stay with you long after you take the headset off, transforming into memories that are virtually scribed onto your environment, like an augmented palimpsest. Hence the importance of being cognizant of this fact, and conscientious, as we continue to design and develop AR stories and experiences. The audience is inviting you into their physical and mental homes, and your work will leave a virtual footprint there. With both public and private spaces becoming stages for AR stories, let’s remember: always be generous and kind to your user. It will leave a lasting impression.

Let’s continue the conversation. I’m @ARstories on Twitter.

Augmented Reality: Beyond Conventional Time and Space

There’s a wonderful spirit of invention in the Augmented Reality (AR) community right now — the incredible experimentation and work being done with Apple’s ARKit is tremendously exciting. It’s truly thrilling to see AR accelerate both creatively and technically. Keep the demos coming!

Having worked with AR for 12 years now, I’ve found it amazing to experience AR’s evolution (read all about it and where things are headed in Augmented Human). It’s also been fun digging back into the archives recently to revisit some of my early AR prototypes from 2005 and later (especially the pre-iPhone projects, and thinking about ARKit applications today).

Zach Lieberman’s ARKit experiments and process have been particularly inspiring to watch. Lieberman’s AR camera app test (video below) brought to mind artist David Hockney’s stunning and innovative photocollages — referred to as ‘joiners’ — from the 1980s. Hockney’s joiners were a strong influence in my early AR prototypes. I was dazzled by Hockney’s approach to representing time and space, amplifying the abstraction and dynamism of Cubism, and building on the work of artists like Pablo Picasso and Georges Braque.


Image: David Hockney, The Skater, 1984, photographic collage.

Hockney said of his photocollages, “I realized that this sort of picture came closer to how we actually see, which is to say, not all-at-once but rather in discrete, separate glimpses which we then build up into our continuous experience of the world”. Hockney presented a new way of seeing via the camera, one that mirrored the way we see in reality: through multiple glimpses that we piece together. AR, too, has the potential to rethink and present a new way of seeing and interacting with our world.


Image: David Hockney, Merced River, Yosemite Valley, Sept. 1982, photographic collage.

My AR Joiners series (2008-2009) paid homage to Hockney’s work. The AR Joiners extended Hockney’s concepts, using 2D video clips in AR with individual paper AR markers overlapping to create one larger collaged AR scene. The short video clips that composed the AR Joiners were each recorded over a series of separate moments (as opposed to one long video take cut into multiple fragments running on the same timeline). This was a conscious design choice: the AR Joiners were about the body moving in time, akin to Hockney’s photocollage process, with distinct moments and views accumulating into a total memory of the space and experience across time. (Read more about the AR Joiners in a paper I presented at ISMAR 2009, “Augmented Reality (AR) Joiners, a Novel Expanded Cinematic Form,” published by IEEE.)

Hockney’s joiners, the AR Joiners, and the experiments we’re seeing today with ARKit create new visual conventions beyond traditional time and space, all working toward building a novel language of AR. Another contemporary example is floatO, an iOS photography app by artist Dan Monaghan that uses ARKit.

In Architectures of the Senses: Neo-baroque Entertainment Spectacles (2003), Angela Ndalianis writes,

“The baroque’s difference from classical systems lies in the refusal to respect the limits of the frame. Instead, it intends to invade spaces in every direction, to perforate it, to become as one with all its possibilities.”

Ndalianis’s description of the baroque aligns quite nicely with Hockney’s joiners, the AR Joiners, and even Lieberman’s and Monaghan’s ARKit explorations; each of these works demonstrates ways of moving beyond the limits of the single photographic frame, expanding time in multiple directions, and puncturing conventional space.

But perforating the boundaries of reality doesn’t stop here: to truly grow the possibilities in AR, we will need to move past strictly vision-based experiments and engage the entire human sensorium with auditory, haptic, gustatory, olfactory, and visual experiences (in Augmented Human, there is a chapter dedicated to each of the senses and the opportunities with AR).

This is truly just the beginning of the dynamic, shape-shifting, and wonder-inducing new reality that is to come. I can’t wait to see, hear, touch, smell, and taste what’s next.

Let’s continue the conversation on Twitter: I’m @ARstories.


The Future of AR is… Sophisticated and Beautiful

This week, Wareable invited me to contribute to their Augmented Reality (AR) week feature. Here’s my vision for the future of AR:

“My prediction takes the form of my hopes and wishes for AR, and at its core what AR as an experience and a technology needs to be and do to truly advance.

The future of AR is sophisticated and beautiful. It enhances and is in sync with the physical world; it does not replace or supplant it. It does not overload; it aids and delights with elegance. It creates goodness, uplifting and enriching our lives. It ignites and invites curiosity and creativity. This is what we must strive for. May these new realities be deeply fulfilling and greatly benefit humanity.”

Thank you Wareable for including me and to each of the contributors for their thoughtful predictions. Read the full article here.

Last week, The Toronto Star interviewed me about the Art Gallery of Ontario’s (AGO) AR exhibit “ReBlink.” I shared my thoughts on the importance of artists working with AR (which I go into more depth on in my book Augmented Human):

“Artists have the unique ability to take the ordinary and transform it into something extraordinary, and to show us the world in a completely new way. Augmented Reality does too. So AR and artists are a perfect match,” said Helen Papagiannis, an AR expert and author of Augmented Human: How Technology is Shaping the New Reality. “What’s next is an exploration of AR storytelling beyond just the visual: audio, touch, smell and taste.”

I can’t wait for you to read Augmented Human, in print September 2017. Here’s a post on why I wrote the book and who it’s for, with excerpts from the Preface.


Augmenting the human experience: AR, wearable tech, and the IoT

As augmented reality technologies emerge, we must place the focus on serving human needs.

Augmented reality (AR), wearable technology, and the Internet of Things (IoT) are all really about human augmentation. They are coming together to create a new reality that will forever change the way we experience the world. As these technologies emerge, we must place the focus on serving human needs.

The Internet of Things and Humans

Tim O’Reilly suggested the word “Humans” be appended to the term IoT. “This is a powerful way to think about the Internet of Things because it focuses the mind on the human experience of it, not just the things themselves,” wrote O’Reilly. “My point is that when you think about the Internet of Things, you should be thinking about the complex system of interaction between humans and things, and asking yourself how sensors, cloud intelligence, and actuators (which may be other humans for now) make it possible to do things differently.”

I share O’Reilly’s vision for the IoTH and propose we extend this perspective and apply it to the new AR that is emerging: let’s take the focus away from the technology and instead emphasize the human experience.

The definition of AR we have come to understand is a digital layer of information (including images, text, video, and 3D animations) viewed on top of the physical world through a smartphone, tablet, or eyewear. This definition of AR is expanding to include things like wearable technology, sensors, and artificial intelligence (AI) to interpret your surroundings and deliver a contextual experience that is meaningful and unique to you. It’s about a new sensory awareness, deeper intelligence, and heightened interaction with our world and each other.

Seeing the world in new ways

We are seeing AR pop up in all facets of life, from health and gaming to communication and travel. Most AR applications today can be found in your pocket on mobile devices, enabling you to explore the physical world around you, untethered from your desktop computer. One of the earliest applications of AR to deliver a helpful experience was Word Lens. The application allows you to point your smartphone at printed text in a foreign language, such as a road sign or a menu, and translate it on the fly into the language of your choice. Suddenly, you are more deeply immersed and engaged with your surroundings via a newfound contextual understanding assisted by technology.

Word Lens solves a human need. What if this same concept of using technology to augment your experience was extended to include other types of sensors, data, and networks? We are beginning to see examples of this, particularly in health care and wearable tech, with a higher goal of applying technology to help people live better lives. A perfect example of thought leaders exploring this new frontier is Rajiv Mongia, director of the Intel RealSense Interaction Design Group. Mongia and his team have developed a wearable prototype to help people with low or no vision gain a better sense of their surroundings. Combining a camera, computer vision, and sensors worn on the human body, the prototype is able to “see” objects within a few yards of you and tell you approximately where an object is located: high, low, left, or right, and whether the object is moving away or getting closer.

This is all communicated to you through vibration motors embedded into the wearable. The tactile feedback you experience is comparable to the vibration mode on your mobile phone, with the intensity corresponding to how close an object is to you. For example, if a wall or person is near you, the vibration is stronger, and if it’s farther away, it’s less intense. Mongia said that people who’ve tried the prototype say it has promise, that it augments their senses and helps them to “feel” the environment around them.
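As a rough sketch of the mapping described above (the names, the linear falloff, and the 3-metre range are my own illustrative assumptions, not Intel’s implementation), nearer obstacles drive stronger vibration:

```python
def vibration_intensity(distance_m: float, max_range_m: float = 3.0) -> float:
    """Map an obstacle's distance to a motor strength in [0, 1].

    Objects at or beyond max_range_m give no feedback; an object
    touching the wearer gives full-strength feedback."""
    if distance_m >= max_range_m:
        return 0.0
    if distance_m <= 0.0:
        return 1.0
    # Linear falloff: closer means stronger, like a phone's vibration mode.
    return 1.0 - distance_m / max_range_m

def describe_position(x_offset: float, y_offset: float) -> str:
    """Coarse location label (high/low, left/right) from camera-space offsets."""
    vertical = "high" if y_offset > 0 else "low"
    horizontal = "left" if x_offset < 0 else "right"
    return f"{vertical}-{horizontal}"
```

Under these assumptions, a wall 1.5 metres away would buzz at half strength, with the intensity ramping up as you approach it.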

Advancing augmented reality for humanity

The Intel prototype is an example of empowering humans through technology. In developing the wearable system, Mongia asked, “If we can bring vision to PCs and tablets, why not use that same technology to help people see?” This question exemplifies the spirit of the Internet of Things and Humans by giving people greater access to computer intelligence and emphasizing the human experience.

This greater goal will require seeing beyond just the technology and looking at systems of interaction to better enable and serve human needs. Tim O’Reilly has described Uber as an early IoT company. “Most people would say it is not; it’s just a pair of smartphone apps connecting a passenger and driver. But imagine for a moment the consumer end of the Uber app as it is today, and on the other end, a self-driving car. You would immediately see that as IoT.” Uber is a company that is built around location awareness. O’Reilly explained, “An Uber driver is an augmented taxi driver, with real-time location awareness. An Uber passenger is an augmented passenger, who knows when the cab will show up.”

While Uber strives to provide their users with an experience of convenience and visibility, there are other smartphone applications available today that use the reach of mobile and the power of social networking to truly help people. Be My Eyes, for example, is a mobile app that connects a person who is visually impaired with a sighted person to provide assistance. Using a live video connection, a sighted helper is able to see and describe vocally what a visually impaired person is facing. Since January 2015, more than 113,000 volunteers have signed up to help close to 10,000 visually impaired people around the world in 80 languages.

Be My Eyes is an early AR application in the same way O’Reilly described Uber as an early IoT company. Similar to Uber being more likely identified as IoT if a self-driving car was used, Be My Eyes would more likely be considered AR if a computer was using AI to identify what you were looking at. Apps like Be My Eyes are significant because they point the way to a new altruistic augmentation of reality building on the growth of the sharing economy, the power of our devices, and humans working together with computers to advance AR for humanity.

Read “Augmented Human: How Technology is Shaping the New Reality.”