Augmenting the human experience: AR, wearable tech, and the IoT

As augmented reality technologies emerge, we must place the focus on serving human needs.

Augmented reality (AR), wearable technology, and the Internet of Things (IoT) are all really about human augmentation. They are coming together to create a new reality that will forever change the way we experience the world. As these technologies emerge, we must place the focus on serving human needs.

The Internet of Things and Humans

Tim O’Reilly suggested the word “Humans” be appended to the term IoT. “This is a powerful way to think about the Internet of Things because it focuses the mind on the human experience of it, not just the things themselves,” wrote O’Reilly. “My point is that when you think about the Internet of Things, you should be thinking about the complex system of interaction between humans and things, and asking yourself how sensors, cloud intelligence, and actuators (which may be other humans for now) make it possible to do things differently.”

I share O’Reilly’s vision for the IoTH and propose we extend this perspective to the new AR that is emerging: let’s take the focus away from the technology and instead emphasize the human experience.

The definition of AR we have come to understand is a digital layer of information (including images, text, video, and 3D animations) viewed on top of the physical world through a smartphone, tablet, or eyewear. This definition of AR is expanding to include things like wearable technology, sensors, and artificial intelligence (AI) to interpret your surroundings and deliver a contextual experience that is meaningful and unique to you. It’s about a new sensory awareness, deeper intelligence, and heightened interaction with our world and each other.

Seeing the world in new ways

We are seeing AR pop up in all facets of life, from health and gaming to communication and travel. Most AR applications today can be found in your pocket on mobile devices, enabling you to explore the physical world around you, untethered from your desktop computer. One of the earliest AR applications to deliver a genuinely helpful experience was Word Lens. The application allows you to point your smartphone at printed text in a foreign language, such as a road sign or a menu, and translate it on the fly into the language of your choice. Suddenly, you are more deeply immersed and engaged with your surroundings via a newfound contextual understanding assisted by technology.
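Word Lens’s own pipeline is proprietary, but the basic loop it popularized — recognize printed text in a camera frame, translate it, and draw the translation back over the scene — can be sketched in a few lines. The snippet below is a minimal, hypothetical sketch using the open source Tesseract OCR engine; translate() is a placeholder for whichever translation service you would actually call, and the language codes are illustrative.

```python
# A minimal sketch of the Word Lens idea: read printed text from a camera
# frame, translate it, and draw the result back over the image. Word Lens's
# actual pipeline is proprietary; translate() below is a hypothetical
# placeholder for whatever translation service you plug in.
import cv2            # OpenCV for camera capture and drawing
import pytesseract    # Tesseract OCR bindings

def translate(text, target_lang="en"):
    """Hypothetical stand-in for a real translation API."""
    return text  # replace with a call to your translation service

def augment_frame(frame, source_lang="spa"):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    text = pytesseract.image_to_string(gray, lang=source_lang).strip()
    if text:
        translated = translate(text)
        # Overlay the translation near the top of the frame.
        cv2.putText(frame, translated, (10, 30),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    return frame

cap = cv2.VideoCapture(0)   # default camera
ok, frame = cap.read()
if ok:
    cv2.imwrite("augmented.png", augment_frame(frame))
cap.release()
```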

Word Lens solves a human need. What if this same concept of using technology to augment your experience was extended to include other types of sensors, data, and networks? We are beginning to see examples of this, particularly in health care and wearable tech, with a higher goal of applying technology to help people live better lives. A perfect example of thought leaders exploring this new frontier is Rajiv Mongia, director of the Intel RealSense Interaction Design Group. Mongia and his team have developed a wearable prototype to help people with low or no vision gain a better sense of their surroundings. Combining a camera, computer vision, and sensors worn on the human body, the prototype is able to “see” objects within a few yards of you and tell you approximately where an object is located: high, low, left, or right, and whether the object is moving away or getting closer.

This is all communicated to you through vibration motors embedded into the wearable. The tactile feedback you experience is comparable to the vibration mode on your mobile phone, with the intensity corresponding to how close an object is to you. For example, if a wall or person is near you, the vibration is stronger, and if it’s farther away, it’s less intense. Mongia said that people who’ve tried the prototype say it has promise, that it augments their senses and helps them to “feel” the environment around them.
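Intel has not published the prototype’s code, but the mapping Mongia describes — distance and position reduced to vibration strength and a coarse direction — is straightforward to sketch. Everything below (function names, thresholds, the 0–255 motor scale) is an illustrative assumption, not the RealSense team’s implementation.

```python
# Illustrative sketch of mapping a depth reading to haptic feedback:
# closer objects produce stronger vibration, and the object's position in
# the frame is reduced to a coarse direction (high/low/left/right).
# Thresholds and the 0-255 motor scale are assumptions for illustration.

def vibration_intensity(distance_m, max_range_m=3.0):
    """Return a motor duty value 0-255; stronger when the object is closer."""
    if distance_m >= max_range_m:
        return 0
    return int(255 * (1.0 - distance_m / max_range_m))

def coarse_direction(x, y, width, height):
    """Reduce the object's pixel position to a direction the wearer can feel."""
    horizontal = "left" if x < width / 3 else "right" if x > 2 * width / 3 else "center"
    vertical = "high" if y < height / 3 else "low" if y > 2 * height / 3 else "middle"
    return f"{vertical}-{horizontal}"

def approaching(prev_distance_m, distance_m):
    """True if the object has moved closer since the last frame."""
    return distance_m < prev_distance_m

# Example: an object at 1.2 m, slightly left of center, that was at 1.5 m before.
print(vibration_intensity(1.2), coarse_direction(100, 240, 640, 480), approaching(1.5, 1.2))
```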

Advancing augmented reality for humanity

The Intel prototype is an example of empowering humans through technology. In developing the wearable system, Mongia asked, “If we can bring vision to PCs and tablets, why not use that same technology to help people see?” This question exemplifies the spirit of the Internet of Things and Humans by giving people greater access to computer intelligence and emphasizing the human experience.

This greater goal will require seeing beyond just the technology and looking at systems of interaction to better enable and serve human needs. Tim O’Reilly has described Uber as an early IoT company. “Most people would say it is not; it’s just a pair of smartphone apps connecting a passenger and driver. But imagine for a moment the consumer end of the Uber app as it is today, and on the other end, a self-driving car. You would immediately see that as IoT.” Uber is a company that is built around location awareness. O’Reilly explained, “An Uber driver is an augmented taxi driver, with real-time location awareness. An Uber passenger is an augmented passenger, who knows when the cab will show up.”

While Uber strives to provide its users with an experience of convenience and visibility, there are other smartphone applications available today that use the reach of mobile and the power of social networking to truly help people. Be My Eyes, for example, is a mobile app that connects a person who is visually impaired with a sighted person who can provide assistance. Using a live video connection, a sighted helper is able to see and vocally describe what the visually impaired person is facing. Since January 2015, more than 113,000 volunteers have signed up to help close to 10,000 visually impaired people around the world in 80 languages.

Be My Eyes is an early AR application in the same way O’Reilly described Uber as an early IoT company. Just as Uber would more readily be identified as IoT if a self-driving car were on the other end, Be My Eyes would more readily be considered AR if a computer were using AI to identify what you were looking at. Apps like Be My Eyes are significant because they point the way to a new, altruistic augmentation of reality, building on the growth of the sharing economy, the power of our devices, and humans working together with computers to advance AR for humanity.
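To make that hypothetical concrete: an AI-assisted variant of Be My Eyes might start with something as simple as an off-the-shelf image classifier naming the most likely objects in a snapshot. The sketch below uses torchvision’s pretrained MobileNetV2 purely as a stand-in; a real assistive product would need far richer scene understanding than ImageNet labels.

```python
# A hedged sketch of the "computer using AI to identify what you're looking
# at" idea: classify a photo with a pretrained MobileNetV2 and read out the
# top guesses. A real assistive product would use far richer scene
# understanding; this only names ImageNet categories.
import torch
from PIL import Image
from torchvision.models import mobilenet_v2, MobileNet_V2_Weights

weights = MobileNet_V2_Weights.DEFAULT
model = mobilenet_v2(weights=weights).eval()
preprocess = weights.transforms()
labels = weights.meta["categories"]

def describe(image_path, top_k=3):
    """Return the top_k (label, probability) guesses for what the image shows."""
    image = Image.open(image_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)   # add a batch dimension
    with torch.no_grad():
        probs = model(batch).softmax(dim=1)[0]
    values, indices = probs.topk(top_k)
    return [(labels[int(i)], float(v)) for v, i in zip(values, indices)]

print(describe("snapshot.jpg"))  # e.g. [("street sign", 0.62), ...]
```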

The state of Augmented Reality

A look at AR today and how we need to design it for tomorrow.

Unlike virtual reality (VR), augmented reality (AR) provides a gateway to a new dimension without the need to leave our physical world behind. We still see the real world around us in AR, whereas in VR, the real world is completely blocked out and replaced by a new world that immerses the user in a computer generated environment.

AR Today

The most common definition of AR to date is a digital overlay on top of the real world, consisting of computer graphics, text, video, and audio, which is interactive in real time. This is experienced through a smartphone, tablet, computer, or AR eyewear equipped with software and a camera. Examples of AR today include the translation of signs or menus into the language of your choice, pointing at and identifying stars and planets in the night sky, and delving deeper into a museum exhibit with an interactive AR guide. AR presents the opportunity to better understand and experience our world in unprecedented ways.

AR is rapidly gaining momentum (and extreme amounts of funding) with great advances and opportunities in science, design, and business. It is not often that a whole new communications medium is introduced to the world. AR will have a profound effect on the way we live, work, and play. Now is the time to imagine, design, and build our virtual future.

Working with AR for a decade as a Ph.D. researcher, designer, and technology evangelist, I’ve watched AR evolve in regard to both technology (software and hardware) and experience design. An AR experience is commonly triggered by tracking something in the physical environment that activates the AR content. Images, GPS locations, and the human body and face are all things that can be tracked to initiate an AR experience, with more complex things like emotion and voice expanding this list. We are seeing a rise in AR hardware, with a particular emphasis on digital eyewear that includes gesture interaction from companies like Magic Leap and Microsoft with the recently announced HoloLens headset.
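Image-based triggering is the most common of those mechanisms. As a rough illustration (not any particular AR SDK’s API), the sketch below uses OpenCV’s ORB features to decide whether a registered target image appears in a camera frame — the moment at which an AR experience would be activated. The file name and match thresholds are assumptions.

```python
# Rough illustration of image-based triggering with OpenCV: extract ORB
# features from a registered target image, match them against a camera
# frame, and "activate" the AR content when enough features agree. The
# match thresholds are assumptions; real AR SDKs use more robust tracking.
import cv2

orb = cv2.ORB_create(nfeatures=1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

target = cv2.imread("poster.jpg", cv2.IMREAD_GRAYSCALE)   # the registered trigger image
target_kp, target_des = orb.detectAndCompute(target, None)

def target_detected(frame_bgr, min_matches=40):
    """Return True when the trigger image is likely visible in this frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    _, frame_des = orb.detectAndCompute(gray, None)
    if frame_des is None:
        return False
    matches = matcher.match(target_des, frame_des)
    good = [m for m in matches if m.distance < 50]   # keep close matches only
    return len(good) >= min_matches

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok and target_detected(frame):
    print("Trigger recognized: launch the AR overlay here.")
cap.release()
```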

Designing AR for tomorrow

We are at a moment where we are also seeing a shift from AR as a layer on top of reality to a more immersive contextual experience that combines things like wearable computing, machine learning, and the Internet of Things (IoT). We are moving beyond an experience of holding up our smartphones and seeing three-dimensional animations like dinosaurs appear to examples of assistive technology that help the blind to see and navigate their surroundings. AR is life changing, and there is extreme potential here to design experiences that surpass gimmickry and have a positive effect on humanity.

MIT Media Lab founder Nicholas Negroponte said, “Computing is not about computers anymore. It is about living.” AR, too, is no longer about technology; it’s about defining how we want to live in the real world with this new technology and how we will design experiences that are meaningful and help advance humanity. There is an immediate need for storytellers and designers of all types to aid in defining AR’s trajectory. The technology exists; now it’s about authoring compelling content and crafting meaningful experiences in this new medium.

Keeping it human-centered

For me, it’s about maintaining our humanness in a sea of limitless options within this new medium. We must think critically about how we will place human experience at the center. It’s not about being lost in our devices; it’s about technology receding into the background so that we can engage in human moments.

An article in Forbes by John Hagel and John Seely Brown looked at how IoT can help to enhance human relationships. Hagel and Brown described a scenario (that can be powered with current technology) of “data-augmented human assistance,” where a primary care physician wearing digital eyewear interacts with a patient to listen attentively and maintain eye contact while accessing and documenting relevant data. With the process of data capture and information transfer offloaded into the background, such devices can be applied to improve human relationships. “Practitioners can use technology to get technology out of the way — to move data and information flows to the side and enable better human interaction,” wrote Hagel and Brown, noting how such examples highlight a paradox that is inherent in the IoT: “although technology aims to weave data streams without human intervention, its deeper value comes from connecting people.”

This new wave of AR that combines IoT, big data, and wearable computing also has an incredible opportunity to connect people and create meaningful experiences, whether it’s across distances or being face to face with someone. The future of these new experiences is for us to imagine and build. Reality will be augmented in never-before-seen ways. What do you want it to look like and what role will you play in defining it?

Announcing: “Augmented Human: How Technology is Shaping the New Reality”

Augmented Human

by Helen Papagiannis, Published by O’Reilly

I’m SUPER excited to announce my book, Augmented Human: How Technology is Shaping the New Reality.

You may remember the book being titled, “The 40 Ideas That Will Change Reality”… I’m thrilled to share it’s now morphed into something even BIGGER and is being published by O’Reilly. Pre-orders are available now at this link.

The book looks at how Augmented Reality and this next major technological shift will forever change the way we experience the world. By inspiring design for the best of humanity and the best of technology, Augmented Human is essential reading for designers, technologists, entrepreneurs, business leaders, and anyone who desires a peek at our virtual future.

Father of Wearable Computing Dr. Steve Mann and Grandfather of AR/VR Dr. Tom Furness are two incredible humans who have dedicated their careers to making a positive impact on humanity with emerging technologies and new inventions. I’m incredibly honoured and grateful to have Dr. Furness and Dr. Mann each contributing a foreword to my book.

An early release of the book will be available soon featuring chapters you can read immediately upon pre-order. More news to come on the book in the coming weeks! I truly can’t wait for you to read Augmented Human!

Also check out the first of many articles I’m writing for O’Reilly Radar on the current state of Augmented Reality and designing for the future.

I’m writing this from Augmented World Expo in Silicon Valley where I’m giving a talk on Advancing AR for Humanity on June 10 at 11:30am on the main stage. I hope to see you at one of my upcoming talks in June:

~ Solid Conference in San Francisco, June 23-25
(Save 20% off registration at this link with my code: AFF20)

~ Cannes Lions International Festival of Creativity in France, June 25-26

Thanks for your continued support!
All my best,
~ Helen, @ARstories.

AR and VR: Our Deep Wish to Make the Virtual Real

When we close our eyes at night, we enter a virtual dream world. We can fly, see loved ones who’ve passed, and defy the limits of physical reality. Time, space, and our bodies are different in our dreams. Anything is possible and the rules of the physical waking world no longer apply. Our imagination reigns supreme here. In that moment, it is all real.

As humans, I believe we have a deep-seated desire to push the limits of physical reality to be able to inhabit these dreamscapes. Virtual Reality and Augmented Reality bring us closer to this dream.

We are explorers, we are inventors, we are storytellers, we are world builders. We have an innate curiosity to travel to the edge of our world, beyond our horizons, to seek and create new vistas. The power of the virtual fused with reality can help satisfy this wish. We can now step inside of our imagination and welcome others into the deep recesses of our dream worlds to share that reality.

May we imagine and build an awe-inspiring reality together.

I hope to see you at one of my talks, and I would also love for you to sign up for updates on my upcoming book, The 40 Ideas That Will Change Reality. Let’s continue the conversation: I’m @ARstories on Twitter.

Upcoming Speaking Engagements

Augmented World Expo (AWE) 2015, June 8-10, Silicon Valley, CA.
Main stage talk: “Advancing AR for Humanity.”

Solid Conference 2015, June 23-25, San Francisco, CA.
Main stage talk: “Reality Has Changed.” (Save 20% off registration at this link with my code: AFF20)

Cannes Lions: International Festival of Creativity, June 25-26, Cannes, France.
Main stage panel, Innovation Lions: “Creativity Augmented.” Helen Papagiannis, Chris Milk, and Ted Schilowitz.

Augmented Reality Experience Design & Storytelling: Cues from Japan.

Japan is known as a mecca of inspiration for designers. Michael Peng, co-founder and co-managing director of IDEO Tokyo, recently identified several places in Japan to highlight some experiences that “wow” us as designers, innovators, and human beings.

As we imagine, create, and define the future of Augmented Reality, there are excellent design cues in Peng’s examples that we can draw inspiration from and apply to experience design in AR.

I’ve been working with AR for the past 10 years as a designer and researcher evangelizing this new medium and focusing on storytelling in AR (my Twitter handle is @ARstories). Here are two of my takeaways from Peng’s list, with some ideas on guiding the future of experiences and storytelling in AR.

(*If you’re interested in the future of AR, and hungry for more, this post is just a taste; read more in my upcoming book.)

1. AR and Embedded (Micro) Stories in Things
(Peng lists this as, “1. Pass the Baton – Where Stories are Currency.”)

What does storytelling look like in the augmented Internet of Things and Humans #IoTH?

Appending the word “Humans” to the term #IoT is credited to Tim O’Reilly. He stresses the complex system of interaction between humans and things, and suggests, “This is a powerful way to think about the Internet of Things because it focuses the mind on the human experience of it, not just the things themselves.”

(Photo by Michael Peng)

Peng highlighted the second-hand shop “Pass the Baton” in his list, and noted how the objects for sale are carefully curated stories: a photo of the previous owner and a short anecdote are included on the back of each price tag.

Peng recalled his first visit to the store and commented on how, after reading the description on the back of the price tag on a bracelet (‘I bought this when I went shopping with my mom 5 years ago. It was a special day.’), he felt an “instant connection”, increasing the object’s value.

This approach humanizes the object for sale by embedding a micro story in it and emphasizing the human experience. The story becomes the focus of the experience: a collection of human stories, seemingly ordinary objects made extraordinary by the tales they carry.

Peng wrote:

I left Pass-the-Baton with a full heart and a head full of ideas for how to increase the value of products and experiences through storytelling. I wanted to find ways to experiment with the notion of inherent value by applying new layers of storytelling to everyday experiences and systems.

AR makes this possible.

AR gives us a new way to think about the stories embedded in objects. AR can help make the invisible visible and breathe contextual information into objects. Further, anthropomorphism can be applied as a storytelling device to help humanize technology and our interactions with objects. We have the ability to author an augmented Internet of Things AND Humans. What stories will we tell?
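As a toy illustration of an object carrying its own micro story (the object ID, the recognizer stub, and the story record are invented, with the anecdote borrowed from Peng’s price tag), the augmented version reduces to a lookup: once the object is identified — by image target, NFC tag, or beacon — its story is surfaced in the AR layer.

```python
# Toy sketch: a micro story attached to each recognizable object, surfaced
# the moment the object is identified in the AR view. The object IDs,
# stories, and recognize() stub are invented for illustration.
from dataclasses import dataclass

@dataclass
class MicroStory:
    previous_owner: str
    anecdote: str

STORIES = {
    "bracelet-042": MicroStory(
        previous_owner="Previous owner",
        anecdote="I bought this when I went shopping with my mom 5 years ago. "
                 "It was a special day.",
    ),
}

def recognize(frame):
    """Stub for whatever recognizer (image target, NFC tag, beacon) identifies the object."""
    return "bracelet-042"

def story_overlay(frame):
    """Return the text the AR layer should display for the object in view, if any."""
    object_id = recognize(frame)
    story = STORIES.get(object_id)
    if story is None:
        return None
    return f'{story.previous_owner}: "{story.anecdote}"'

print(story_overlay(frame=None))
```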

2. AR and Macro Storytelling as a Way to Offer Choices
(Peng lists this as, “4. D47 – Where Provenance is Celebrated.”)

Peng described how D47, a restaurant, museum, and design travel store, showcases the best of each region in Japan. For instance, restaurant patrons are asked to choose where their food comes from rather than selecting the things they would like to eat.

(Photo by Michael Peng)

Offering a macro choice, in this case organized by region, is a way to engage the user in a larger story, providing a wider field of view in the content, not just in the technology (where we also want a wider field of view).

Like the example above of the second-hand store presenting a back story for an object, here the bigger back story is the starting point for curating and organizing content and experiences. It allows users to explore experiences through a macro-level entryway, navigating to content they might not otherwise encounter if they started at the micro level. It provides another way to order and navigate through stories, which we can apply to AR experiences.
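A small sketch of that macro-level entryway (the regions and experiences are invented placeholders): AR content organized region-first, so the user’s opening choice is where a story comes from rather than which item to view.

```python
# Small illustration of macro-first navigation: content grouped by region,
# so the user's entry point is "where is it from?" rather than "which item?"
# Regions and experiences here are invented placeholders.
CONTENT_BY_REGION = {
    "Hokkaido": ["lavender-farm tour", "dairy-craft story"],
    "Kyoto": ["tea-house history", "textile-dyeing demo"],
    "Okinawa": ["coral-reef walk", "pottery workshop"],
}

def list_regions():
    """Top-level (macro) choices presented to the user first."""
    return sorted(CONTENT_BY_REGION)

def list_experiences(region):
    """Micro-level content revealed only after a region is chosen."""
    return CONTENT_BY_REGION.get(region, [])

for region in list_regions():
    print(region, "->", list_experiences(region))
```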

Peng wrote that visiting the store encouraged him to think of ways to better celebrate the history and origin of the things we create. There is certainly a trend in retail and restaurants, for example, toward greater transparency about where what we consume comes from: a growing demand for a more immediate “history” and “origin” of things to inform our choices.

What kinds of new experiences can we design in AR if we begin storytelling at the macro level and allow users choices from such a top-level of navigation?

One of the things that tremendously excites me about AR is that it is an entirely new medium and, for now, completely uncharted terrain with no conventions. There is an incredible opportunity here to help define a new era of human computer interaction. The focus to date has been on the technology, not on storytelling. We need to shift gears immediately if we are going to create something lasting and meaningful beyond a fad. We don’t want to end up with a wasteland of empty hardware paperweights that are experience-less. In tandem with the development of the technology, we need excellent storytellers, designers, and artists of all types to imagine and build these future experiences. I hope this is you, because we really need you, now.

Is it? Let’s continue the conversation on Twitter. I’m @ARstories.

Designing the Future of Augmented Reality: slides from PechaKucha


Sometimes you just need to relinquish control.

20 slides, 20 seconds each, and no control of the slide clicker. GO!

I had a fantastic experience at PechaKucha Toronto in December. My presentation, “Designing the Future of Augmented Reality,” is now posted here.

More on the wildly fun International PechaKucha format here.

I have some exciting announcements on upcoming talks and my book to share very soon! Hope to see you at one of these events!

You can also find me on Twitter: I’m @ARstories.

*Update: Feb 24/15: Flattered to be featured as the Presentation of the Day on the PechaKucha website!


Augmented Reality and Virtual Reality: what’s the difference?

AR and VR are often confused with each other, and used interchangeably in the media, but they are significantly different. Let’s break it down:

Augmented Reality (AR): real, physical world

Virtual Reality (VR): computer-generated environment, artificial world

In VR, the user is completely closed off from the physical world, fully immersed in a computer-generated simulation. E.g., Oculus Rift.

In AR, the user is still in their physical space, now with additional digital data layered on top of the real world. E.g., Meta Spaceglasses.

I often begin my presentations by distinguishing between the two. Here’s one of my talks from TEDx in 2010.

The definition of AR is expanding to include things like wearable computing, new types of sensors, artificial intelligence, and machine learning. I call this the second wave of AR. Read about it here in my upcoming book.

And if you’re wondering what the heck the HoloLens is that Microsoft announced, read about it here in my latest article. *Hint: it’s not VR.

Find me on Twitter: I’m @ARstories.