Augmented Reality Experience Design & Storytelling: Cues from Japan.

Japan is known as a mecca of inspiration for designers. Michael Peng, co-founder and co-managing director of IDEO Tokyo, recently highlighted several places in Japan that deliver experiences that “wow” us as designers, innovators, and human beings.

As we imagine, create, and define the future of Augmented Reality, we can draw excellent design cues from the examples Peng lists and apply them to experience design in AR.

I’ve been working with AR for the past 10 years as a designer and researcher evangelizing this new medium and focusing on storytelling in AR (my Twitter handle is @ARstories). Here are 2 of my takeaways from Peng’s list with some ideas on guiding the future of experiences and storytelling in AR.

(*If you’re interested in the future of AR, and hungry for more, this post is just a taste; read more in my upcoming book.)

1. AR and Embedded (Micro) Stories in Things
(Peng lists this as, “1. Pass the Baton – Where Stories are Currency.”)

What does storytelling look like in the augmented Internet of Things and Humans #IoTH?

Appending the word “Humans” to the term #IoT is credited to Tim O’Reilly. He stresses the complex system of interaction between humans and things, and suggests, “This is a powerful way to think about the Internet of Things because it focuses the mind on the human experience of it, not just the things themselves.”

(Photo by Michael Peng)

Peng highlighted the second-hand shop “Pass the Baton” in his list and noted how the objects for sale are carefully curated stories: a photo of the previous owner and a short anecdote are included on the back of each price tag.

Peng recalled his first visit to the store and commented on how, after reading the description on the back of the price tag on a bracelet (‘I bought this when I went shopping with my mom 5 years ago. It was a special day.’), he felt an “instant connection”, increasing the object’s value.

This approach humanizes the object for sale by embedding a micro story in it, and the human experience becomes the focus: a collection of human stories, seemingly ordinary objects made extraordinary by the tales they carry.

Peng wrote:

I left Pass-the-Baton with a full heart and a head full of ideas for how to increase the value of products and experiences through storytelling. I wanted to find ways to experiment with the notion of inherent value by applying new layers of storytelling to everyday experiences and systems.

AR makes this possible.

AR gives us a new way to think about the stories embedded in objects. AR can help make the invisible visible and breathe contextual information into objects. Further, anthropomorphism can be applied as a storytelling device to help humanize technology and our interactions with objects. We have the ability to author an augmented Internet of Things AND Humans. What stories will we tell?
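As a thought experiment, the micro-story idea translates directly into code. Everything below is hypothetical and illustrative (the object ID, the registry, and the recognition callback are my own sketch, not any real AR SDK): an AR layer recognizes an object and surfaces the human story embedded in it, just as Pass the Baton’s price tags do.

```python
# A minimal sketch of embedding micro stories in things (hypothetical,
# not any specific AR framework).
from dataclasses import dataclass

@dataclass
class MicroStory:
    previous_owner: str
    anecdote: str

# An augmented Internet of Things AND Humans could start as simply as a
# registry mapping a recognized object (an image target, NFC tag, etc.)
# to its embedded human story.
story_registry: dict[str, MicroStory] = {}

def attach_story(object_id: str, owner: str, anecdote: str) -> None:
    """Embed a micro story in an object."""
    story_registry[object_id] = MicroStory(owner, anecdote)

def on_object_recognized(object_id: str) -> str:
    """Called when the AR layer recognizes an object; returns overlay text."""
    story = story_registry.get(object_id)
    if story is None:
        return "No story yet -- add yours."
    return f"{story.previous_owner}: {story.anecdote}"

attach_story("bracelet-042", "Previous owner",
             "I bought this when I went shopping with my mom. It was a special day.")
print(on_object_recognized("bracelet-042"))
```

The point of the sketch is that the hard part isn’t the data structure; it’s authoring stories worth overlaying.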

2. AR and Macro Storytelling as a Way to Offer Choices
(Peng lists this as, “4. D47 – Where Provenance is Celebrated.”)

Peng described how D47, a restaurant, museum, and design travel store, showcases the best of each region in Japan. For instance, restaurant patrons are asked to choose where their food comes from rather than selecting the things they would like to eat.

(Photo by Michael Peng)

Offering a macro choice, in this case organized by region, is a way to engage the user in a larger story: it provides a wider field of view in content, not just in the technology (where we also want a wider field of view).

Like the second-hand store above presenting a back story for each object, here the bigger back story is the starting point for curating and organizing content and experiences. It allows users to explore through a macro-level entryway, navigating to content they might never encounter if they started at the micro level. It provides another way to order and navigate stories, one we can apply to AR experiences.
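The macro-first navigation D47 uses can be sketched as a tiny two-level structure. The region names and items below are deliberately generic placeholders, not D47’s actual offerings; the point is that the top-level choice is the story, and items only appear after it.

```python
# A hypothetical sketch of macro-level entryway navigation: choose the
# story (provenance) first, then drill down to concrete items.
catalog = {
    "Region A": ["item 1", "item 2"],
    "Region B": ["item 3"],
}

def macro_entryway():
    """Top-level choices are stories (regions), not individual items."""
    return sorted(catalog)

def drill_down(region):
    """Only after choosing a story does the user see concrete items --
    content they might never reach starting at the micro (item) level."""
    return catalog.get(region, [])

print(macro_entryway())
print(drill_down("Region A"))
```

The same two-level shape could organize an AR experience: the entryway is the larger story, and the overlays are the items it curates.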

Peng wrote that visiting the store encouraged him to think of ways to better celebrate the history and origin of the things we create. There is certainly a trend in retail and restaurants toward greater transparency about where what we consume comes from: a growing demand for a more immediate “history” and “origin” of things to inform our choices.

What kinds of new experiences can we design in AR if we begin storytelling at the macro level and allow users choices from such a top-level of navigation?

One of the things that tremendously excites me about AR is that it is an entirely new medium and, for now, completely uncharted terrain with no conventions. There is an incredible opportunity here to help define a new era of human-computer interaction. To date, the focus has been on the technology, not on storytelling. We need to shift gears immediately if we are going to create something lasting and meaningful beyond a fad; we don’t want to end up with a wasteland of empty hardware paperweights that are experience-less. In tandem with the development of the technology, we need excellent storytellers, designers, and artists of all types to imagine and build these future experiences. I hope this is you, because we really need you, now.

Is it? Let’s continue the conversation on Twitter. I’m @ARstories.

*UPDATE: Purchase “Augmented Human” from:

Amazon (USA)
Amazon (Canada)
Indigo (Canada)
Barnes and Noble (USA)
O’Reilly (Digital, International)

Latest AR, VR, Wearable and Digital Tech articles & interviews


Will 2015 Be The Year of Wearable Technology? The Toronto Star.

Will Augmented Reality Make Us Masters of the Information Age? iQ by Intel.

Portraits of Strength Feature, Tech Girls Canada.

What We Really Mean When We Text 150 Identical Emoji in a Row, Motherboard, Vice Media.

An interview with Helen Papagiannis, Augmented Reality Specialist, The Blueprint.

Wearable Tech 2015 Top Influencer

Hungry for more AR, VR, and Wearable Tech in 2015 and beyond? Head over to 40ideas.com and sign up for updates on my upcoming book.

As always, let’s continue the convo and chat on Twitter: I’m @ARstories. Happy 2015, friends!


Will I be seeing you soon? I need your help designing the future of AR.

talks

Very excited by the next couple of months of talks ahead on the future of Augmented Reality! This is our future to design and I hope you’ll be there to be part of the conversation.

December 9, 2014 Toronto, Ontario, Canada: Girls in Tech Toronto

December 4, 2014 Toronto, Ontario, Canada: PechaKucha IIDEX Canada (Canada’s National Design + Architecture Expo & Conference)

November 19-21, 2014 Visby, Sweden: Augmented Reality and Storytelling, Keynote address and workshop

November 13, 2014 Toronto, Ontario, Canada: Future Innovation Technology Creativity (FITC) Wearables, Co-Emcee

October 22-24, 2014 Halifax, Nova Scotia, Canada: COLLIDE Creative Technology Conference

September 8-9, 2014 Calgary, Alberta, Canada: CAMP Festival – Creative Technology, Art and Design

August 27-29, 2014 London, UK: Science and Information Conference (SAI), Keynote address

…And on the topic of designing the future of Augmented Reality, I’ll be making some announcements about my upcoming book “The 40 Ideas That Will Change Reality” very soon! Thank you dear followers for your continued support and interest in my work! I sincerely hope to see and meet you at one of my upcoming talks soon.

Very best wishes,
Helen
@ARstories


How to Leave Your Laptop (at Starbucks) While You Pee: Invoked Computing

Experienced this dilemma? Mark Wilson (@ctrlzee), Senior Writer at Co.Design, tweeted yesterday, “If someone designs a solution to the leave your laptop with a stranger while you pee at starbucks problem, I promise to write about it.” Augmented Reality (AR) and Invoked Computing may just have the solution.


A research group at the University of Tokyo has developed an AR concept called Invoked Computing, which can turn everyday objects into communication devices. By making a gesture to invoke the device you wish to use, you can activate any ordinary object to suit your communication needs. The computer figures out what you want to do and grants the selected object the properties of the tool you wish to use. A proof of concept (see video) has been created for a pizza box that functions as a laptop computer and a banana that serves as a telephone.
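The gesture-to-function mapping at the heart of Invoked Computing can be caricatured in a few lines. This is a toy sketch of the idea, not the Tokyo group’s actual system (their prototype relies on projection mapping and directed audio); the gesture names and mapping are my own illustrative inventions.

```python
# A toy sketch of Invoked Computing: a gesture selects a function, and
# the system grants whatever ordinary object is at hand the properties
# of that tool.
GESTURE_TO_FUNCTION = {
    "open-like-a-laptop": "laptop",   # e.g. a pizza box becomes screen + keyboard
    "hold-to-ear": "telephone",       # e.g. a banana becomes a handset
}

def invoke(gesture: str, object_at_hand: str) -> str:
    function = GESTURE_TO_FUNCTION.get(gesture)
    if function is None:
        # No recognized invocation: the object keeps its ordinary role.
        return f"{object_at_hand} stays an ordinary {object_at_hand}"
    # In the real prototype this step is projection mapping plus audio;
    # here we simply record the object's augmented role.
    return f"{object_at_hand} now acts as a {function}"

print(invoke("hold-to-ear", "banana"))
```

What makes the real system hard is everything this sketch elides: recognizing the gesture, tracking the object, and rendering the tool convincingly onto it.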

Invoked Computing presents a scenario where new functions are layered atop ordinary objects that do not normally possess those traits. It marks the beginning of a new era of responsive environments that are on-demand, context-dependent, and needs-driven. Wired writer Bruce Sterling comments on how Invoked Computing affords possibilities for sustainability and no material footprint, because you can invoke and access everything.

In my recent talk at Augmented World Expo (AWE) 2014 in Silicon Valley, following Robert Scoble‘s keynote on “The Age Of Context”, I discussed how, as both a practitioner and a PhD researcher, I’ve watched AR evolve over the past 9 years. I suggested adding two new words to the AR lexicon: overlay and entryway to describe the two distinct waves in AR I’ve observed.

Overlay is exactly as it sounds, and defines the first wave of AR as we’ve grown to know it: an overlay of digital content atop the real world in real time. We are now entering the second wave of AR, entryway, where the definition of AR is expanding to include things like wearables, big data, artificial intelligence, machine learning, and social media. This second wave represents a more immersive and interactive experience rooted in contextual design. Invoked Computing is a prime example, as it combines the overlay properties we’ve seen in the first wave of AR with an on-demand experience personalized to the end user.

So, go ahead and pee; that laptop will just shift back into a pizza box when you no longer need it.

Invoked Computing is one of The 40 Ideas That Will Change Reality (the title of my upcoming book).

Let’s continue the conversation. Find me on Twitter, I’m @ARstories.

Forget AR Dinosaurs, MIT Startup Wants to Bring YOU Back from the Dead

Augmented Reality pioneer Ronald Azuma ends his seminal 1997 essay A Survey of Augmented Reality with the prediction: “Within another 25 years, we should be able to wear a pair of AR glasses outdoors to see and interact with photorealistic dinosaurs eating a tree in our backyard.” Although his prediction points to 2022, still a few years away, AR has advanced more quickly than any of us could have imagined. With the rise of wearables and devices like Meta’s SpaceGlasses, we’re getting closer to a true AR glasses experience, and we WILL get there very soon.

We’ve had AR dinosaurs already appear just about everywhere — apparently a sure-fire source of go-to content. ‘What should we make with AR? Duh, a dinosaur!’.


Image Source: The Advertiser

Dinosaurs, shminosaurs.

How about instead interacting with a realistic virtual you, long dead and resurrected in the backyard? Now that might startle the neighbours.


Image: Screenshot from Eterni.me website

MIT startup Eterni.me wants to bring you back from the dead by creating a virtual avatar that acts “just like you”:

“It generates a virtual YOU, an avatar that emulates your personality and can interact with, and offer information and advice to your family and friends after you pass away. It’s like a Skype chat from the past.”

Eterni.me bears an eerie resemblance to the Channel 4 television series Black Mirror, specifically Series 2, Episode 1, “Be Right Back,” in which we watch widowed Martha engage with the latest technology to communicate with her recently deceased husband, Ash. Of course, it’s not actually Ash, but a simulation powered by an Artificial Intelligence (A.I.) program that gathers information about him through social media profiles and past online communications such as emails. Martha begins by chatting with virtual Ash and is later able to speak with him on the phone after uploading video files of him from which the A.I. learns his voice. Eterni.me hopes to immortalize you in a similar fashion by collecting “almost everything you create during your lifetime and processes this huge amount of information using complex A.I. Algorithms.”


Images: Black Mirror

But who will curate this mass of information that is “almost everything you create during your lifetime”? In an article on Eterni.me in Fast Company, Adele Peters writes, “While the service promises to keep everything you do online so it’s never forgotten, it’s not clear that most people would want all of that information to live forever.” Commenting on how our current generation documents “every meal on Instagram and every thought on Twitter”, Peters asks, “What do we want to happen to that information when we’re gone?”

Will we have avatar curators?

This sentiment echoes director Omar Naim’s 2004 film The Final Cut, starring Robin Williams. Williams plays a “cutter”, someone who has the final edit over people’s recorded histories. An embedded chip records all of your experiences over the course of your life; Williams’s job is to pore through the stored memories and produce a one-minute video of highlights.


Image: The Final Cut (2004)

Will Eterni.me’s A.I. Algorithm be intelligent enough to do this and distinguish between your mundane and momentous experiences?

In Black Mirror, Martha ultimately tells simulated Ash, “You’re just a few ripples of you. There’s no history to you. You’re just a performance of stuff that he performed without thinking and it’s not enough.” Will these simulated augmentations of us be “enough”?

Marius Ursache, Eterni.me’s founder says, “In order for this to be accurate, collecting the information is not enough–people will need to interact with the avatar periodically, to help it make sense of the information, and to fine-tune it, to make it more accurate.”

This post expands on a recent article I wrote on Spike Jonze’s film Her, where I discuss the film from an AR perspective. Her introduces us to Samantha, the world’s first intelligent operating system, and offers a glimpse of our soon-to-be-augmented life, when our devices come to learn and grow with us and, in the case of Eterni.me, become us. I discuss how our smart devices, like Samantha, will come to act on our behalf. They will know us very well, learning our behaviours, our likes and dislikes, our family and friends, even our vital statistics. The next wave of AR combines elements like A.I., machine learning, sensors, and data, all to tell the unique story of YOU. With Eterni.me, we may see this story of you continue after you’re long gone.


Image: Spike Jonze’s film Her (2013)

Gartner claims that by 2017 your smartphone will be smarter than you. Consumers will gradually build confidence in outsourcing menial tasks to their smartphones, growing accustomed to apps and services taking control of other aspects of their lives. Gartner calls this the era of cognizant computing and identifies four stages: Sync Me, See Me, Know Me, Be Me. ‘Sync Me’ and ‘See Me’ are occurring now, with ‘Know Me’ and ‘Be Me’ just ahead, as we see Samantha perform in Her.

‘Sync Me’ stores copies of your digital assets, kept in sync across all contexts and endpoints. This data storage, an archive of an ‘online you’, will be central to Eterni.me’s creation of your virtual avatar. ‘See Me’ knows where you are currently and where you have been, in both the real world and on the Internet, and understands your mood and context to best provide services. If your mood and context can be documented and later accessed to know how you were feeling in a particular location, this will dramatically affect how the A.I. system curates your memories. ‘Know Me’ understands what you need and want and proactively presents it to you, with ‘Be Me’ as the final step, where the smart device acts on your behalf based on learning. Being able to document and access your personal needs and wants will paint a clearer picture of the story of you and who you were. The true final step of ‘Be Me’ will be put to the test once you are six feet under, which raises the question: will we become smarter when we die?

Will you register for Eterni.me?

Let’s continue the conversation on Twitter: I’m @ARstories. And yes I’m still alive.

*Update: January 23, 2015:

Yep, it’s 2015, I’m still *still* alive, and no, this isn’t a bot writing this. However, it could be. You could be receiving a beautiful hand-written note from (A.I.) me right now, from the afterlife. Except I didn’t write it; a bot named BOND did, using my penmanship.

More in my upcoming book on the future of reality here: www.40ideas.com

Augmented Reality, “Her”, and the Story of You

Her pixel art by QuickHoney

Her is a story about people-centric technology. Spike Jonze shows us a near future where it’s all about you. This is our new Augmented Reality (AR), and it’s not science fiction.

I’ve been working with AR as a PhD researcher and designer for the past decade. The second wave of AR will surpass the current gimmickry and extend our human capacities to better understand, engage with, and experience our world in new ways. It will be human-centered and help make our lives better. Driven by the one thing that is central and unique to AR – context – our devices will be highly cognizant of our constantly changing environments, continually deciphering, translating, analyzing, and navigating to anticipate our specific needs, predicting and delivering personalized solutions with highly relevant content and experiences. Our smart devices will act on our behalf. This next wave of AR is adaptive; it is live and always on, working quietly in the background and presenting itself when necessary, with the user forever at the center. It works for you, and you alone. It knows you very well: your behaviours, your likes and dislikes, your family and friends, even your vital statistics. The next wave of AR combines elements like Artificial Intelligence (A.I.), machine learning, sensors, calm computing, and data, all to tell the unique story of you.

Meet Samantha, the world’s first intelligent operating system. Samantha is not real yet, only imagined in Jonze’s film Her; however, she gives us a glimpse of our soon to be augmented life when our devices come to learn and grow with us. Dr. Genevieve Bell, Director of Interaction and Experience Research at Intel, describes a world of computing where we enter a much more reciprocal relationship with technology where it begins to look after us, anticipating our needs, and doing things on our behalf. Dr. Bell’s predictions are echoed by Carolina Milanesi, Gartner’s Research Vice President. Milanesi states that by 2017, your smartphone will be smarter than you. “If there is heavy traffic, it will wake you up early for a meeting with your boss, or simply send an apology if it is a meeting with your colleague. The smartphone will gather contextual information from its calendar, its sensors, the user’s location and personal data.” Gartner’s research claims this will work with initial services being performed “automatically” to assist generally with menial tasks that are significantly time consuming such as time-bound events, like calendaring, or responding to mundane email messages. A gradual confidence will be built in the outsourcing of menial tasks to the smartphone with an expectation that consumers will become more accustomed to smartphone apps and services taking control of other aspects of their lives.

Images from Intel’s video interview with Dr. Genevieve Bell: What Will Personal Computers Be Like in 2020?

Gartner calls this the era of cognizant computing and identifies the four stages as: Sync Me, See Me, Know Me, Be Me. ‘Sync Me’ and ‘See Me’ are currently occurring, with ‘Know Me’ and ‘Be Me’ just ahead, as we see Samantha perform. ‘Sync Me’ stores copies of your digital assets, which are kept in sync across all contexts and endpoints. ‘See Me’ knows where you are currently and where you have been, in both the real world and on the Internet, and understands your mood and context to best provide services. ‘Know Me’ understands what you need and want and proactively presents it to you, with ‘Be Me’ as the final step, where the smart device acts on your behalf based on learning. Samantha comes to know Theodore very well; with access to all of his emails, files, and other personal information, her tasks range from managing his calendar to gathering some of the love letters he ghostwrites and sending them to a publisher, acting on his behalf.
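Gartner’s four stages read as an ordered progression; a small encoding (my own hypothetical sketch, not anything Gartner publishes) makes the ordering, and where autonomy kicks in, explicit:

```python
# A hypothetical encoding of Gartner's four cognizant-computing stages,
# ordered from syncing data to acting on the user's behalf.
from enum import IntEnum

class CognizantStage(IntEnum):
    SYNC_ME = 1   # store and sync copies of your digital assets
    SEE_ME = 2    # know your location, history, mood, and context
    KNOW_ME = 3   # proactively understand your needs and wants
    BE_ME = 4     # act on your behalf based on what it has learned

def can_act_autonomously(stage: CognizantStage) -> bool:
    # Only at BE_ME does the device act for you, as Samantha does for Theodore.
    return stage >= CognizantStage.BE_ME
```

The ordering matters: each stage depends on the data the previous one accumulates, which is why ‘Be Me’ cannot arrive before ‘Sync Me’ through ‘Know Me’ are in place.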

Milanesi states, “Phones will become our secret digital agent, but only if we are willing to provide the information they require.” Privacy issues will certainly come into play, and a user’s level of comfort in sharing information. Dr. Bell observes that we will go beyond “an interaction” with technology to entering a trusting “relationship” with our devices. She reflects that a great deal of work goes into “getting goodness” out of our computing technology today and that we “have to tell it a tremendous amount.” She continues that in 10 years from now, our devices will know us in a very different way by being intuitive about who we are.

Still from Spike Jonze’s film Her

The world is filled with AR markers, no longer clearly distinguishable as black-and-white glyphs or QR code triggers; the world itself and everything in it is now one giant trackable: people, faces, emotions, voices, eye movement, gesture, heart rate, and more. The second wave of AR presents a brave new digital frontier, where the objects in our world are shape-shifting, invoked, and on-demand. This era will bring new interaction design and user experiences in AR, moving toward natural user interfaces with heightened immediacy; we will be in the presence of the ‘thing’, more deeply immersed, yet with both feet rooted in our physical reality. Our devices will not only get smaller, faster, closer to, and perhaps even implanted inside our bodies; they will be smarter in how they connect with and speak to each other and to multiple sensors, presenting a multi-modal AR experience across all devices.

Samantha is just this. She is a universal operating system that seamlessly and intelligently connects everything in her user Theodore’s world to help him be more human.

In a telephone conversation, Intel’s futurist Brian David Johnson described to me how, for decades, our relationship with technology has been based on an input-output model of command and control: if commands aren’t communicated correctly, or dare we have an accent, it breaks. Today, we are entering into intelligent relationships with technology. The computer knows you and how you are doing on any particular day and can deliver a personalized experience to increase your productivity. Johnson says this can “help us to be more human” and comments on how Samantha nurses Theodore back to having more human relationships. Johnson states that technology is just a tool: we design our tools and imbue them with our sense of humanity and our values. We have the ability to design our machines to take care of the people we love, allowing us to extend our humanity. He calls this designing “our better angels”. Johnson says the question we need to ask is, “What are we optimizing for?” The answer needs to be to make people’s lives better, and I wholeheartedly agree.

My personal hopes for the new AR are that by entering into this more intelligent relationship with technology, we are freed to get back to human relationships and to doing what we love in the real world with real people, without our heads buried in screens. There is a whole beautiful tactile reality out there that AR can help us to explore and ‘see’ better, engaging with each other in more human ways. Get ready for a smarter, more human, and augmented you.

Let’s continue the conversation on Twitter: I’m @ARstories.

READ: “Augmented Human: How Technology is Shaping the New Reality.”