AR and VR: Our Deep Wish to Make the Virtual Real

When we close our eyes at night we enter a virtual dream world. We can fly, see loved ones who’ve passed, and defy the limits of physical reality. Time, space, and our bodies are different in our dreams. Anything is possible and the rules of the physical waking world no longer apply. Our imagination reigns supreme here. In that moment, it is all real.

As humans, I believe we have a deep-seated desire to push the limits of physical reality to be able to inhabit these dreamscapes. Virtual Reality and Augmented Reality bring us closer to this dream.

We are explorers, we are inventors, we are storytellers, we are world builders. We have an innate curiosity to travel to the edge of our world, beyond our horizons, to seek and create new vistas. The power of the virtual fused with reality can help satisfy this wish. We can now step inside of our imagination and welcome others into the deep recesses of our dream worlds to share that reality.

May we imagine and build an awe-inspiring reality together.

I hope to see you at one of my talks, and I would also love for you to sign up for updates on my upcoming book. Let’s continue the conversation: I’m @ARstories on Twitter.

Upcoming Speaking Engagements

Augmented World Expo (AWE) 2015, June 8-10, Silicon Valley, CA.
Main stage talk: “Advancing AR for Humanity.”

Solid Conference 2015, June 23-25, San Francisco, CA.
Main stage talk: “Reality Has Changed.” (Save 20% off registration at this link with my code: AFF20)

Cannes Lions: International Festival of Creativity, June 25-26, Cannes, France.
Main stage panel, Innovation Lions: “Creativity Augmented.” Helen Papagiannis, Chris Milk, and Ted Schilowitz.

Augmented Reality Experience Design & Storytelling: Cues from Japan.

Japan is known as a mecca of inspiration for designers. Michael Peng, co-founder and co-managing director of IDEO Tokyo, recently identified several places in Japan that highlight experiences that “wow” us as designers, innovators, and human beings.

As we imagine, create, and define the future of Augmented Reality, Peng’s examples offer excellent design cues we can draw inspiration from and apply to experience design in AR.

I’ve been working with AR for the past 10 years as a designer and researcher, evangelizing this new medium and focusing on storytelling in AR (my Twitter handle is @ARstories). Here are two of my takeaways from Peng’s list, with some ideas on guiding the future of experiences and storytelling in AR.

(*If you’re interested in the future of AR, and hungry for more, this post is just a taste; read more in my upcoming book.)

1. AR and Embedded (Micro) Stories in Things
(Peng lists this as, “1. Pass the Baton – Where Stories are Currency.”)

What does storytelling look like in the augmented Internet of Things and Humans #IoTH?

Appending the word “Humans” to the term #IoT is credited to Tim O’Reilly. He stresses the complex system of interaction between humans and things, and suggests, “This is a powerful way to think about the Internet of Things because it focuses the mind on the human experience of it, not just the things themselves.”

(Photo by Michael Peng)

Peng highlighted the second-hand shop “Pass the Baton” in his list, and noted how the objects for sale are carefully curated stories: a photo of the previous owner and a short anecdote are included on the back of each price tag.

Peng recalled his first visit to the store and commented on how, after reading the description on the back of the price tag of a bracelet (‘I bought this when I went shopping with my mom 5 years ago. It was a special day.’), he felt an “instant connection” that increased the object’s value.

This approach humanizes the object for sale by embedding a micro story in it and emphasizes the human experience, which becomes the focus: a collection of human stories, seemingly ordinary objects made extraordinary by the tales they carry.

Peng wrote:

I left Pass-the-Baton with a full heart and a head full of ideas for how to increase the value of products and experiences through storytelling. I wanted to find ways to experiment with the notion of inherent value by applying new layers of storytelling to everyday experiences and systems.

AR makes this possible.

AR gives us a new way to think about the stories embedded in objects. AR can help make the invisible visible and breathe contextual information into objects. Further, anthropomorphism can be applied as a storytelling device to help humanize technology and our interactions with objects. We have the ability to author an augmented Internet of Things AND Humans. What stories will we tell?

2. AR and Macro Storytelling as a Way to Offer Choices
(Peng lists this as, “4. D47 – Where Provenance is Celebrated.”)

Peng described how D47, a restaurant, museum, and design travel store, showcases the best of each region in Japan. For instance, restaurant patrons are asked to choose where their food comes from rather than selecting the things they would like to eat.

(Photo by Michael Peng)

Offering a macro choice, in this case organized by region, is a way to engage the user in a larger story, providing a wider field of view in content, not just in the technology (where we also want a wider field of view).

Like the second-hand store above, which presents a back story for an object, here the bigger back story is the starting point for curating and organizing content and experiences. It allows users to explore experiences through a macro-level entryway, navigating to content they might not otherwise encounter if they started at the micro level. It provides another way to order and navigate through stories, one we can apply to AR experiences.

Peng wrote that visiting the store encouraged him to think of ways to better celebrate the history and origin of the things we create. There is certainly a trend in retail and restaurants toward greater transparency about where what we consume comes from: a growing demand for a more immediate “history” and “origin” of things to inform our choices.

What kinds of new experiences can we design in AR if we begin storytelling at the macro level and offer users choices from that top level of navigation?

One of the things that tremendously excites me about AR is that it is an entirely new medium and, for now, completely uncharted terrain with no conventions. There is an incredible opportunity here to help define a new era of human-computer interaction. The focus to date has been on the technology, not on storytelling. We need to shift gears immediately if we are going to create something lasting and meaningful beyond a fad. We don’t want to end up with a wasteland of empty hardware paperweights devoid of experiences. In tandem with the development of the technology, we need excellent storytellers, designers, and artists of all types to imagine and build these future experiences. I hope this is you, because we really need you, now.

Is it? Let’s continue the conversation on Twitter. I’m @ARstories.

*UPDATE: Purchase “Augmented Human” from:

Amazon (USA)
Amazon (Canada)
Indigo (Canada)
Barnes and Noble (USA)
O’Reilly (Digital, International)

Augmented Reality and Virtual Reality: what’s the difference?

AR and VR are often confused with each other, and used interchangeably in the media, but they are significantly different. Let’s break it down:

Augmented Reality (AR): the real, physical world. The user remains in their physical space, with additional digital data layered on top of the real world. E.g., Meta Spaceglasses.

Virtual Reality (VR): a computer-generated environment, an artificial world. The user is completely closed off from the physical world, fully immersed in a computer-generated simulation. E.g., Oculus Rift.

I often begin my presentations distinguishing between the two. Here’s one of my talks from TEDx in 2010.

The definition of AR is expanding to include things like wearable computing, new types of sensors, artificial intelligence, and machine learning. I call this the second wave of AR. Read about it here in my upcoming book.

And if you’re wondering what the heck the HoloLens is that Microsoft announced, read about it here in my latest article. *Hint: it’s not VR.

Find me on Twitter: I’m @ARstories.

Reality Has Changed. Microsoft’s HoloLens and what you need to know about the next wave of Augmented Reality

All hands on (holo)deck! 2015 is ramping up to be the year of Augmented Reality (AR). Microsoft threw their hat into the ring today, announcing HoloLens, their AR headset led by Kinect inventor Alex Kipman. Remember “Fortaleza,” the AR glasses we had a peek at in the leaked Xbox 720 document in 2012? Say hello to the HoloLens prototype in 2015.

The community has been quick to point out the similarities to existing AR eyewear like Meta’s SpaceGlasses, but how is HoloLens different?

HoloLens appears to use a Virtual Retinal Display (VRD).

So, what’s VRD, you ask?

VRD mirrors how the human eye works. The back of the eye receives light and converts it into signals for your brain. Images are projected directly onto the retina, effectively using the back of the eye as a screen.

The result is a more true-to-life image than the ‘ghostly transparent superimposed representation’ (as Gizmodo reporter Sean Hollister describes) we’ve seen with AR eyewear before. Hollister details his experience of Microsoft’s prototype as “standing in a room filled with objects. Posters covering the walls. And yet somehow—without blocking my vision—the HoloLens was making those objects almost totally invisible.” He states, “Some of the very shiniest things in the room—the silver handle of a pitcher, if I recall correctly—managed to reflect enough light into my eyes to penetrate the illusion.”

In an exclusive interview with Wired’s Jessi Hempel, HoloLens inventor Kipman hints at VRD with his description of how HoloLens works by tricking the human brain into seeing light as matter.

“Ultimately, you know, you perceive the world because of light,” Kipman explains. “If I could magically turn the debugger on, we’d see photons bouncing throughout this world. Eventually they hit the back of your eyes, and through that, you reason about what the world is. You essentially hallucinate the world, or you see what your mind wants you to see.”

I personally can’t wait to see what my mind wants me to see, particularly in this second wave of AR. For me, AR is about extending human capacity and the human imagination, not supplanting it. I’ve been working with AR for a decade now and it’s tremendously exciting to see this all quickly becoming a reality. We have a whole new medium waiting to be defined.

Microsoft’s HoloLens is currently a prototype with no price or release date announced, and we’ve yet to see what Magic Leap will unleash into the world, but I can promise you this: AR is coming in hot and fast. We WILL experience the world in unprecedented ways. Reality has changed. Read more about the next wave of AR in my upcoming book. And as always, let’s continue the conversation on Twitter: I’m @ARstories.

Latest AR, VR, Wearable and Digital Tech articles & interviews


Will 2015 Be The Year of Wearable Technology? The Toronto Star.

Will Augmented Reality Make Us Masters of the Information Age? iQ by Intel.

Portraits of Strength Feature, Tech Girls Canada.

What We Really Mean When We Text 150 Identical Emoji in a Row, Motherboard, Vice Media.

An interview with Helen Papagiannis, Augmented Reality Specialist, The Blueprint.

Wearable Tech 2015 Top Influencer

Hungry for more AR, VR, and Wearable Tech in 2015 and beyond? Sign up for updates on my upcoming book.

As always, let’s continue the convo and chat on Twitter: I’m @ARstories. Happy 2015, friends!


Will I be seeing you soon? I need your help designing the future of AR.


Very excited by the next couple of months of talks ahead on the future of Augmented Reality! This is our future to design and I hope you’ll be there to be part of the conversation.

December 9, 2014 Toronto, Ontario, Canada: Girls in Tech Toronto

December 4, 2014 Toronto, Ontario, Canada: PechaKucha IIDEX Canada (Canada’s National Design + Architecture Expo & Conference)

November 19-21, 2014 Visby, Sweden: Augmented Reality and Storytelling, Keynote address and workshop

November 13, 2014 Toronto, Ontario, Canada: Future Innovation Technology Creativity (FITC) Wearables, Co-Emcee

October 22-24, 2014 Halifax, Nova Scotia, Canada: COLLIDE Creative Technology Conference

September 8-9, 2014 Calgary, Alberta, Canada: CAMP Festival – Creative Technology, Art and Design

August 27-29, 2014 London, UK: Science and Information Conference (SAI), Keynote address

…And on the topic of designing the future of Augmented Reality, I’ll be making some announcements about my upcoming book “The 40 Ideas That Will Change Reality” very soon! Thank you dear followers for your continued support and interest in my work! I sincerely hope to see and meet you at one of my upcoming talks soon.

Very best wishes,


How to Leave Your Laptop (at Starbucks) While You Pee: Invoked Computing

Experienced this dilemma? Mark Wilson (@ctrlzee), Senior Writer at Co.Design, tweeted yesterday, “If someone designs a solution to the leave your laptop with a stranger while you pee at starbucks problem, I promise to write about it.” Augmented Reality (AR) and Invoked Computing may just have the solution.


A research group at the University of Tokyo has developed a concept for AR called Invoked Computing, which can turn everyday objects into communication devices. With a gesture, you invoke the device you wish to use, activating any ordinary object to suit your communication needs. The computer figures out what you want to do and grants the selected object the properties of the tool you wish to utilize. A proof of concept (see video) has been created for a pizza box that functions as a laptop computer, and a banana that serves as a telephone.
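The gesture-to-capability mapping above can be sketched in a few lines. This is only a toy illustration of the dispatch idea, not the University of Tokyo system (which uses cameras and projectors); all gesture, role, and object names here are hypothetical.

```python
# Toy sketch of Invoked Computing's dispatch idea: a recognized gesture
# selects a device role, and that role's capabilities are "projected"
# onto whatever ordinary object the user chose.

# Gesture -> device role the user wants to invoke (hypothetical names)
GESTURE_TO_ROLE = {
    "hand_to_ear": "telephone",
    "open_lid": "laptop",
}

# Device role -> capabilities the system grants the object
ROLE_CAPABILITIES = {
    "telephone": ["microphone", "speaker"],
    "laptop": ["screen", "keyboard"],
}

def invoke(gesture: str, object_name: str) -> str:
    """Grant an everyday object the capabilities of the invoked device."""
    role = GESTURE_TO_ROLE.get(gesture)
    if role is None:
        return f"No device invoked; {object_name} stays an ordinary object."
    caps = ", ".join(ROLE_CAPABILITIES[role])
    return f"{object_name} now acts as a {role} ({caps})."

print(invoke("hand_to_ear", "banana"))   # the banana-as-telephone demo
print(invoke("open_lid", "pizza box"))   # the pizza-box-as-laptop demo
```

The point of the sketch is the indirection: the object itself carries no function; the environment assigns one on demand and can withdraw it just as easily.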

Invoked Computing presents a scenario where new functions are layered atop ordinary objects that do not normally possess those traits. Invoked Computing is the beginning of a new era of responsive environments that are on demand, context-dependent, and needs-driven. Wired writer Bruce Sterling comments on how Invoked Computing opens up possibilities for sustainability and a minimal material footprint, because you can invoke and access everything.

In my recent talk at Augmented World Expo (AWE) 2014 in Silicon Valley, following Robert Scoble’s keynote on “The Age of Context”, I discussed how, as both a practitioner and a PhD researcher, I’ve watched AR evolve over the past 9 years. I suggested adding two new words to the AR lexicon, overlay and entryway, to describe the two distinct waves in AR I’ve observed.

Overlay is exactly as it sounds, and defines the first wave of AR as we’ve grown to know it: an overlay of digital content atop the real world in real time. We are now entering the second wave of AR, entryway, where the definition of AR is expanding to include things like wearables, big data, artificial intelligence, machine learning, and social media. This second wave represents a more immersive and interactive experience rooted in contextual design. Invoked Computing is a prime example, as it combines the overlay properties we’ve seen in the first wave of AR with an on-demand experience personalized to the end user.

So, go ahead and pee; that laptop will just shift back into a pizza box when you no longer need it.

Invoked Computing is one of The 40 Ideas That Will Change Reality (the title of my upcoming book).

Let’s continue the conversation. Find me on Twitter, I’m @ARstories.