After spending the last few months exploring what’s currently possible in the world of augmented reality with the latest hardware and software, we decided to pause for a moment to think about the future.
We asked ourselves the following question: “If augmented reality glasses were as light and comfortable as normal glasses, what kinds of applications would we start to see in our daily lives?”
The first ideas that popped into our minds were the following:
- Gimmicks like “Angry Birds but in your living room”.
- Widgets that you can place in the world, such as a weather widget that always sits by your front door so you can glance at it on your way out and know whether to bring an umbrella.
- Invasive advertising like that shown in Keiichi Matsuda’s HYPER-REALITY.
None of that is new. It was only after we introduced another variable that things got interesting. One of us asked: “What if your AR glasses could identify objects and their positions and orientations in the world? What kinds of applications would we see then?”
That’s when we all got excited. With such technology you could look at your bicycle, for example, and allow your AR glasses to summon its “user interface”. That UI would sit between the handlebars, and you could use it to see stats about your rides, or to coordinate activities with your friends.
In other words, with the ability to recognize objects, AR glasses would allow you to install apps onto physical objects, and to summon those apps effortlessly without looking at a screen.
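To make this more concrete, here is a minimal sketch in TypeScript of what that lookup could involve. Every name in it (`Pose`, `DetectedObject`, `ObjectApp`, `appRegistry`, `summonApp`) is hypothetical; the point is simply that a recognizer reports an object's identity and pose, and the system finds whichever app the user installed on that object and anchors its UI there.

```typescript
// Hypothetical types: an object recognizer reports what it sees and where.
interface Pose {
  position: [number, number, number]; // meters, in world coordinates
  orientation: [number, number, number, number]; // quaternion
}

interface DetectedObject {
  objectId: string; // stable identity, e.g. "bicycle-7f3a"
  category: string; // e.g. "bicycle", "film-camera"
  pose: Pose;
}

interface ObjectApp {
  name: string;
  // Renders the app's UI anchored to the object's pose.
  render(pose: Pose): void;
}

// Apps the user has installed, keyed by the physical object they live on.
const appRegistry = new Map<string, ObjectApp>();

// Called when the user looks at an object and asks to summon its UI.
function summonApp(obj: DetectedObject): void {
  const app = appRegistry.get(obj.objectId);
  if (!app) return; // nothing installed on this object
  app.render(obj.pose); // e.g. the bicycle UI sits between the handlebars
}
```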
Once we had hit upon this idea, we decided to explore it by producing concept videos for five different objects. Below you can see those videos, and notes about the principles that they exemplify under this new paradigm.
Pentax Film Camera
In this example, the user summons an app that they installed on a 50-year-old analog camera. The app tells them what type of film is loaded in the camera, how many shots they have left, and when they took their first exposure.
This idea of augmenting pre-digital objects is incredibly exciting to us. It opens up new software ecosystems.
The most important thing to note here is that useful information is being provided in an unobtrusive way, and only when summoned by the user.
SodaStream Sparkling Water Maker
In this example, the AR glasses detect that the user is interacting with a SodaStream machine, and automatically display a pressure widget to make it easier to use, along with a notification about the arrival of a replacement CO2 bottle.
We believe strongly that AR glasses should only display information when the user requests it, but this example is a good exception to that rule. Automatically summoning the SodaStream machine's user interface when the user interacts with it makes the experience seamless and convenient. That behavior should still require an initial permission from the user, though, one that they only have to grant once.
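Here is a hedged sketch of what that one-time permission could look like, continuing the hypothetical types from the earlier snippet. The `autoSummonGrants` store and `askUser` prompt are assumptions, not any real API; the idea is just that auto-summoning is opt-in per object and remembered after the first grant.

```typescript
// Hypothetical per-object permission store, persisted across sessions.
const autoSummonGrants = new Set<string>(); // objectIds the user has approved

// Called when the glasses notice the user physically using an object.
async function onInteraction(obj: DetectedObject): Promise<void> {
  if (!autoSummonGrants.has(obj.objectId)) {
    // First interaction: ask once, then remember the answer.
    const granted = await askUser(
      `Show the ${obj.category} UI automatically when you use it?`
    );
    if (!granted) return;
    autoSummonGrants.add(obj.objectId);
  }
  summonApp(obj); // subsequent interactions summon the UI silently
}

// Placeholder for whatever consent prompt the glasses' OS provides.
declare function askUser(question: string): Promise<boolean>;
```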
Guitar
In this example, the user tunes their guitar using an app that they installed on it.
This concept video was the one that resonated with people the most. Having instant access to useful tools while your whole body is engaged in a real-world activity is a magical thing.
Another aspect of this idea that we found really exciting is that the apps that you can install on a product don’t have to be made by the same company that makes the product. Your guitar might be made by Fender, but if you want to install Russ’ Super Guitar Trainer app on it, go ahead! An open ecosystem of apps like that can only benefit users.
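One way such an open ecosystem could be wired up, again building on the earlier hypothetical sketch: apps declare the object categories they support rather than the manufacturers they partner with. The manifest fields and `installOnObject` helper are invented for illustration.

```typescript
// Hypothetical manifest: a third-party app declares the object
// categories it supports, independent of who manufactured the object.
interface AppManifest {
  name: string;
  developer: string;
  supportedCategories: string[];
}

const guitarTrainerManifest: AppManifest = {
  name: "Russ' Super Guitar Trainer",
  developer: "Russ", // not Fender; any developer can publish
  supportedCategories: ["guitar"], // attaches to any recognized guitar
};

// Installing the app onto a specific physical object the user owns.
function installOnObject(manifest: AppManifest, obj: DetectedObject): void {
  if (!manifest.supportedCategories.includes(obj.category)) {
    throw new Error(`${manifest.name} does not support ${obj.category}s`);
  }
  appRegistry.set(obj.objectId, {
    name: manifest.name,
    render: (pose) => {
      /* draw the trainer UI anchored at the guitar's pose */
    },
  });
}
```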
Allbirds Running Shoes
In this example, the user summons the Allbirds running app after their shoes are identified by their AR glasses. The app shows them their stats and what their friends have been up to, and it tracks their run.
This is what buying an augmented product could look like. Your Allbirds would stop being just shoes. They would become a way to stay connected with your interests and the communities that exist around those interests.
Click & Grow Indoor Smart Garden
In this example, the user summons the user interface of their Click & Grow indoor smart garden. The UI tells them when their plants are ready to be harvested, and it allows them to select the next pods they would like to receive through the mail.
Here we are exploring what an object’s user interface could look like for a more mundane task like managing a subscription service. This is the sort of thing that would typically be done on a website or a mobile app. Integrating it directly onto a physical object is great, but it raises all sorts of questions about how the information should be displayed, and what type of controls should be used to interact with it.
In the Allbirds concept we used pinch-to-select and gaze tracking, while in this one we used virtual buttons backed by hard surfaces. The best controls will vary from object to object.
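As a final sketch, here is one hypothetical way the glasses could pick a control scheme per object. The traits and thresholds are invented; the point is that surface-backed buttons only make sense when a hard surface is within reach, with gaze and pinch as the fallback.

```typescript
// Hypothetical control schemes the glasses could offer.
type ControlScheme = "gaze-and-pinch" | "surface-buttons";

interface ObjectTraits {
  hasReachableFlatSurface: boolean; // can we back buttons with a hard surface?
  typicalViewingDistance: number; // meters
}

// Pick the input modality that best fits the physical object.
function chooseControls(traits: ObjectTraits): ControlScheme {
  // Buttons backed by a hard surface give tactile confirmation,
  // but only work when the object is within arm's reach.
  if (traits.hasReachableFlatSurface && traits.typicalViewingDistance < 0.8) {
    return "surface-buttons"; // e.g. the Click & Grow garden
  }
  return "gaze-and-pinch"; // e.g. the Allbirds running app
}
```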
Conclusions
A lot of people worry that in a world where AR glasses are ubiquitous, users will be constantly bombarded with all kinds of invasive and distracting virtual pop-ups and objects.
We believe that if we stick to the principles of only showing things when the user requests them, and of grounding everything we show in physical objects, then the future will be delightful. Every object you own could have a soul inside it! Some useful, some quirky, some strictly functional, but all of them rich and alive.