Apple Glass: Consumer Tech’s Next Big Thing (Part 3 — Experiences)

--

Photo by Jesus Kiteque on Unsplash

Welcome back to Part 3 of my series on Apple Glass. Part 1 elaborates on its aesthetic design, and Part 2 discusses potential features. In Part 3, we explore the new experiences and latent behaviours that could emerge from such a novel consumer product.

Design (Part 1): Click Here to Read

Feature Set (Part 2): Click Here to Read

Experiences: Value in the Short Term, Indispensable in the Long Term

Unless Apple Glass can provide a true upgrade to the way we experience our daily lives, it'll be a tough sell, even at the rumoured (and reasonable) $499. There is no room for gimmicks, but we should take a forward-thinking view of all the features mentioned in my previous Apple Glass articles. A feature that initially looks like a gimmick could easily be a slow-rollout strategy for later generations and for future integrations with new software and products. For example, many regarded the LiDAR sensor added to the iPad Pro's camera system as a gimmick, but the 3D mapping and scanning data it captures presumably informs internal teams at Apple: the Augmented Reality team, the Apple Glass team, and the camera team. It could be entirely strategic while still offering tangible benefits to users.

So when Apple Glass is announced, let's extrapolate to what its features could enable in terms of experiences and latent behaviours within five years, not upon announcement. According to Apple, "Augmented Reality transforms how you work, learn, play, shop and connect with the world around you". I'd like to elaborate on how it can do so with some examples. Here goes:

Perpetually & Contextually Informed

What store are you looking at? Sure, you can tell you're staring at the Starbucks right across the street without any AR. That may be enough for you now in 2020, but the digital era of interconnected devices comes with hyper-awareness. A few years into AR, you'd effortlessly know that the Starbucks you're looking at has a 4.2-star rating, what its exact address is, and whether there's a seat available for you before you even step in. Being perpetually and contextually informed can only reach its full potential once Apple Glass has a camera; the first generation will likely still depend heavily on pointing the iPhone's camera at objects in 3D space.


Action-Relevant HUD

This applies to whatever you're doing at any given moment. If you're working out, Apple Glass can pull data from your Apple Watch to display health metrics like heart rate, running distance and pace, and even a small map driven by the Watch's location. Apple Glass should find an effective way to surface what it thinks may interest you in that moment. This extends beyond health and integrates nicely with Siri Suggestions, which relies on machine learning to suggest people to call, apps to open, reminders to attend to, and more.

Shared Engagement

A huge shortcoming right now is that we're limited to a 6-inch display for all the actions we need, from texting and calling to browsing social media feeds, playing games, and more. Smartphones showed us we can carry a portal to the online world in our pockets, but AR can overlay the digital world onto the real one and let the two interact, right where we can see and engage with them. People could simultaneously look at the same digital object, anchored at the same coordinates, from different points of view, creating new levels of socialization and immersion. Right now, the closest experience we have to that is in one of Apple's WWDC keynotes: the Minecraft Earth AR demo from 2019. Watch it here! That demo still requires a phone or tablet to play, but an AR headset at eye level frees up your hands, allowing intuitive interaction with the game or the task at hand. This translates to several experiences beyond gaming.

Private & Public Thresholds amongst Peers

In an augmented era, we could experience several new thresholds of privacy and blocking. Before I get to that: if you don't know IKEA Place, check it out here to see it in action! IKEA Place is an AR application that lets you view IKEA furniture in your own environment, so you can mix and match without measuring or going to the store. Fast forward a few years to when AR becomes a democratized consumer product, and ask yourself whether some home decorations could someday become purely digital entities: artwork, photos, sculptures, and other primarily decorative accessories. It absolutely could happen! You may then want to set different privacy thresholds for different people who visit your private spaces. For your friends, you may want to display nice photos from family trips. For your younger kids, you may want to exhibit appropriate photos of hilarious candid moments you've shared together. Finally, for your partner and yourself only, you may choose to place some intimate photos in your master bedroom, given the privacy of that room. Each of these scenarios requires privileges in order to be seen, meaning no one could access the digital photos in your bedroom other than your partner. This calls for more intricate privacy thresholds: whitelists, blacklists, family members only, close friends only, limited-time access, and more!
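To make the tiered model concrete, here's a minimal sketch of that kind of visibility check in Python. Everything here is hypothetical (the class, method, and policy names are mine, not any real Apple API); it just illustrates how whitelists, blacklists, and limited-time access could combine:

```python
from datetime import datetime, timedelta

class ARObjectPolicy:
    """Hypothetical visibility policy for a shared AR object."""
    def __init__(self, owner, whitelist=None, blacklist=None, expires_in=None):
        self.owner = owner
        self.whitelist = set(whitelist or [])   # explicitly allowed viewers
        self.blacklist = set(blacklist or [])   # explicitly blocked viewers
        # Optional limited-time access window
        self.expires_at = (datetime.now() + expires_in) if expires_in else None

    def is_visible_to(self, viewer):
        if self.expires_at and datetime.now() > self.expires_at:
            return False                        # time-limited access has lapsed
        if viewer in self.blacklist:
            return False
        if self.whitelist:                      # whitelist set: owner + listed only
            return viewer == self.owner or viewer in self.whitelist
        return True                             # otherwise visible to everyone

# Intimate bedroom photos: visible to the couple only
bedroom_photos = ARObjectPolicy(owner="me", whitelist=["partner"])
print(bedroom_photos.is_visible_to("partner"))  # True
print(bedroom_photos.is_visible_to("friend"))   # False
```

A real system would tie viewers to authenticated identities and spatial anchors, but the core idea is the same: each digital object carries its own access rules.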

Digital Behaviours That Affect the Physical World

Another great use of AR is manipulating physical objects through digital interactions. Apple's new devices are being equipped with its Ultra-Wideband U1 chip, allowing them to understand and relay their exact location and orientation in space relative to one another. In the future, Apple Glass could in theory seamlessly recognize when you're actively gazing at a HomeKit-equipped lightbulb and automatically surface controls you can interact with. We could then use voice or hand gestures to turn the light on or off without even naming it. All the system needs to know is that you're looking at it and focusing; then it's ready to receive your command. This extends beyond HomeKit, and could eventually cover unlocking your car, authenticating transactions by gazing at a terminal, and more.
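That gaze-then-command flow can be sketched with a little geometry: given device positions in a shared coordinate frame (the kind of data UWB ranging could provide), find the device closest to the user's gaze ray. The classes and function below are hypothetical illustrations, not HomeKit or Nearby Interaction APIs:

```python
import math

class SmartDevice:
    """Hypothetical connected device with a known spatial position."""
    def __init__(self, name, position):
        self.name = name
        self.position = position  # (x, y, z) in metres, shared spatial frame
        self.on = False

    def toggle(self):
        self.on = not self.on
        return self.on

def device_in_gaze(devices, gaze_origin, gaze_dir, max_angle=0.1):
    """Return the device nearest the gaze ray, within max_angle radians.

    gaze_dir is assumed to be a unit vector.
    """
    best, best_angle = None, max_angle
    for d in devices:
        to_dev = tuple(p - o for p, o in zip(d.position, gaze_origin))
        dist = math.sqrt(sum(c * c for c in to_dev))
        cos_a = sum(a * b for a, b in zip(to_dev, gaze_dir)) / dist
        angle = math.acos(max(-1.0, min(1.0, cos_a)))
        if angle < best_angle:          # smaller angle = closer to centre of gaze
            best, best_angle = d, angle
    return best

lamp = SmartDevice("desk lamp", (0.0, 0.0, 2.0))
tv = SmartDevice("tv", (3.0, 0.0, 2.0))
# User stands at the origin, looking straight down the z-axis
target = device_in_gaze([lamp, tv], (0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
if target:
    target.toggle()  # "turn it on" needs no device name
```

The point is that once every device knows where it is, "the thing you're looking at" becomes a computable target, and voice or gesture commands no longer need an explicit name.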

Accessibility for the Visually Impaired

Apple takes pride in its accessibility features. One of the most impressive lets people with severe motor impairments navigate their devices like Tony Stark interfacing with Jarvis! Check it out here! Apple Glass could greatly improve the experience of blind and low-vision users. Paired with AirPods, it can use the iPhone's or Apple Watch's location data, its built-in LiDAR and Ultra-Wideband sensors, and, in the future, cameras to recognize the surroundings and relay them via audio. Eventually, Siri may tell a blind person, "Yonge Street is 10 metres ahead, watch out for the sidewalk".

Conclusions for Future Experiences

Managing expectations while hoping for a radical Next Big Thing is a tough balance to strike. When it comes to democratizing AR, companies have to be far more conservative and strategic in their rollout. When the iPhone launched in 2007, it felt like a radical new way of accessing the internet from your pocket, but those were simpler times. For AR to become widely accessible, it doesn't depend solely on Apple: it relies on support from partners, on developers launching apps and experiences, and on the data collected to map the world around us. This is why Apple launched ARKit back in 2017, letting developers start working on creative ways to bring AR into applications. It's also why LiDAR was introduced on the iPad Pro and now the new iPhone: it provides Apple with data that helps its engineers map physical spaces digitally. Introducing the Ultra-Wideband chip before Apple Glass is another strategic move to ensure that once the Glasses launch, the supporting software and hardware are already out there. All this infrastructure needs to be in place before we even get a glimpse of Apple Glass, allowing it to launch as smoothly as possible. Flashy, magical marketing in typical Apple fashion is also expected to wow us.

Cheers, and as always, keep looking forward to the future!



My goal is to inspire you about the future, and enlighten you about the technologies that will get us there.