How Spatial Audio Can Change Everything For The Blind


Apple hosted their annual developer conference, WWDC, this past week (June 22, 2020), and it was a much different experience than we are used to. We’ve written quite a bit about rumored AR glasses coming from Apple, possibly two different types, including a see-through option that might challenge the established optics industry and is rumored to help those who are partially sighted. “That would change my life,” says Bradley Steinbach, a partially sighted graduate student at Chapman University Film School who focuses on audio production for movies.

Like everything this year, WWDC was held entirely online. No audience, no interacting directly with new hardware and software, just video presentations, virtual one-on-one meetings with developers, and no after-parties. This created an interesting side effect. Because the conference was online, it was open to anyone with a developer account, an internet connection, and a video-enabled device, reaching a wider range of demographics than ever before. Everyone had the ability to attend in real time. Mobility and navigation were not an issue; participants could enjoy every presentation without leaving their couch. The visually impaired did not have to worry about the difficulty, or the significant expense, of flying out to Cupertino for a week. Apple’s walled garden was finally opened up to all.

It’s been rumored for several years, from different leaked sources, that Apple is working on some sort of AR glasses. With the addition of a LiDAR sensor to the new iPad Pro models, scanning a three-dimensional environment with pinpoint accuracy is closer than ever. Nothing has been confirmed by Apple regarding their future goals for AR development. Or has it…

Apple has added virtualized audio playback to their devices, simulating a Dolby Atmos experience through the built-in speakers of the iPhone, iPad, and MacBook. On Monday, Apple announced that Spatial Audio will be coming via a firmware update to the AirPods Pro, allowing surround sound and Dolby Atmos mixes to play back through the wireless earbuds. The partnership between software and hardware is at the core of Apple’s design philosophy, and Apple has a habit of pushing that partnership further in ways that are often surprising.

With the addition of Spatial Audio, the AirPods Pro and the iOS device they are connected to each track their own movement and reconcile it against the other, so the system knows how the listener’s head is oriented relative to the screen. This brings a possible leap forward for AR… just not in a way people have seen before.

By adding Spatial Audio to the AirPods Pro, Apple has taken a step towards a more immersive AR environment. For games such as Niantic’s Pokémon Go, sounds can be placed in the player’s environment to indicate where a Pokémon may be hiding. Taken a step further, the same technology could be used to place a navigation voice at a location within three-dimensional space. Utilizing all of the available sensor data in a user’s iPhone, such as Wi-Fi, Bluetooth, Ultra Wideband, GPS, and barometric readings, Apple could place the voice at the destination when you are looking for a store. If you can’t read the gate numbers at the airport, the voice could be placed at the gate itself, leading you toward it. This change could be a vital push forward for a person with vision loss to gain independence and feel comfortable traveling unaccompanied.
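To make the idea concrete, here is a rough sketch of how a developer could already anchor a voice prompt at a point in space using Apple’s AVAudioEngine and its AVAudioEnvironmentNode. It is only an illustration, not Apple’s navigation implementation: the file name “gate_cue.caf” and the chosen coordinates are hypothetical stand-ins for whatever a real navigation app would compute from its sensor data.

```swift
import AVFoundation

// A minimal sketch (not Apple's actual navigation feature): render a
// pre-recorded voice prompt at a position around the listener using
// AVAudioEnvironmentNode. "gate_cue.caf" is a hypothetical mono audio file.

let engine = AVAudioEngine()
let environment = AVAudioEnvironmentNode()
let player = AVAudioPlayerNode()

engine.attach(environment)
engine.attach(player)

do {
    // Spatialization only applies to mono sources.
    let cue = try AVAudioFile(forReading: URL(fileURLWithPath: "gate_cue.caf"))
    engine.connect(player, to: environment, format: cue.processingFormat)
    engine.connect(environment, to: engine.mainMixerNode, format: nil)

    // Listener at the origin; place the voice roughly 10 m ahead and
    // 2 m to the right, i.e. where the gate sits relative to the user.
    environment.listenerPosition = AVAudio3DPoint(x: 0, y: 0, z: 0)
    player.position = AVAudio3DPoint(x: 2, y: 0, z: -10)
    player.renderingAlgorithm = .HRTFHQ   // binaural rendering for headphones

    try engine.start()
    player.scheduleFile(cue, at: nil, completionHandler: nil)
    player.play()
} catch {
    print("Audio setup failed: \(error)")
}
```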

Spatial audio is especially important for the blind and partially sighted, and Apple is putting the pieces in place for Spatial Audio to become a tool for those with low vision to navigate their world. Imagine for a moment that you rely solely on your hearing to navigate through a crowded mall. If you can’t see your surroundings, how do you know where you are? With spatial audio-enabled AirPods, the computer in your pocket finally knows which way you are looking, so apps will be able to guide you by the direction and proximity of a sound.
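The head-tracking half of that equation is already exposed to developers through Core Motion’s CMHeadphoneMotionManager, which streams the orientation of supported headphones such as the AirPods Pro. The snippet below is a minimal sketch of reading that orientation; how an app would combine it with GPS, Ultra Wideband, or map data to steer its audio cues is an assumption left to the app’s designer.

```swift
import CoreMotion

// A sketch of the head-tracking side, assuming supported headphones
// (such as AirPods Pro) are connected. CMHeadphoneMotionManager reports
// the wearer's head orientation as the head moves.

let headTracker = CMHeadphoneMotionManager()

if headTracker.isDeviceMotionAvailable {
    headTracker.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion = motion else { return }
        // Yaw: rotation about the vertical axis, in radians,
        // i.e. which way the head is turned.
        let headingOffset = motion.attitude.yaw
        print("Head yaw: \(headingOffset) rad")
        // A navigation app could rotate its virtual sound sources by
        // -headingOffset so the voice stays anchored to the destination.
    }
}
```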

“Here’s what it’s like to be visually impaired,” says Steinbach. “Sit on a park bench.  Close your eyes.  Listen to the kids on the playground.  The joggers as they pass by.  The athletes playing basketball at the court on the other side of the park.  Your ears are tuned to give a sense of direction and distance in a three-dimensional environment. In the mall it’s easy to find the food court by smell.  But how do you find the jeweler?  How do you find the tech store?  How do you find the little knick-knack store? If the navigation voice was enabled to utilize the encoding of Spatial Audio, it could give directions in a detailed, complex environment.”


This story was written in collaboration with Bradley Steinbach.
