This article is the latest in AR Insider’s editorial contributor program. It originally appeared in WIRED and includes commentary from AR Insider’s research arm, ARtillery Intelligence.


Apple’s AR Glasses are Hiding in Plain Sight

by Peter Rubin, WIRED


With all the phone and watch and TV and game and chip and other chip news coming out of Apple’s big event, it was easy to forget the company’s longest-running background process: an augmented-reality wearable. That’s by design. Silicon Valley’s advent calendar clearly marks September as the traditional time for Apple to talk finished hardware, not secretive projects.

But those secretive projects have a weird habit of poking their heads into the light. A slew of features and language discovered recently inside iOS 13 and 13.1 seem to explicitly confirm the very thing Apple executives have steadfastly refused to acknowledge—an honest-to-Jobs AR headset. In fact, taken in conjunction with acquisitions and patent filings the company has made over the past several years, those hidden features have painted the clearest view yet of Apple’s augmented dreams.

Hard to StarBoard

First came StarBoard. At the very beginning of September, a leaked internal build of iOS 13 was found to contain a “readme” file referring to StarBoard, a system that allows developers to view stereo-enabled AR apps on an iPhone. The build also included an app called StarTester to accomplish exactly that. That marked the first explicit mention of stereo apps—i.e., apps that output to two separate displays, like those found in AR/VR headsets—in any Apple material.
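StarBoard itself is a private framework, so its actual interface remains unknown. But the core idea of a stereo app (the same scene rendered twice, from two cameras offset by the distance between the eyes, producing one image per display) can be sketched in ordinary SceneKit. Nothing below is Apple’s headset API; the scene, the interpupillary distance, and the render size are all stand-ins for illustration.

```swift
import SceneKit
import UIKit

// A minimal stand-in for "stereo-enabled" rendering: one scene, two cameras
// separated by an interpupillary distance (IPD), one rendered image per eye.
let scene = SCNScene()
let box = SCNNode(geometry: SCNBox(width: 0.2, height: 0.2, length: 0.2, chamferRadius: 0))
box.position = SCNVector3(0, 0, -1)      // one meter in front of the viewer
scene.rootNode.addChildNode(box)

let ipd: Float = 0.063                   // ~63 mm, a typical adult IPD (assumed value)

func eyeCamera(offsetX: Float) -> SCNNode {
    let node = SCNNode()
    node.camera = SCNCamera()
    node.camera?.fieldOfView = 58        // the figure later reported for "Luck"
    node.position = SCNVector3(offsetX, 0, 0)
    return node
}

let leftEye = eyeCamera(offsetX: -ipd / 2)
let rightEye = eyeCamera(offsetX: ipd / 2)
scene.rootNode.addChildNode(leftEye)
scene.rootNode.addChildNode(rightEye)

// Render each eye offscreen; a headset (or StarTester's split-screen view)
// would present the two images side by side, one per display.
let renderer = SCNRenderer(device: nil, options: nil)
renderer.scene = scene
for eye in [leftEye, rightEye] {
    renderer.pointOfView = eye
    let image = renderer.snapshot(atTime: 0,
                                  with: CGSize(width: 1280, height: 1440),
                                  antialiasingMode: .multisampling4X)
    print("rendered a \(Int(image.size.width))x\(Int(image.size.height)) frame for one eye")
}
```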

Not long after, on the day of the hardware event, Apple released Xcode 11, the newest version of the company’s macOS development environment. Inside that set of tools lurked data files for what appeared to be two different headsets, codenamed Franc and Luck. The same day, iOS developer Steve Troughton-Smith found the StarBoard framework in the official “golden master” of iOS 13; he also pointed out references to “HME,” which many speculated stood for “head-mounted experience.” (HMD, or head-mounted display, is a common term for a VR/AR headset.)

So far, so unprecedented. When Apple first released ARKit in 2017, it was the beginning of a long journey to familiarize developers with augmented reality and get them playing with the possibilities. Yet the company has always been careful to situate AR as a mobile technology: people peeking through iPhones or iPads to shop, play with Legos, or experience public art installations. Finding this kind of data, even hidden deep within OS developer files, marks an uncharacteristic transparency from Apple—as though the company is planning something sooner rather than later.

What that thing might be depends on who you ask. Reports from Bloomberg News and Taiwanese analyst Ming-Chi Kuo have long claimed that Apple would begin production on an AR headset this year for release in 2020—one that acts more like a peripheral than an all-in-one device, relying on the iPhone to handle the processing.

Troughton-Smith came to a similar conclusion after poking through iOS 13. “The picture of Apple’s AR efforts from iOS 13 is very different to what one might expect,” he tweeted. “It points to the headset being a much more passive display accessory for iPhone than a device with an OS of its own. The iPhone seems to do everything; ARKit is the compositor.”

That idea of a passive display accessory got fleshed out late last week, when another developer got StarTester up and running on a beta of iOS 13.1, which officially comes out today.

That developer also found numbers in the iOS framework specifying the fields of view for the two headset codenames: 58 degrees for Luck and 61 degrees for Franc. (A third codename, Garta, seems to refer to a testing mode rather than a specific device.)

All of which matches the idea that Apple is planning a small, lightweight product—one that lives up to the term “wearable” by being more like smart glasses than an unwieldy Microsoft HoloLens. “Fifty-eight degrees doesn’t sound like much compared to an Oculus Rift, but compared to an nreal Light, which is 52 degrees, it’s already pretty competitive,” says JC Kuang, an analyst with AR/VR market intelligence firm VRS. “That’s the exact class of product we need to be looking at when we talk about what the architecture might look like.”
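For a sense of scale, a display with field of view θ shows a slice of the world 2d·tan(θ/2) wide at distance d. Treating the leaked figures as horizontal fields of view (an assumption; vendors often quote diagonals), a few lines of Swift make the comparison concrete:

```swift
import Foundation

// Visible width at distance d for a given field of view:
// width = 2 * d * tan(fov / 2)
func visibleWidth(fovDegrees: Double, atDistance d: Double) -> Double {
    2 * d * tan(fovDegrees * .pi / 180 / 2)
}

for (name, fov) in [("Luck", 58.0), ("Franc", 61.0), ("nreal Light", 52.0)] {
    let w = visibleWidth(fovDegrees: fov, atDistance: 1.0)
    print("\(name): \(fov) degrees spans ~\(String(format: "%.2f", w)) m at 1 m")
}
// Luck: 58.0 degrees spans ~1.11 m at 1 m
// Franc: 61.0 degrees spans ~1.18 m at 1 m
// nreal Light: 52.0 degrees spans ~0.98 m at 1 m
```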

Mike Boland, chief analyst at ARtillery Intelligence, which tracks the augmented-reality market, calls such a product a “notification layer,” and posits it as an introductory device of sorts—one that acts as a bridge between the mobile AR of today and a more powerful headset that could ultimately replace the smartphone. “I’ve always been skeptical of 2020,” he says. “If you look across the industry at the underlying tech, it’s just not ready to build something sleek and light.” However, an intermediary device like the one iOS 13 seems to point to could strike a balance, giving developers the chance to get used to building stereo experiences and develop best practices before needing to fully integrate with the “mirror world.”

A recent patent seems to support the idea as well. “Display System Having Sensors,” which Apple filed in March and was published in July, describes a companion system: a head-mounted device with inward- and outward-facing sensors feeds its inputs to a “controller,” which then “render[s] frames for display by the HMD.” A patent isn’t the same as a plan, obviously, but it’s a hell of a data point.
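Read naively, the division of labor the patent describes looks something like the sketch below: sensing and display live on the headset, while all rendering happens on the phone. Every type and function here is invented for illustration; the filing names no APIs.

```swift
import simd

// Hypothetical shapes for the patent's data flow; none of these names are Apple's.
struct HeadPose { var transform = matrix_identity_float4x4 }   // fused output of the HMD's sensors
struct EyeFrame { var width = 1280, height = 1440 }            // one rendered image per eye

// Headset side: sensing and display only.
func readSensors() -> HeadPose { HeadPose() }                  // stub for inward/outward sensors
func present(_ frame: EyeFrame) { print("displaying \(frame.width)x\(frame.height) per eye") }

// Phone side: the "controller" that renders frames for the HMD.
func renderFrame(for pose: HeadPose) -> EyeFrame { EyeFrame() } // stub; ARKit as compositor

// The per-frame loop: sensor data up to the controller, rendered frames back down.
for _ in 0..<3 {
    let pose = readSensors()
    let frame = renderFrame(for: pose)
    present(frame)
}
```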

From Here to ARternity

How Apple gets from phone-tethered smart glasses to a fully realized spatial-computing platform—or how long it takes to do so—remains unclear, but elements of the road map are hidden in plain sight. “A lot of the tech they’ve already built and fully deployed is critical to their goal of building a discreet AR HMD platform,” Kuang says. As an example, he points to last week’s announcement that the iPhone 11 models can take photos of pets in Portrait Mode: “That’s a good example of them working in little tweaks that don’t appear to have relevance to AR, but are super-meaningful if you’re a developer. The ability to recognize nonhuman faces significantly expands your ability to build tools and experiences.”
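The closest public-facing example of that capability shipped with iOS 13 itself: the Vision framework’s VNRecognizeAnimalsRequest, which detects cats and dogs in an image. (Portrait Mode’s pet support is a separate camera pipeline; this is simply the nearest developer API that illustrates Kuang’s point.)

```swift
import UIKit
import Vision

// Detect cats and dogs in a still image with Vision's animal recognizer,
// added in iOS 13. Each result carries a bounding box plus labeled confidences.
func detectAnimals(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }
    let request = VNRecognizeAnimalsRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedObjectObservation] else { return }
        for observation in observations {
            for label in observation.labels {
                // label.identifier is an animal name such as "Cat" or "Dog"
                print("\(label.identifier) (\(label.confidence)) at \(observation.boundingBox)")
            }
        }
    }
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```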

Two acquisitions Apple has made in recent years also suggest how the company might get there. Kuang traces the current StarBoard testing mode to the 2017 acquisition of a company called Vrvana. At the time, Vrvana’s chief product was a mixed-reality headset—but rather than rely on a transparent “waveguide” display like those in the HoloLens or Magic Leap One, it used front-facing cameras to deliver passthrough video to the user. (This is also how a company like Varjo delivers mixed reality using a VR headset.)

“It ruffled some feathers because nobody was really down with a discreet headset using pass-through,” Kuang adds of Vrvana. “But the StarBoard stuff presents exactly that: a Google Cardboard sort of functionality for iPhones. It’s obviously for testing purposes, but it maybe gives us a little more insight into how Apple has been testing AR without having to resort to building a couple of hundred waveguide-enabled devices for testing purposes.”

Apple’s other strategic move, the 2018 purchase of Colorado company Akonia Holographics, appears to have had two motivations: the waveguide displays Akonia was working on, and the “holographic storage” that was the company’s original goal. The term, which refers to storing and accessing data in three dimensions rather than on the surface of a material (as with optical storage), has long eluded commercialization, but could prove pivotal to the long-term vision of AR. “The utopian vision of the end user device is super-lightweight and does functionally no computing compared to where we currently are,” Kuang says. “Everything happens on the cloud. The kind of speed and transfer that comes with holographic storage could be a key part of that.”

Kuang points to another recent Apple patent, published just last week, proposing an AR display that delivers three-dimensional imagery through an Akonia-like waveguide system. In his view, it confirms the company’s commitment to getting past the limitations of today’s devices—particularly the eyestrain that results from trying to focus on virtual objects and real-world ones at the same time. “The fact that Apple is acknowledging it’s a big problem and intends to fix it is huge,” he says. “It’s more than Microsoft can be said to be doing.”

It also suggests that while the iOS discoveries speak to an interim device, they’re likely just the beginning. Much has been made of Apple’s push into services to offset declining iPhone revenue; subscriptions like Arcade and TV+ are steps toward the company’s stated goal of making more than $50 billion from such services annually. But that doesn’t solve the question of what comes after the phone—and Boland sees AR as an integral part of any “succession plan” for Apple.

Kuang agrees. “It’s a very forward-looking vision for AR,” he says of Apple’s approach. “They’re treating it as a computing modality rather than a display modality, which is critical.”


For deeper XR data and intelligence, join ARtillery PRO and subscribe to the free AR Insider Weekly newsletter. 

Disclosure: AR Insider has no financial stake in the companies mentioned in this post, nor received payment for its production. Disclosure and ethics policy can be seen here.

Header image credit: Ray Ban