Apple Brings Spatial Computing to iOS

Apple debuts new immersive features for upcoming iPhone 15 Pro models, as the firm plans deep cross-pollination across its device ecosystem

Published: September 13, 2023

Rory Greener

Apple is gearing up for the Vision Pro debut. The firm’s first XR headset is due soon, and with it comes a spatial computing ambition that invites comparison with Lenovo, Microsoft, Meta, HTC VIVE, and other XR headset vendors.

However, a key advantage Apple holds over some of its competition is the already-established iOS ecosystem of products and software. In its initial Vision Pro showcase, Apple highlighted how headset users can leverage the Vision Pro spatial computing framework alongside other first-party products and services – from workplace applications to hardware such as the Apple TV.

If Apple successfully leverages its product ecosystem alongside its new MR headset, the firm could be in a unique position to offer an immersive device with a launch-ready lineup of compatible, familiar software and hardware – much of which users may already own – which could boost usability and adoption rates.

Moreover, Apple is pushing its Vision Pro SDK to support its first-party and Unity-supported third-party immersive applications, again in a bid to pair sophisticated hardware with a system-selling lineup of integrated applications and devices.

On the other hand, Apple is not the only XR firm pushing for interoperability and cross-pollination of hardware and software. For example, Microsoft is gearing up for a 2024 industrial Metaverse roadmap, which sees the firm working with groups like Magic Leap and Qualcomm to provide immersive workplace solutions that operate alongside pre-existing technology stacks.

Moreover, many alternative XR firms, such as Xreal, are leveraging Android and iPhone hardware to push separate spatial computing visions, allowing a vast range of hardware users to adopt a far cheaper AR product offering. Notably, Sightful introduced the Spacetop earlier this year, a screenless spatial computing laptop that works with Xreal AR glasses. It allows users to interact with the virtual screen in various ways, such as using hand gestures to control the cursor or their voice to dictate commands – very similar to the Vision Pro promise.

While observers didn’t realize it at the time, the Xreal/Sightful Spacetop is incredibly similar to Apple’s Vision Pro and spatial computing roadmap. Just days after the Spacetop announcement, Apple revealed its Vision Pro product.

Both firms expect spatial productivity solutions to lead the XR market, with Xreal claiming the avenue will provide the world’s “first 100M AR killer use case,” meaning that laptop-replacing spatial computing solutions could be the first AR use case to gain a user base of 100 million people. Apple, too, most likely believes that its own spatial computing vision will transform consumer adoption rates.

The Benefits of XR Interoperability and Cross-Pollination

Competition is clearly fierce. From leading digital productivity firms like Microsoft to fresher players like Xreal, Apple will have to work hard over the coming 12 months to persuade general audiences and hardcore XR fanatics that the iOS immersive infrastructure is the next step in boosting productivity and revolutionizing the way the world interacts with digital information.

To lead the immersive space race, Apple is smartly and gradually integrating spatial computing features into the iOS ecosystem.

Already, Apple iPhone and iPad products come with many AR features. Increasingly, smartphones are the most common way audiences, both consumer and professional, interact with XR.

From placing AR filters on Instagram to placing high-quality digital twins on a construction site, smartphone cameras are incredibly sophisticated and allow end-users to leverage XR today.

However, compared to headsets, smartphone immersive integration comes with developmental and usability challenges. A notable issue is phone auto-sleep functions, which commonly create hurdles for AR placement services.

On the other hand, XR headsets can overcome smartphone shortcomings thanks to their purpose-built immersive functions. Combining pre-existing XR-ready smartphones with an immersive headset could significantly enhance current XR use cases.

XR Comes to iOS!

Apple is aware of the benefits of device cross-pollination and interoperability. Recently, the firm debuted a series of immersive features for its latest iPhone models to fall in line with the broader Vision Pro roadmap.

Earlier this month, Apple introduced plans to debut a new App Store for the Vision Pro that distributes services and games for the upcoming MR headset. Coming with the new storefront will be “hundreds of thousands of iPad and iPhone” applications that work on the Vision Pro OS (visionOS).

The move allows pre-existing and new iPhone and iPad applications to work on the Vision Pro device. During its WWDC showcase event, Apple already highlighted how users could leverage classic first-party productivity applications on the Vision Pro device.

Now, with the Vision Pro App Store and the general availability of the device’s SDK for developers, Apple is opening up its upcoming MR headset for a trove of system-selling services ready for its 2024 debut, covering various use cases from gaming to communications to productivity to entertainment.

Moreover, Apple notes that the cross-pollination of iOS applications to the Vision Pro allows “most iPad and iPhone” services to work on the Vision Pro. Therefore, developers can extend an application’s audience reach with “no additional work required.”

Apple will, by default, publish iPhone and iPad applications on the Vision Pro App Store automatically, instantly increasing the number of use cases for the product.

In an official statement, Apple said:

Most frameworks available in iPadOS and iOS are also included in visionOS, which means nearly all iPad and iPhone apps can run on visionOS, unmodified. Customers will be able to use your apps on visionOS early next year when Apple Vision Pro becomes available.

The Vision Pro App Store is coming this fall, says Apple. The firm is also working on an upcoming beta release of visionOS that will include the App Store; this should prove fruitful, as developers are already seeing great success with the available SDK framework.

Despite strides in porting iOS services to visionOS, Apple acknowledges that bringing a 2D application to the Vision Pro spatial computing SDK is not always simple. However, Apple is supporting applications that may face issues.

Via App Store Connect, Apple will notify developers if their application is incompatible and won’t be available on the Vision Pro. If an iOS application requires a capability unavailable on Apple Vision Pro, Apple will support developers in providing alternative functionality or updating a service’s UIRequiredDeviceCapabilities values to suit visionOS.
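
As an illustration of what providing alternative functionality might look like in practice, the sketch below branches on the platform at compile time, assuming the visionOS platform condition available in Xcode 15; the view names and the rear-camera fallback scenario are hypothetical, not part of Apple’s guidance.

    import SwiftUI

    // Hypothetical example: an iOS capture flow that falls back to a photo
    // picker on Vision Pro, where a rear camera is not available to apps.
    struct CaptureOrFallbackView: View {
        var body: some View {
            #if os(visionOS)
            PhotoPickerFallbackView()   // alternative functionality for visionOS
            #else
            CameraCaptureView()         // original camera-based flow on iPhone/iPad
            #endif
        }
    }

    // Placeholder views for illustration only.
    struct PhotoPickerFallbackView: View {
        var body: some View { Text("Pick an existing photo") }
    }

    struct CameraCaptureView: View {
        var body: some View { Text("Live camera capture") }
    }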

Moreover, in a recent post from Apple, Chris Flick, an Apple AVFoundation Engineer, stated that Apple is “supporting a new spatial experiences paradigm.”

Flick also added:

We’ve made it as easy as possible to bring your existing 2D content to a spatial experience. With some small modifications to your current 2D pipeline, you can support 3D content using MV-HEVC. You can even continue to use all your existing captions from 2D assets. But if you provide timed metadata, those captions can be unobscured and provide a comfortable viewing experience.
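
On the captions point, one way to see which caption tracks an existing 2D asset already carries is to inspect its legible media-selection options with AVFoundation. This is a minimal sketch, not Apple’s published pipeline; the asset URL is supplied by the caller.

    import AVFoundation

    // List any caption/subtitle options already present in a 2D video asset.
    func listCaptionOptions(for url: URL) async throws {
        let asset = AVURLAsset(url: url)

        // Ask the asset which media-selection characteristics it offers.
        let characteristics = try await asset.load(.availableMediaCharacteristicsWithMediaSelectionOptions)

        guard characteristics.contains(.legible),
              let group = try await asset.loadMediaSelectionGroup(for: .legible) else {
            print("No caption tracks found")
            return
        }

        // These are the existing 2D captions that could carry over to spatial playback.
        for option in group.options {
            print("Caption option:", option.displayName)
        }
    }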

Developers can also check whether an application is supported on the Vision Pro via a visionOS simulator available in the Xcode 15 beta, which allows developers to “easily” test a service’s core functionalities in a spatial computing environment. To access this testing service, developers must submit an iOS application for compatibility evaluation or sign up for Apple’s developer labs.
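
For the simulator workflow itself, a plain UI test target is one low-effort way to exercise an app’s core flows against the Apple Vision Pro simulator in the Xcode 15 beta. The test below is a rough sketch; the “Welcome” label is a hypothetical UI element, not a real requirement.

    import XCTest

    // Minimal UI test intended to run against the visionOS simulator.
    final class CoreFlowTests: XCTestCase {
        func testMainScreenLoads() {
            let app = XCUIApplication()
            app.launch()

            // Confirm that a core screen appears when the app runs in a spatial environment.
            XCTAssertTrue(app.staticTexts["Welcome"].waitForExistence(timeout: 5))
        }
    }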

Spatial Recording Comes to iOS!

In other news, Apple recently released updates concerning its upcoming iPhone 15 Pro. Within its showcase earlier this week, the firm debuted a lineup of new features to sell the latest smartphone iteration to audiences.

In terms of XR, Apple is introducing spatial video recording for the latest iPhone model. The feature enables device owners to create 3D captures of moments, allowing Vision Pro operators to play the footage back and watch it as an RT3D MR visualization – viewable from various angles.

Spatial video recording is coming to Apple smartphone devices next year, significantly improving the product portfolio’s XR abilities from its AR-lite beginnings and readying the massive user base for the Vision Pro 2024 debut – hopefully swaying smartphone owners to buy Apple’s new product.

The news comes as Apple seemingly plans to integrate XR features into other areas of its product portfolio.

In August, trusted Apple analyst Ming-Chi Kuo noted that the upcoming AirTag 2 product may come with immersive tracking features. Kuo said that the updated tracking device will debut in the fourth quarter of 2024, with Vision Pro compatibility arriving at launch.

Kuo also believes spatial computing is “a new ecosystem that Apple wants to build, using Vision Pro as the core to integrate other devices, including AirTag 2.”

In 2023, reports emerged highlighting how code in iOS version 13 suggests that users will soon be able to find AirTags via AR visualizations.

AirTag 2 with AR locational features could easily integrate into the Vision Pro MR framework. However, the AirTag AR features may also work in tandem with Apple iPhone devices, as modern iPhones come with cameras that support AR visuals. This could mean iPhone users find their AirTag 2 devices via AR overlays through a device’s camera, now enhanced with spatial recording capabilities.

Moreover, recent reports emerged showing that Apple is developing an AR windshield product – a concept the firm first started working on in 2015!

Based on the patent application, Apple will leverage sensors like infrared cameras, LiDAR, and traditional cameras to enable AR visualizations that improve vehicle usage and safety features.

According to the patent, Apple’s AR driving solution could display data such as speed bumps, stabilize driving performance by communicating with the vehicle, and show driving restrictions such as school zones to alert drivers to speed limits. Moreover, additional passthrough options could provide views of traffic signs hidden behind objects.

First-Hand Developer Experience and Porting iOS Applications to XR, Today

Developers are already having great success with the visionOS SDK. Last month, Apple released a series of curated developer experiences highlighting the benefits of the SDK and of working from one of Apple’s Developer Labs, a space for leveraging the firm’s hardware, software, and resources to create applications across the Apple OS ecosystem. The spaces also let developers test and optimize immersive applications for visionOS while encouraging XR developers to create content under the guidance of Apple – helping to grow developer talent pools and drive innovation.

For example, Michael Simmons, the CEO of Flexibits, said that the experience of working in an Apple Lab for the first time was “fantastical. It felt like I was part of the app.”

Simmons worked at a Cupertino-based lab, which he described as a “proving ground” for immersive application innovation and growth. Furthermore, he noted how the lab provides a space for XR developers to push the entire Apple OS ecosystem beyond its limitations. In combination with the Vision Pro device, XR creators working from a Developer Lab can experience Apple’s brand of spatial computing first-hand, expanding their AR/MR/VR knowledge.

Simmons explained:

A bordered screen can be limiting. Sure, you can scroll, or have multiple monitors, but generally speaking, you’re limited to the edges. Experiencing spatial computing not only validated the designs we’d been thinking about — it helped us start thinking not just about left to right or up and down, but beyond borders at all.

Another XR developer, David Smith, noted how the Apple Lab he visited “checked everything off my list.” Smith explained how working with Apple Vision Pro and spatial computing first-hand helped him understand how to develop for the “boundless canvas” that visionOS provides – remarking how the learning space enabled him to “stop thinking about what fits on a screen” and thereby helped him and others “make better apps.”

Smith also gave details on visionOS’ on-site testing feature, adding, “I’d been staring at this thing in the simulator for weeks and getting a general sense of how it works, but that was in a box. The first time you see your own app running for real, that’s when you get the audible gasp.” The experienced developer remarked how an Apple lab visit helps answer questions only answerable once a developer is working with a Vision Pro device.

Smith also noted:

It’s not necessarily that I solved all the problems — but I solved enough to have a sense of the kinds of solutions I’d likely need. Now there’s a step change in my ability to develop in the simulator, write quality code, and design good user experiences.

Ben Guerrette, Chief Experience Officer of Spool, noted how his team leveraged the Apple XR lab to explore new spatial interactions for its smartphone application. He explained how the lab helped his firm take its screen-based game to an RT3D environment, adding that the facility provides an “incredibly valuable” learning experience which “gives us the chance to say: OK, now we understand what we’re working with, what the interaction is, and how we can make a stronger connection.”

Finally, Chris Delbuck, a Principal Design Technologist at Slack, explained how the labs allow developers to get hands-on with the Apple Vision Pro device and understand the potential of XR and spatial computing. “It instantly got me thinking about how 3D offerings and visuals could come forward in our experiences,” Delbuck added.

Enterprise-Grade Services Are Coming to Vision Pro

The curated feedback comes as enterprise-grade solution providers are leveraging the available SDK to ready productivity solutions for Vision Pro’s 2024 debut.

Unified communications (UC) solutions provider Wildix is bringing MR enterprise applications to Vision Pro, starting by debuting its esteemed digital workplace solutions on Apple’s XR headset.

Dimitri Osler, the Founder and CTO of Wildix, explained:

This just shows how effectively development can happen within small teams dedicated to the purpose. Just like with WebRTC, we see the potential of this technology, and we are proud to be the first to have created it.

Wildix is porting its UC software to the upcoming headset, allowing Vision Pro users to access meeting, text chat, and phone call features within the spatial computing OS.

The immersive UC service also allows Vision Pro users to access Wildix’s brand features in fully developed virtual meeting rooms – accessible to users worldwide who want to connect and collaborate.

Many other immersive services, such as the Metaverse application Rec Room, are coming to the Vision Pro, and as Apple’s release window comes closer, more applications will surely arrive on the device.

More on Apple Vision Pro

While some audiences are sceptical of Apple’s pricey entry into the market, competitors recognize Apple’s potential. Earlier this month, reports highlighted how Google and Samsung face a “great fear” when Apple releases new products.

The Vision Pro device is due in 2024 for $3,499. Apple’s Vision Pro headset sets itself apart using a three-layered approach (sketched in code after the list below) to ensure users have an accessible, easy-to-use, and flexible spatial computing interface:

  • Windows, which represent the device’s 2D user interface.
  • Volumes, which provide RT3D immersive experiences.
  • Spaces, which create the spatial computing environment in which Volumes and MR applications exist.
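
For developers, these three layers map onto visionOS scene types in SwiftUI. The sketch below is illustrative only, assuming the WindowGroup, volumetric window style, and ImmersiveSpace APIs Apple has documented for visionOS; the scene identifiers and placeholder content are hypothetical.

    import SwiftUI

    @main
    struct SpatialSketchApp: App {
        var body: some Scene {
            // Window: a familiar 2D interface.
            WindowGroup(id: "gallery") {
                Text("2D window content")
            }

            // Volume: a bounded RT3D scene placed in the user's surroundings.
            WindowGroup(id: "model") {
                Text("3D model goes here")
            }
            .windowStyle(.volumetric)

            // Space: the surrounding environment in which MR content exists.
            ImmersiveSpace(id: "immersive") {
                Text("Immersive content goes here")
            }
        }
    }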

Supporting this three-layered approach, the Vision Pro contains competitive features that empower its spatial computing structure – including a custom Apple M2 chip, Apple’s purpose-built R1 sensor-processing chip, 23 million pixels across two micro-OLED displays, high-dynamic range (HDR) and wide colour gamut (WCG) outputs, a 2-hour battery life, an immersive camera for capturing spatial audio/photos/video for peer-to-peer sharing, iPhone/iPad/Mac synchronization, a light seal, a LiDAR scanner, and a TrueDepth camera.

Despite growth and outreach plans via its Lab spaces, Apple is facing some hurdles in getting Vision Pro devices to market.

In July, reports suggested that Apple’s overseas manufacturing partner Luxshare Precision Industry Co. reduced its initial product assembly forecast to 400,000 units, down from Apple’s 1 million unit forecast and Luxshare’s internal forecast of producing 18 million units annually in the coming years.

Two of Apple’s component manufacturing partners also reduced production forecasts to roughly 130,000 to 150,000 units.

According to the July reports, Apple is facing manufacturing “complexity”, leading to “difficulties” in production stemming from the device’s micro-OLED displays and curved, outward-facing lens. The firm also expressed dissatisfaction with some of its production partners.

The news came following reports highlighting how the production of Apple’s Vision Pro device will only cost its manufacturers roughly $1,590, far less than the device’s $3,499 market price.
