Apple Hiring AR/VR Frameworks Engineer For 'Entirely New Application Paradigm'

Apple posted a job listing for an ‘AR/VR Frameworks Engineer’, with the role described as “developing an entirely new application paradigm”.

The listing was shared by current Apple manager Hayden Lee on Twitter. A pioneer in the social VR space, Lee co-founded Convrge and Bigscreen. He was hired by Apple in mid-2020 to work on AR/VR, and was made a manager in October, according to his LinkedIn page.

A software framework is a collection of components, assets, and functionalities developers can use instead of reinventing the wheel. What makes this listing interesting is how it hints at the approach Apple might take to AR/VR software.

The role is listed as involving “distributed systems” (software that runs across multiple coordinated, networked computers, or in this case headsets) and asks for experience with real-time systems. Here’s how the job itself is described:

You will be researching and developing an entirely new application paradigm – a challenge that will demand rapid experimentation and prototyping without sacrificing code quality or attention to detail. Working closely with Apple’s UI frameworks, Human Interface designers and system software teams – this role will push you to think outside-the-box, and solve incredibly ambitious and interesting problems in the AR/VR space. You will have access to a wide variety of internal frameworks and services that will allow you to build software that is deeply integrated into our operating systems.

The reference to “an entirely new application paradigm” that is “deeply integrated into our operating systems”, appearing in a networked framework role, suggests Apple may be exploring high-level tools that let developers build inherently multi-user spatial apps. Such a framework could produce apps that run seamlessly across multiple headsets in the same physical space, or over the internet, with synchronized components like UI that follow best practices.
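Apple hasn’t announced such a framework, but its shipping RealityKit API already hints at what this could look like: entities in a scene can be synchronized across nearby devices by attaching a MultipeerConnectivity session to the scene. A minimal sketch (peer discovery and delegate wiring are omitted; this uses today’s API, while the listing suggests something more ambitious):

```swift
import RealityKit
import MultipeerConnectivity
import UIKit

final class SharedSceneViewController: UIViewController {
    let arView = ARView(frame: .zero)

    func enableSharing() {
        // A MultipeerConnectivity session over which entity state is shared.
        let peerID = MCPeerID(displayName: UIDevice.current.name)
        let mcSession = MCSession(peer: peerID,
                                  securityIdentity: nil,
                                  encryptionPreference: .required)

        // RealityKit's built-in synchronization service mirrors entity
        // transforms and ownership to every connected peer automatically.
        arView.scene.synchronizationService =
            try? MultipeerConnectivityService(session: mcSession)
    }
}
```

With this in place, entities added to the scene on one device appear on connected peers without the developer writing any networking code, which is the kind of abstraction a dedicated multi-user framework would presumably take much further.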

All of this is possible today, either manually or with a combination of third-party frameworks in engines like Unity and Unreal. But with RealityKit and Reality Composer, Apple has already shown a desire to provide higher-level tools that handle tasks like materials, shadows, physics, animations, spatial audio, and more. This lets Apple focus on perfecting these core technologies while developers focus on building the actual content, rather than each replicating them. For the user, that means more consistency and quality across applications, even those made on a low budget.
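As an illustration, RealityKit already lets a developer place a physically simulated, realistically lit object into a scene in a handful of lines, with no rendering or physics code of their own. A rough sketch:

```swift
import RealityKit
import UIKit

func addBox(to arView: ARView) {
    // A 10 cm metallic box; RealityKit supplies the shaders,
    // lighting response, and shadows.
    let box = ModelEntity(
        mesh: .generateBox(size: 0.1),
        materials: [SimpleMaterial(color: .blue, isMetallic: true)]
    )

    // Opt in to the built-in physics simulation.
    box.generateCollisionShapes(recursive: true)
    box.physicsBody = PhysicsBodyComponent(mode: .dynamic)

    // Anchor the box to the first horizontal plane ARKit detects.
    let anchor = AnchorEntity(plane: .horizontal)
    anchor.addChild(box)
    arView.scene.addAnchor(anchor)
}
```

A hypothetical multi-user application framework could extend this same pattern from rendering and physics up to networking and shared UI.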

(Image: Apple’s Reality Composer)

We have yet to hear details about ‘rOS’, the rumored operating system for Apple’s rumored AR/VR headset, which could launch as early as next year. But this listing hints that, despite all the hardware hype, Apple’s true innovation in this space could be a fundamentally different approach to software.
