#1216: First Impressions of Apple Vision Pro from Two XR Developers at WWDC

Apple finally announced their mixed reality headset, the Apple Vision Pro, at the WWDC keynote on June 5, 2023. It's a $3,500 device that by all accounts is the most advanced XR headset produced to this point, driven largely by eye tracking, hand gestures, and voice. I was not able to score an invite to be at WWDC myself, and so I was closely tracking all of the news in my WWDC23 Twitter thread. But I wanted to get some first-hand testimony from XR developers who were on the ground at Apple headquarters during the big announcements, to hear their perspectives, reflections, and insights after speaking with Apple employees who were there. I had a chance to speak with Sarah Hill, CEO of Healium, and Raven Zachary, COO of ARound, on Monday evening after the big announcements. Neither one was able to get a hands-on demo, but they were able to lay eyes on the device and bear witness to this historic announcement first-hand.

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. It's a podcast that looks at the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So in today's episode, I have a chance to talk to a couple of developers who were actually at the Apple keynote at their Worldwide Developers Conference, WWDC, where on Monday, June 5th, 2023, they announced the Apple Vision Pro. This is their entry into XR — or mixed reality, augmented reality, virtual reality. They actually prefer to use the term spatial computing, and they did say augmented reality, but it's their entry into the XR ecosystem. So the list price is $3,500, which is probably not going to be in the budget for most normal people. It is a pro device, which means that we're going to see a lot of developers and enterprises and lots of folks who are just tech enthusiasts who want to have the latest gadgets. I don't expect this to be a hugely popular headset, but I do expect that this is Apple's first foray into a long journey of developing each of these different technologies. They want to start with the absolute best. I had a chance to talk to a couple of developers out there. Raven Zachary is the chief operating officer at ARound; they do augmented reality experiences for live events. And Sarah Hill is the CEO of Healium, which makes meditation applications that have been on a lot of different XR platforms, often stuck in App Lab, struggling to get into the main store. So that's part of the larger dynamic where Sarah is really interested in seeing what happens with the XR ecosystem within the context of Apple. These were some initial impressions. Both Raven and Sarah were on site. I tried as hard as I could to work some back channels and finagle some sort of invite, but I was not on the radar of the comms people. I have listeners that I know are within the Apple compound. And so if you are within Apple and you enjoy the Voices of VR podcast, do me a solid and send a note to some of your comms folks there at Apple. I'd love to get on the invite list the next time there's going to be any type of demonstration of some of these technologies. I always appreciate being able to try all these things out and to do interviews with folks who are there. So yeah, unfortunately, I was not able to get an invite to this event, and so I was watching from afar. I like to watch things in real time and do a whole long Twitter thread, so I was trying to record everything that was happening during the keynote. Afterwards, there was a Platforms State of the Union that I went through, and then I was just scouring through all the different media reports and testimonials, trying to get as much information as I could. So I really wanted to talk to both Raven and Sarah, who were kind enough to give me a call as they were leaving the main Apple campus. We had a whole chat unpacking both the events of the day and some of their take on what the Apple Vision Pro means for the larger XR industry. So that's what we're covering on today's episode of the Voices of VR podcast. This interview with Raven and Sarah happened on Monday, June 5th, 2023. So with that, let's go ahead and dive right in.

[00:03:20.797] Sarah Hill: Yeah. Hi, everyone. I'm Sarah Hill. I'm the CEO of Healium, and we create mental wellness content that's powered by biofeedback.

[00:03:31.406] Raven Zachary: Great. Hi, I'm Raven Zachary. I'm the chief operating officer of a company called ARound — that's aroundAR.com. We do augmented reality for fan engagement. So we work with professional sports teams on the future of augmented reality and stadium experiences.

[00:03:46.407] Kent Bye: Great. I'd love to hear a little bit more context as to each of your backgrounds and your journey into XR.

[00:03:52.805] Sarah Hill: Yeah, absolutely. I apologize for joining you on the go — I'm just leaving Apple Park after the announcement today. I've been in immersive media since the days of Google Glass back in 2013, when we were giving a group of veterans virtual tours, live-streaming from our faces at the World War II, Vietnam, and Korean War memorials in Washington, D.C. I'm a former television journalist, and I developed Healium for me as well as for the millions of people who struggle with anxiety. So my background is as a storyteller — an immersive storyteller and a creator. And yeah, I'm excited to share what we learned today and the excitement around everything that was announced at Apple Park.

[00:04:39.465] Raven Zachary: Yeah, and like Sarah, I too am coming back from Apple Park. You'll see the steering wheel of my vehicle here on my video feed — I pulled over into a rest stop. Both Sarah and I are fresh off of the campus experience and headed back. I've been in tech for 27 years, most recently focused on the HoloLens, a device many of you will know as Microsoft's foray into the mixed reality space. Prior to that, I actually worked with Kent in the Portland, Oregon area on creating the Portland Virtual Reality Meetup. And before that I was involved in the iOS and iPhone app space, building apps for President Obama in 2008, the first Starbucks app, Amazon, Whole Foods, Ticketmaster, Live Nation. So I've really been interested in the intersection of Apple's tech and the mixed reality space.

[00:05:28.689] Kent Bye: Great. I was at Augmented World Expo last week, and there was certainly a lot of buzz and excitement about this impending Apple announcement, which ended up being the Apple Vision Pro. The list price is around $3,500, and my first reaction is that that's going to be out of a lot of people's price range. I don't know if this is going to be more of a developer kit for folks, because I've heard from some folks like Irena Cronin, who said that Apple estimates around 100,000 to 300,000 of these units are expected to be sold. But maybe we'll just start this conversation by hearing a little bit more of your first impressions of the announcement, and the vibe you were picking up on campus as you watched it live.

[00:06:09.260] Sarah Hill: Yeah, to me, the real story wasn't necessarily the hardware, although it looks impressive — just having seen it; they had hundreds of people through the Steve Jobs Theater just a few minutes ago to lay eyes on it. It's the ecosystem, and the fact that it's world-class hardware combined with a built-in ecosystem that people are familiar with. They know where to get content because they do it all the time on their iOS devices anyway. And to me, that's more important than any piece of hardware, any dynamic foveated rendering, or spatial audio ray tracing, or anything like that. And it's been tough for XR creators, right, building content for these platforms — specifically if your technology integrates with any other kind of third-party device. It's been tough to even get your apps discovered, right? With App Lab — creators have been stuck in App Lab for years. We've been there for three years, right? Three years. And we've managed to find a way to get our content distributed in enterprises. To me, the really exciting story about this device is its value for enterprises. And I don't know about you, but that was what I was excited about. There was an audible gasp. Other XR creators in the room — some of them were crying. Literally, they were throwing their hands up in the air. It was a huge celebration. That excitement, again, was not necessarily about the hardware, but the established ecosystem.

[00:07:47.891] Raven Zachary: Yeah, jumping back to a point you made at the beginning, Kent, about whether this is a dev kit — just to clarify, this is definitely not a dev kit. This is a pro device, and Apple is positioning it for the pro market. If you go to their developer website and dig down, there is a page that says come back in July because we will talk about dev kits. Apple has done dev kits in the past for some of their technology. What generally happens is there's a request form, you fill it in, and Apple sends a device — generally a preview. So think of it as an engineering evaluation unit that you have to send back after a set period of time; it allows developers to build things for the launch of a new product. So think of this as a pro product, meeting the needs of a different set of folks than the people who have bought a Meta Quest. It's a different price point, a different kind of consumer. I don't think it's going to only sell 100,000 to 300,000 devices — I think you're going to see much higher numbers. And yes, it's expensive, but I've got an Apple Card, as do lots of Apple developers, and I can put it on a no-interest three-year payment plan — $95 to $98 a month for three years. I think a lot of developers are going to do something like that to get their hands on a device and to have their apps ready for a new ecosystem.
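A quick check of that arithmetic: thirty-six equal monthly payments on a $3,500 device work out to

```latex
\frac{\$3{,}500}{36\ \text{months}} \approx \$97.22\ \text{per month}
```

which lands squarely inside the $95-to-$98 range Raven quotes (before tax; Apple had not published Vision Pro financing terms at the time of this conversation).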

[00:09:09.933] Kent Bye: All really good points. And Sarah, to go back to one of your points about the ecosystem: I was doing a whole Twitter thread covering the entirety of the keynote, and I'm sure for any of the XR developers watching along, the first two-thirds were essentially all of their existing ecosystem updates — probably not all that exciting up until the point where they actually announced the Apple Vision Pro. But what was striking to me was that they have macOS, iOS, iPadOS, watchOS, and now visionOS. So you have all these existing operating systems and over 300 different frameworks tying this whole software ecosystem together, with so many existing applications already on all their other platforms. As far as I could tell, you're going to have more or less all of these applications available, at least in 2D mode, when they launch this thing, and maybe some of these apps will add different 3D components. And with 4K-per-eye micro-OLED displays sitting that close to your face, you're not going to be able to wear glasses — you're going to have to buy Zeiss inserts.

[00:10:13.818] Sarah Hill: Yeah, they said there'll be prescription inserts available at launch.

[00:10:19.725] Kent Bye: Okay. So yeah, you'll have the inserts at launch. But at this resolution, it seems like this is going to be past a critical threshold, where we're potentially going to be able to have this as a screen replacement.

[00:10:31.143] Raven Zachary: Yeah, it seems like Apple has proven that you can have high-resolution pass-through as the solution for augmented reality. This was a huge debate for many years in the VR/AR community: what's the right technology for augmented experiences? Microsoft went very much with the other model, which was a semi-transparent display where you could see mixed reality images in your physical environment. Whereas Apple has gone with a high-resolution pass-through solution. I need to try it before I have an opinion, but the people who have tried it seem to be quite impressed.

[00:11:06.310] Sarah Hill: What was interesting to me was that we've all been talking for years about blending the different flavors of XR. XR wasn't mentioned, the metaverse wasn't mentioned at all, and you really saw that come through in their demos: they weren't showcasing only 3D content. They were showcasing 2D apps alongside 3D apps. And, you know, the ability to be ambidextrous as creators — we've been creating both AR and VR content because we knew there would come a day with a seamless handoff between a 2D app and a 3D space. So I thought that was fascinating. And with the ability to turn a dial between fully immersive and mixed reality — that's why we've been ambidextrous, playing in both of those spaces.

[00:11:59.011] Raven Zachary: Yeah, speaking to your point around terms — Apple is definitely still in the augmented reality and spatial computing terminology camp. This device is being treated as an augmented reality slash spatial computing device. I did see one unchanged xrOS reference in the Platforms State of the Union talk: when they showed the simulator, it was still showing the code name xrOS, which had leaked as a potential product name in trademark filings, I think a couple of months ago. But they've gone with visionOS and the Vision Pro instead of an XR or Reality nomenclature.
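Sarah's point about 2D apps living alongside 3D apps maps directly onto how visionOS structures an app's scenes. Here's a minimal sketch, assuming the SwiftUI scene types Apple showed at WWDC23 — a WindowGroup for a conventional window and an ImmersiveSpace for a fully immersive scene; the view contents are placeholders:

```swift
import SwiftUI

@main
struct BlendedApp: App {
    var body: some Scene {
        // A conventional 2D SwiftUI window, like an iPhone or iPad app.
        WindowGroup {
            LibraryView()
        }
        // A fully immersive scene the user can step into from that window.
        ImmersiveSpace(id: "immersive") {
            Text("3D content goes here") // stand-in for a RealityKit scene
        }
    }
}

struct LibraryView: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        // The same 2D UI vocabulary, with one button that crosses into 3D.
        Button("Enter immersive scene") {
            Task { _ = await openImmersiveSpace(id: "immersive") }
        }
    }
}
```

The same binary carries both presentation modes, which is exactly the 2D/3D handoff Sarah describes.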

[00:12:34.361] Sarah Hill: One of the other things I noticed on the device itself was the fabric. If it's an enterprise product, enterprises won't touch fabric headsets — not just in healthcare, but regular employers as well. They don't want any kind of fabric straps. And as we heard some of them talk afterwards, there are apparently some third-party fixes for that, which would address the hygiene concerns over that fabric.

[00:13:03.490] Raven Zachary: Yeah, it's a good point. The other thing that concerns me a little bit — a small point, but worth mentioning — is that I talked to an engineer on the team and asked about kiosk mode and multi-user mode, and they said no, this is going to be just like your iPhone. And if it's just like your iPhone, I'm not going to want to hand this device to somebody I don't know on a personal level to have them use it, because it's going to have my personal information within that experience. Now, I'm hoping that the engineer who gave me that information was incorrect, and that I'll have the ability to hand the device to someone in a mode that doesn't have my email and my browsing history and all the other stuff visible to them.

[00:13:42.043] Kent Bye: Well, yeah, they did say that there's going to be this thing called Optic ID that can identify you with an iris scan, and privacy came up again and again throughout the course of their presentation — they have a privacy-by-design architecture. So they definitely have these different ways of identifying you, and I'm not sure if they're going to have different users and ways to separate and segment out this information based upon who's logged in. But I imagine this may actually be a system that's mostly used by a single person, partly because of the way the lenses are set up. I mean, you're going to have to buy the Zeiss corrective lenses if you do have glasses, because there's no room to wear your existing glasses. There may be people who buy all the different inserts and swap them out — certainly at the Magic Leap LeapCon demos they were doing that. But I've also seen Magic Leap demos where they didn't have that, and it was not a great experience because the image wasn't very clear. So I don't actually expect the headset to be a very good fit for location-based experiences; it seems geared more toward individuals using it for these productivity use cases. They did say that the battery lasts around two hours, and that there's this external battery pack that you stick in your pocket so that all that weight's not actually on the HMD, keeping the headset a little more lightweight. And people may be able to tether it — I don't know if you heard anything about tethering.

[00:15:08.864] Raven Zachary: They do support that. I did hear two different people on the team say that you can unplug the battery and go direct to a USB-C wall adapter — which means, all of you devs out there, it's time to invest in a 30-foot-long USB-C cable.

[00:15:23.657] Kent Bye: Nice. Sarah, Raven — did you have any other insights from talking to other developers while you were on site? There was the main keynote, and then there was a Platforms State of the Union, which I took the time to watch as well. I did a big long Twitter thread to cover as much as I could, but I'm curious to hear what the day was like for you after the keynote, whether you watched the Platforms State of the Union, or just had other opportunities to engage with folks from the team.

[00:15:47.546] Sarah Hill: Yeah, in the Platforms State of the Union, one of the things they mentioned was WebXR. And afterwards, one of the engineers said that WebXR would be experimental in Safari, just as it is on your iPhone right now. So that was kind of interesting, and it opens up some new possibilities for WebXR inside the headset.

[00:16:09.685] Kent Bye: So there was nothing said during the keynote about WebXR — I was listening very closely to see if there were any WebXR updates. I did have a chance to talk with Brandon Jones, who is one of the spec editors for WebGPU. We also talked about WebXR, and he mentioned that Ada Rose Cannon, one of the spec editors for WebXR, is now on the Apple team. There was actually a meeting about WebXR that happened at Apple that Brandon talked about on my podcast. And Terry Schussler was saying on Twitter that he was talking to some developers afterwards, and that there would be an experimental flag for WebXR launching with the Vision Pro. So there was early news that it might be under an experimental flag, and maybe we'll learn a little bit more once there are more sessions throughout the course of the week at WWDC. But it's definitely been a little bit frustrating for anyone who's been developing for WebXR, just because it's been out in Chrome since like 2019, and we're all just waiting for it to ship in Safari.

[00:17:03.810] Raven Zachary: Yeah, there were several times when we went to the spatial computing discussions with engineers that they would say, we're not ready to announce that today; there will be a session on Wednesday or Thursday where we cover that topic. So my guess is that WebXR just did not fit into a keynote. It doesn't really affect the average iOS developer — the core audience of who was going to attend and who was going to watch this talk. So it's going to get parked into a Safari or web technologies subsession, and that's probably where we'll hear about it.

[00:17:34.172] Kent Bye: Yeah, in terms of the different applications they were showing, they're mostly things like entertainment, watching movies, productivity — a lot of screen-replacement types of applications. And they did show five different developer applications from the community. They showed Complete HeartX, which was basically an anatomy viewer. They had a couple of things viewing spatial CAD with JigSpace — they had a Formula One car they were showing off. They had Stages from PTC, which was for reviewing and approving different CAD drawings. There was djay, which was basically mixing tracks, using your hands to interact and modulate the music. And then there was Sky Guide by Fifth Star Labs, which was a little more like a personal planetarium. But for me, the thing of particular note was that there weren't any controllers. There are no buttons — there's just the ability to pinch your fingers. And it's got 12 different cameras, five sensors, and a whole R1 chip to process all those inputs. But it's basically eye tracking to guide between the different icons, pinching with your hands to click, and being able to use your voice to interact as well. So it's a little bit concerning, just because there's no ability to do direct ports of existing XR experiences that use all these different controllers and control mechanisms. But maybe it's actually going to force applications into a completely new user interface that's completely hands-free, not tethering people to controllers with buttons that can be confusing for people who are brand new to the industry. On the one hand, maybe it's going to be difficult for a lot of existing XR developers, forcing them to reevaluate how to interact with a whole range of different types of experiences. But on the other hand, it may open doors to completely new user interface opportunities, especially with eye tracking as the primary mode of engagement. So I don't know if either one of you has some initial thoughts on that.

[00:19:24.987] Sarah Hill: I certainly think it'll be easier to port mobile augmented reality apps than VR apps. Having mapped controllers from one headset to another, I'm not sure which is harder: mapping controllers from one headset to another, or mapping from controllers to none at all. But they indicated that they will have developer labs in Cupertino, in Shanghai, in London, and some other places, where you can go after you port your app and they can check it in person for compatibility. So you won't have the physical device to check it yourself, but you can go into Cupertino and they can check it. And I believe the signup for that opens shortly after the SDK is out in the wild. One of the things I was excited about, certainly as a company that builds Unity apps, is that AR Foundation is all going to be integrated in there. It's all the same tools and toys creators are familiar with. So that will be helpful — along with SwiftUI and ARKit, as well as learning the new visionOS. So I was excited to see that.

[00:20:39.611] Raven Zachary: Yeah, on that comment — I think Unity is really important. It's great to see that Apple cares about existing AR and VR developers and has Unity as a pipeline to get an app in here. But here's an opinion that may not be particularly popular with your listeners: AR and VR developers aren't really Apple's target here. To draw a parallel to what we saw with the launch of the iPhone, the people who really benefited from the iPhone were not the Symbian and early Windows Mobile developers. It was the Mac developers, who already had an existing understanding of Objective-C and how to create applications that were Mac-like for the iPhone. Apple does care about AR and VR developers, but they care a lot more about Swift developers. So when you saw that the keynote was so focused on 2D apps at the beginning of the Vision Pro launch, they're really trying to meet the needs of a developer base that probably isn't doing AR and VR apps today. And they want to grow this market. That doesn't mean that AR and VR developers who have been building for other platforms don't have an opportunity — they do, with Unity — but Apple's not targeting this at us. And I say us as, kind of, the Voices of VR listeners, the community, the people who go to AWE, the people who went to SVVR in the past. That's not their target. The target is the guy who built an iPad app who wants to do more.
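To make the controller-free input model concrete: based on what Apple showed in the WWDC23 sessions, look-and-pinch surfaces to developers as ordinary SwiftUI gestures targeted at RealityKit entities — the system handles the gaze targeting and pinch detection for you. A minimal sketch, with the sphere and the scale-up response as illustrative placeholders:

```swift
import SwiftUI
import RealityKit

struct PinchableSphereView: View {
    var body: some View {
        RealityView { content in
            // A placeholder entity; a real app would load a USDZ asset.
            let sphere = ModelEntity(
                mesh: .generateSphere(radius: 0.1),
                materials: [SimpleMaterial(color: .blue, isMetallic: false)]
            )
            // Opt the entity into system input and give it collision
            // shapes so the eye/hand input can hit-test it.
            sphere.components.set(InputTargetComponent())
            sphere.generateCollisionShapes(recursive: true)
            content.add(sphere)
        }
        // The user looks at the sphere and pinches; no controller involved.
        .gesture(
            TapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    value.entity.scale *= 1.2 // illustrative response
                }
        )
    }
}
```

Notably, the app never sees raw eye-gaze data here — it only learns which entity was tapped — which lines up with the privacy-by-design posture Kent mentions earlier.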

[00:22:04.799] Kent Bye: Yeah. So there are going to be these Vision Pro developer labs in London, Munich, Shanghai, Singapore, and Tokyo, as well as in Cupertino. So it sounds like, even if there aren't going to be dev kits for everyone who wants to try it out, they're actually going to be releasing a lot of developer tools within the next couple of weeks: the visionOS SDK, Xcode with the simulator, as well as Reality Composer Pro — all this new stuff to tie together and bring in different 3D assets and animations. All of that is launching within the next couple of weeks, and you'll be able to start building stuff out by the end of the month. But in order to actually test it on their hardware, you're going to have to either go to one of these cities and try it out, or they may send out some dev kits to some of the premium developers who are able to get access. I did notice that while they were showing some of the different applications, there was a shot of Rec Room — I know Sean Whiting on his Twitter had highlighted that Rec Room was listed as one of the applications during one of the demos. But Rec Room is an application that actually requires controllers to be able to look and move around. Maybe they're going to have a 2D interface for that. Or — I know that VRChat has developed some hand-controlled ways to locomote around their system, playing around with Quest hand tracking to locomote, move around, gesture, and operate their menus. So I know they're already starting to play around with that, and I imagine all sorts of different gestures are going to be developed to engage in more sophisticated ways with these systems. But at the moment, if there's no controller, then how do you navigate? How do you move around? Those are going to be some interesting points to figure out as this all continues to develop. It's not necessarily designed out of the box for the XR community, but maybe to pull in some of these existing ecosystems within Apple. And to go back to what Sarah said, it's the ecosystem again and again: you're looking at your laptop and you click a button or whatever, and it essentially turns your laptop monitor into a floating 4K monitor that you can move around in space. So there are all these ways they're going to have different integrations with their ecosystem — with the Mac, iPhone, and Watch — and I'll be very curious to see how they continue to integrate all these systems together. It's really quite interesting to see how this fusion between the existing 2D app ecosystem is evolving and moving into more spatialized versions of each of these apps. I think anybody who's a seasoned XR developer may see some of the demonstrations being shown and find there's nothing in there that's necessarily super XR, or even virtual reality. I did note that they never actually said virtual reality. They did mention augmented reality at the very beginning, and they did emphasize spatial computing. But they also never said artificial intelligence — they would always say machine learning instead. So their choice of language here seemed very deliberate. And I guess the other thing I'll point out is that it's all about collaboration, communication, productivity, and entertainment so far.
And they have these things called digital personas. They showed a guy basically holding up one of the Apple Vision Pros to take something like a selfie — to get a spatial capture of his face. This creates a high-resolution "digital persona," which is essentially an avatar, so that when you're occluded within VR, or within these headsets — since I guess they're not really calling it VR, but I think of it as VR — when you're inside the headset, other people see an animated version of your face, because they're using the facial tracking, eye tracking, and hand tracking to drive this digital persona. And then when other people are in physical reality with you, and you're looking at them through the mixed reality pass-through, they actually see your eyes on the front of the device: it's capturing your face through the eye tracking and all the different sensors, and then projecting that onto the curved OLED screen on the front, which allows you to maintain eye contact with people in the physical reality around you. I thought that was a very interesting way to show how they're really thinking about making sure that when you're in this device, you don't feel isolated or disconnected — you can see the people in the world around you, and people can see what you look like and what your eyes are doing in these experiences.

[00:26:11.762] Sarah Hill: And if anyone listening wants to get a look at what it looks like themselves, Apple has released an AR asset, essentially, that you can put right on your desk and look at to see what the device looks like. So I think that's pretty fascinating. After the platform sessions — "spatial computing," that was what they said as well, not metaverse or VR. But you could see the joy on the faces of the team members, who said they'd been working on it for six years, right? And they're like, we haven't been able to talk about it. And finally — they said it was kind of like birthing a baby — they can finally share what they've been working on. It was a little bit of fictional finalism, in a way, for some of them: we knew this day would come, but to see it out, see people talking about it, and see it up on the big screen had to be incredibly validating for them.

[00:27:12.553] Raven Zachary: Yeah, the other important point to make is that this is the first of many, many generations of Apple's platform with visionOS. It's going to grow and evolve and change over time, and what you see today is not the product that you're going to have two or three years from now. Think about the tremendous amount of change the iPhone went through in its early years. I mean, remember, the first year we didn't even have a native SDK — we had to do web apps. There were considerable limitations to the iPhone platform, and it evolved very dramatically over the first four or five years. That's going to happen here. And there's going to be a product that's not a pro product, at a lower price point and lighter weight, and Apple will continue to evolve this. But if you look at Apple's history, they don't generally sacrifice margin for market share, right? Apple comes in at a premium price point, builds the best possible product in the market, and gains customers by having a premium product. This is true on virtually all of their platforms, and it's a comfortable place for Apple to be. And, you know, Meta can make all sorts of devices at the sub-$1,000 or sub-$2,000 price points and do well in this environment. There is room for multiple hardware companies, but Apple has seized the high end. And I don't know what that means for the high-end VR companies. I don't know what it means for Magic Leap. I'm a lot more scared for Magic Leap than I am for Meta right now.

[00:28:50.933] Kent Bye: Yeah. And although there may actually be a deal between Meta and Magic Leap — Magic Leap may have licensed some aspects of their patents or hardware, or maybe they're even potentially working more directly with Meta on some of their AR headsets — there have been some reports and rumors about that, but nothing that I've confirmed. But just the fact that Apple has been working on things like ARKit — it's been out there for a long time in their mobile ecosystem. So they have this fusion of all these existing frameworks that they're bringing together for Apple Vision Pro. They're combining things like SwiftUI and UIKit, which do a lot of the user interface work and have typically been for their existing iOS apps — really for 2D interfaces, not so much the more 3D user interfaces. Then there's RealityKit, which is being used to present the 3D content, animations, and visual effects. And then they're bringing in ARKit to understand the space around you — to do things like plane estimation, scene reconstruction, image anchoring, world tracking, and hand tracking. Hand tracking is actually new in this version of ARKit for Apple Vision Pro; it hasn't been part of the mobile version, but they're adding it at launch for the Vision Pro to actually track your hands. So that's part of ARKit now. They already have an existing ecosystem of all these frameworks, including ARKit, that's been doing things like world segmentation, matting, lighting, persistence, and world mapping. They've been building all this out within the context of their 2D mobile market, but once this headset comes out, they're able to integrate it with all their existing systems and other frameworks. And that's the thing, again, that's really striking to me: they said during the presentation that this is the first operating system built from the ground up for spatial computing, whereas everything Meta has been doing is built upon Android. I take Apple's claim seriously because, you know, Meta actually tried to create their own operating system, but they failed — basically saying this is really difficult, and it usually takes five to ten years to build an operating system from scratch anyway. So it's a long, hard road, and Meta decided to go back to Android. Apple, meanwhile, has been able to create their own operating systems across all these different devices and platforms. And so, with so many tightly integrated aspects of their vertically integrated system, and all these frameworks across all these devices with this full range of developers, it's going to be a lot easier for them to add spatial components than it is to have people learn an entirely new pipeline. So people coming from the XR industry into the Apple ecosystem may regret having to learn a new pipeline — or maybe they just end up using Unity and call it a day once they get their app integrated with the exports coming out of Unity.
But yeah, this is, I guess, an opportunity for folks to get a little closer to the metal as they design their applications. I'd love to hear any other reflections on this robust ecosystem, the variety of platforms, and the operating system implications of these tightly coupled vertical integrations.

[00:31:59.879] Sarah Hill: Sure, the vertically integrated ecosystem is right — not just the App Store, but the Apple Watch. In talking with some of the engineers after the keynote, that's what they're really excited about: the ability to integrate everything. We were asking, will it have a separate companion app? With some VR headsets right now, you have to have a separate app, right? That's how you download and navigate content. And they've integrated all of this — this will be a completely standalone device. No other external device will be needed to operate it. So that was kind of interesting as well.
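Since hand tracking is the piece Kent singles out as new to ARKit on the Vision Pro, here is a rough sketch of what consuming it might look like, assuming the session-and-provider model Apple described for visionOS ARKit at WWDC23 — the SDK was not yet public at the time of this conversation, so treat the names as provisional:

```swift
import ARKit

// Provisional names based on the visionOS ARKit model described at WWDC23.
let session = ARKitSession()
let handTracking = HandTrackingProvider()

func observeHands() async {
    do {
        // The user must grant hand-tracking authorization at runtime.
        try await session.run([handTracking])
        // Anchor updates arrive as an async sequence.
        for await update in handTracking.anchorUpdates {
            let anchor = update.anchor
            guard anchor.isTracked, let skeleton = anchor.handSkeleton else { continue }
            // Read one joint as an example: the index fingertip.
            let tip = skeleton.joint(.indexFingerTip)
            print("\(anchor.chirality) index tip:", tip.anchorFromJointTransform)
        }
    } catch {
        print("Hand tracking unavailable:", error)
    }
}
```

The async-sequence style mirrors how the other visionOS ARKit providers were described: you run one session with the data providers you need and iterate over anchor updates as they arrive.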

[00:32:38.041] Raven Zachary: Yeah, I mean, they're really patterning the app distribution model after the App Store. So if you think about how an app is distributed now: you test and beta test through a tool called TestFlight, which is Apple's way of doing private beta distribution to internal customers and your audience, and then you promote that and get it approved and reviewed by Apple. There's no reason to say you couldn't build third-party hand controller support as a developer, but whether or not that actually makes it through the approval process for the App Store — we'll have to wait and see what the guidelines are going to be. Apple hasn't even published their visionOS Human Interface Guidelines document yet; I went to look for a copy and it's not online. I'm looking forward to reviewing that and understanding how they look at gestures and eye tracking and other aspects — where we're being asked to accommodate users, and where we're perhaps being asked to avoid doing things differently.

[00:33:35.267] Sarah Hill: One other thing to watch that they mentioned in the Platforms talk was third-party SDK integrations, and some new rules for signatures to ensure that it was actually that developer. So it appears there are some new rules rolling out as they relate to manifests and signatures.

[00:33:59.308] Kent Bye: But yeah, on the updated Human Interface Guidelines — they said they were coming for visionOS. Apple is very strict with their design guidelines. Raven, you've been dealing with Apple for a long time. Maybe you could give a little more context on the history of that, because they have these vertically integrated systems, they've been very deliberate about creating this cohesion, and their design guidelines have been pretty brutal, in the sense that you have to follow them for things to even get approved. As Apple enters this space, they've had pretty strict standards for people to live up to.

[00:34:34.480] Raven Zachary: Sure, and that's mostly for their native SwiftUI or UIKit types of apps. If you're building a Unity experience, which is more game-like, there's a lot of flexibility in how you do custom UI implementation. So I would say if you want to do this natively, using the SwiftUI components and ARKit and these other things, you're probably going to get a greater level of oversight than if you were to build it using Unity, where Apple would treat your app more as a kind of self-contained, game-like environment.

[00:35:09.062] Kent Bye: Yeah. During their Platforms State of the Union, they had a whole section on accessibility, where they listed a couple dozen accessibility features. Because they've been doing a lot of accessibility work on their existing platforms — and they're really industry-leading in a lot of ways — the fact that that's coming into XR is, I feel, really exciting. They mentioned subtitles and Guided Access and Dwell Control, braille support and Pointer Control. Plus, everything's being controlled by your eyes, so it's already going to be an assistive device in some ways. So yeah, that was something I was really paying attention to — noticing how much care they're taking with values like accessibility. I think it's going to be a pretty significant thing as they enter the ecosystem.

[00:35:57.571] Sarah Hill: Yeah, specifically because a good chunk of the user base doesn't have the ability to physically travel — so this is their travel. I think that's absolutely a smart move, in that we are reducing the physical space between us, right? And who needs that? People who lack mobility, or who lack access to traditional modes of transportation, whether from an ability perspective or from a financial perspective.
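For developers wondering what that accessibility work looks like in practice: the same SwiftUI accessibility modifiers that drive VoiceOver and Dwell Control on iOS carry over to SwiftUI views on the headset. A minimal sketch — the immersion slider itself is a hypothetical control, not something Apple showed:

```swift
import SwiftUI

struct ComfortControls: View {
    @State private var immersionLevel = 0.5

    var body: some View {
        // A hypothetical immersion slider, annotated so assistive
        // technologies (VoiceOver, Dwell Control, Switch Control)
        // can describe and operate it.
        Slider(value: $immersionLevel)
            .accessibilityLabel("Immersion level")
            .accessibilityValue("\(Int(immersionLevel * 100)) percent")
            .accessibilityHint("Adjusts how much of the room stays visible")
    }
}
```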

[00:36:28.942] Kent Bye: Yeah. As I was going back through the photos I took for my Twitter thread — if you have an iPhone, then presumably there's going to be very tight integration to go through and look at your photos, your videos, and maybe these new spatial captures and panoramic shots. We've had these VR headsets, but we haven't had them so tightly integrated that you can just pull these things up. There haven't been a lot of apps that even let me look at all of my photos without somehow syncing them to some third-party device. So the fact that you could potentially have that as a seamless integration — I feel like that's an aspect of memories. And I know, Sarah, you've been talking about memories for a long time, but I'd love to hear any reflections on what it means to be able to look at your media and share media in this new way.

[00:37:18.335] Sarah Hill: Yeah, one of the things they showed was a 3D capture of an experience, right? You could then go back in and be inside that birthday party — no external camera involved, just recording it from your headset, then playing it back and being able to share it over and over again. I thought that was a really fascinating way the ecosystem makes it more seamless. And yes, to me, that's the ultimate best use case of virtual reality: to be able to go back into an old memory. We've chatted before — I lost my childhood home to a tornado many years ago. I could hear the creak of the wood floor, the way it sounded when I walked up the stairs. To be able to go back there — what I wouldn't do to be able to go back there. And now it can be assembled with photos, essentially, as well as 3D capture on the device.

[00:38:19.653] Kent Bye: Yeah. Any thoughts on that, Raven?

[00:38:21.895] Raven Zachary: Not on that in particular, but on a point you brought up earlier around integration with phones. I did ask one of the engineers on the team: can you read the text on the phone in front of your face when you have the device on? That has been a problem for me with the Meta Quest Pro — I can't even read the text on my own phone through pass-through if I get a phone call while I'm using it. And the engineer said, yeah, absolutely — he could look at his phone and see a text message just through the pass-through. So that in itself is impressive. Beyond that, of course, integrating so you have all of your messages and content being displayed — that's even more impressive on top of it. But yeah, the pass-through quality is good enough that I can multitask with multiple devices at the same time.

[00:39:08.008] Kent Bye: Yeah, I was at AWE looking at Varjo, and even with the Varjo I could read my phone. And like you said, on a Meta Quest Pro, or the Lenovo headset that was there, I couldn't read my phone's text at all. So it takes a certain level of camera quality, but also resolution fidelity. I'm going to be talking to some XR journalists tomorrow to get their hands-on reports, because apparently, as you were there, you only got to walk by it, not actually try it out. But did you get any feedback from people who may have been able to try it? I'm curious to hear either that, or any other buzz from being on site — what the discussions were about as people were talking about this.

[00:39:47.287] Sarah Hill: Yeah, as far as resolution as it relates to the pass-through camera, they mentioned: don't confuse the resolution of the fully immersive experience with the resolution of what comes up in the pass-through. So apparently there is a difference in pass-through on the Apple device, much as there is on our current devices. You've heard words like world-class, game-changer, this-is-what-we've-been-waiting-for. But to me, the game-changer is the ecosystem — the vertically integrated ecosystem with a built-in distribution channel and a bunch of other devices. It's not just immersive apps, but the ability to build mobile companion 2D apps, in a way, inside an immersive space. I mean, they are really blurring the lines between AR, MR, and VR.

[00:40:46.994] Kent Bye: Yeah. Qualcomm was showing off Dual Render Fusion at AWE, where you're in an augmented reality experience but able to see a secondary experience on your phone. Apple didn't show anything explicitly like that — like using your phone as a controller — but I imagine that could actually be a possibility. I imagine they wouldn't want to couple it too tightly to their existing ecosystem, where you'd only get an interface if you have an Apple Watch or an iPhone, because people may have Android phones or be on PCs. So it'll be interesting to see what the cross-compatibility looks like. Talking to Brandon Jones, he was emphasizing that there's been some legal beef between Apple and the Khronos Group, and so Apple is not going to be supporting any of the OpenXR standards. For anybody who's been creating interoperable peripherals, I don't expect any of them to work with the Apple device. It's going to be a fairly closed ecosystem. As vertically integrated as it is, it's not going to be open to other third-party devices, as far as I can tell — at least in the short term, if they don't support something like OpenXR. It's certainly a device where, when people do get it, I think they'll notice the overall cohesiveness of the experience. It's got this M2 chip, which I guess is basically what's in the new MacBook Air. The fact that Apple is creating their own silicon is significant — and they have this new R1 chip handling all the computer vision and all the sensor inputs, a whole separate chip handling that on its own, while the main processor sounds like it's just the chip out of the laptops. I don't know what the exact specifications are going to be, but I feel like that's going to be a pretty beefy processor in this thing, maybe above and beyond whatever Qualcomm is going to release with their next generation — which is probably going to be announced around Oculus Connect. But yeah, they had no specifications during the talk; every specification we have was reported by folks like Mark Gurman and others from Bloomberg who have reported on specific details. During the actual talk, and after it, they were very light on specifics about what this device's capabilities actually are. But I do feel like the fact that Apple has their own silicon is significant — that's going to be one of the differentiating factors for the kind of power this is going to have.

[00:43:10.229] Raven Zachary: Yeah, speaking of power — Apple also announced a new MacBook Air today with an M2 chip that they said gets 18 hours of usage. The Vision Pro: two hours of usage, with the battery pack in your pocket. That should be an indicator of how power-hungry this new Vision Pro device is. If they have two M2 products on the same day and there's a 9x difference in battery life, those cameras and those screens in the Vision Pro are very power-intensive.

[00:43:43.519] Kent Bye: Yeah. Well, I'd love to get some final thoughts and then we'll start to wrap up — some takeaways from today, after being there on site, immersed in this whole moment. I feel like it's a bit of a turning point. Javier Fadul told me at AWE — I love this — that this is "the end of the beginning," meaning that we've been in the beginning phases of XR and we're entering a new epoch with Apple, metaphorically, entering the chat. As they have this device coming out, it does feel like a bit of a sea change within the XR industry, and it will likely draw other companies, other competitors, to come in once they see Apple has made its statement about what it's doing. I feel like we're entering a new phase, so today does feel like a bit of a turning point. I'd love to hear your reflections on what it was like to be there.

[00:44:34.180] Sarah Hill: It's almost like how, with media now, there's no differentiation between a flat image and a video, right? It's just media. And I think what was announced today is really further blurring that line. It injected great enthusiasm into the creator community — I'm sure you've seen on the message boards a lot of chatter about how these tools can be used. And again, that vertical integration with the ecosystem and existing hardware devices — HealthKit, SharePlay, all of that — is going to make it easier to use. I think that's what a lot of companies and creators are excited about.

[00:45:19.269] Raven Zachary: Yeah, the advice I would give to an AR/VR developer, especially one who hasn't had any experience building and deploying apps on Apple's platforms, is to talk to an Apple developer, because the community is different. There's a lot of negativity out there in communities about Apple, I think, simply because people have never had the experience of working with Apple before. My recommendation to AR/VR developers is to try to understand what the Apple developer community is like and adapt to it — don't expect Apple to adapt to you. This is a trillion-dollar company with a record of successful products, and AR and VR developers need to put ego to the side and figure out how to adapt their great creations for the Apple developer community in a way that fits Apple's wishes, not the AR/VR community's wishes. There's an opportunity here, but it's going to require a mind shift.

[00:46:26.705] Kent Bye: Yeah, I'd love to give each of you a chance to share any ideas for what you would like to do with the Apple Vision Pro and visionOS — whether you'd like to get your hands on a developer kit, or anything you'd like to share about what you're personally looking forward to working on. I know it may be a little too early to share, but I'd love to hear any thoughts on where you want to take this platform.

[00:46:50.903] Sarah Hill: Just supporting our paleomaps — you know, the ability to control the media with your brain patterns and your heart rate, so that you aren't just passively watching it, you're actually feeling it. That's what we're excited about.

[00:47:06.273] Raven Zachary: Yeah, with ARound we're looking at the future of fan engagement, so we're very curious about how this affects what we're doing in that space. I don't want to get into any details specific to the Vision Pro on this call, but in general, I'm interested in seeing developers push the limits beyond 2D. Apple is telling a transition story at WWDC this week to get iOS developers to move 2D apps into a 3D space. I think there are opportunities here for the AR and VR community to lead by going direct to 3D — showing some amazing future opportunities that aren't just taking existing 2D apps from the iOS community, but actually building new apps for the platform.

[00:47:54.321] Kent Bye: Awesome. I'd love to hear from each of you what you think the ultimate potential of XR, VR, AR, or spatial computing might be, and what it might be able to enable.

[00:48:05.673] Sarah Hill: The ability to see your feelings and interact with them, have a relationship with them, and see them spatialized in a story.

[00:48:16.360] Raven Zachary: Yeah, for me, it's always been about focusing on presence in physical reality, and how we use technology to enhance physical relationships and physical interactions, because that's the human experience — and we're not quite there yet. The potential is going to be something much more the size of sunglasses or a pair of regular glasses that would allow me to augment and enhance my physical daily life using data and information. It's an inevitability; it's just going to take perhaps a longer amount of time than we had all hoped for.

[00:48:50.782] Kent Bye: Awesome. Is there anything else that's been left unsaid that you'd like to say to the broader immersive community?

[00:48:57.367] Sarah Hill: Let's go.

[00:49:01.820] Raven Zachary: Yeah, I think for me — I don't know how many people were physically at Apple Park today, but it was certainly over a thousand. Very enthusiastic folks from around the world who are in the existing Apple ecosystem. If you're not in the Apple ecosystem now, find somebody who is, try to understand it, and apply your excellent AR and VR skills to bringing what you do into the Apple ecosystem.

[00:49:27.968] Kent Bye: Awesome. Well, Raven and Sarah, thanks so much for giving some on-the-ground reporting from the big event today — Apple announcing the Apple Vision Pro and visionOS, and this whole new realm of spatial computing that we're entering with Apple joining the XR industry. I'm excited to see where this all goes, and I hope that some listeners are able to get their hands on the hardware and make some really amazing applications. I also look forward to what each of you is able to produce with this new system. I know that you're in transit and on the move, but thanks for the on-the-ground reporting to satisfy my own personal curiosity and to help unpack what happened today. So thanks for taking the time to join me today.

[00:50:11.734] Sarah Hill: Yeah, thank you. Appreciate your important voice in the community. So keep going, everyone.

[00:50:16.650] Kent Bye: So that was Sarah Hill — she's the CEO of Healium, which makes a meditation application that's available on Quest via App Lab, as well as on some other devices — and Raven Zachary, the chief operating officer at ARound, who do augmented reality experiences for live events. So I have a number of takeaways from this interview. First of all, the thing that came up again and again is the ecosystem. The differentiating factor is that Apple is literally the biggest company in the world. They have people who use their devices all day, all the time, around the world — phones, laptops, watches, iPads — and all these folks have applications they use across all these different contexts. People are already using these applications, so being able to bring some of them into a spatial context and start combining them in different ways matters. I did a whole interview with Pluto VR about how you think about the windows of these 2D apps: once you start bringing them into these spatialized contexts, they're going to stop being just 2D planes and start interacting as 3D objects within a space. How are they going to start to actually interact with each other? That's way down the road, and we're just at the very early beginnings of this. I think Raven was really smart to identify that Apple is really just trying to grow their existing ecosystem — to take all their existing developers and expand onto a new platform — rather than thinking too much about how to bring existing XR developers into their ecosystem, because they're not providing those XR developers with core functionality we rely on, like sophisticated hand-tracked controllers as an input scheme. So there's going to be a whole new wave of innovation around the types of gestures and UI mechanisms for interfacing with this controllerless system that's directed by the eyes. I've actually already had a chance to talk to Ben Lang — at the time of this recording I hadn't chatted with him yet, but now it's the next day as I edit this all together. I reviewed a lot of the different hands-on reviews that have come out, and I added them to my Twitter thread if you want to check out some of those other hands-on impressions. But I'm particularly interested in what the XR folks have to say, because they've had a lot of experience across all these different applications for many, many years, and so they have a time-tested, curated sense for discerning the nuances of these different platforms.
Just to hear from Ben about the integration of the eye tracking: you can use both the gesture interfaces and some voice control, but it's mostly this combination between the eyes and the hands, which gives you a non-fatiguing interface where your hands are just resting in your lap most of the time. On something like HoloLens, you had to make an L with your thumb and forefinger and click very deliberately, but here your hands stay in your lap and the headset's cameras detect the pinch. So this is a whole new type of user interface, and there are going to have to be new affordances to deliver the same functionality that existing 2D applications get from their menu systems and navigation structures. It'll actually be quite interesting to see how robust it is to use just your eyes and these little pinch gestures to replicate what you have with a mouse, clicks, and a keyboard. How productive can you actually be when you get down to it? Are most people going to need to hook up a keyboard and mouse to do real productive work? I guess it depends on what types of applications you're using, but it's a bold statement for Apple to come out with a vision that is a controllerless system.

I think Raven's point stands about getting to know other iOS developers and getting a sense of how to work with, and think about working with, Apple. There are the Unity integrations, so if you want to get something into their ecosystem quickly, you wouldn't necessarily have to go through their existing Human Interface Guidelines, which are getting a whole new set of guidance for visionOS. That's going to be their stake in the ground in terms of design guidelines, and I'm really looking forward to seeing it, because Apple has created a consistent user experience across their entire ecosystem through strictly enforced Human Interface Guidelines that have been a foundational part of their app ecosystem philosophy. Android tends to be a little more fragmented because its guidelines are neither as consistent nor enforced in the same way. So we're going to get a whole new leveling-up of what it means to design these spatial computing applications.

I'll be digging much more into the experiential dimensions with Ben Lang, but I really appreciate both Sarah and Raven coming on to share their first takes. I'm super excited to see where this goes, and I do think it's going to be a key part of leveling up the entire industry. Being at Augmented World Expo, there were a ton of people super excited about what's happening with this headset and eager to get their hands on it, and it's going to bring a lot of other players into the ecosystem: folks like Samsung collaborating with Google and Qualcomm, and maybe Microsoft will make another entry, although they did show early indications that the whole Microsoft suite is going to be available on both the Apple and the Meta Quest headsets.
And so maybe Microsoft, rather than trying to do their own hardware, is just going to focus on bringing their enterprise software solutions onto these spatial computing platforms, whether that's the Apple Vision Pro or the Meta Quest ecosystem with the Quest 3 and the Meta Quest Pro. I guess this is actually my first podcast since the Meta Quest 3 was announced during Augmented World Expo; I was off recording 23-odd interviews at AWE, and that's going to be a whole other deep dive into the intersection between artificial intelligence and XR, with lots of excitement and speculation around the Apple Vision Pro that I'll be digging into. For now, I'm going to get this interview out. I have an interview with Ben Lang, and I may or may not have another interview scheduled tonight to dig a little more into some of the hands-on impressions for the Quest Pro; that's all to be determined. I'll be hopping on a plane tomorrow morning, on Wednesday, to go to Tribeca for a week, where I'll be covering all thirteen of the Tribeca Immersive experiences. There's also an accessibility conference happening in New York City: if you happen to be there on June 15th and 16th, I highly recommend checking out the XR Access Symposium.

On that note, I'm super excited about what was announced with the Apple Vision Pro, because a lot of the accessibility features already built into iOS are coming over to it. They had a whole slide with dozens of different accessibility features that they're already thinking about. That was during the Platforms State of the Union, not in the main keynote, but they are at least emphasizing it to developers as a key feature of the next spatial computing platform and paradigm as we move forward.

By the way, spatial computing is a term that, I believe, Magic Leap was pushing early on, and it's something I've adopted just because I think it's a little more encompassing of what I do here on the Voices of VR, even though it's called the VR podcast. Apple never said the words virtual reality, if you go back and listen. They said augmented reality at the very beginning, but beyond that they really emphasized this concept of spatial computing, which I think is fair and encompasses all of these other things. But there is this antagonism in not mentioning things like virtual reality: they never mentioned the metaverse, and they never even said artificial intelligence; they would very specifically say machine learning. They're very deliberate in the way they use language, but I'd say there's still a little bit of a bias against VR within Apple, even though they now have probably one of the most sophisticated and complicated VR headsets out there today.

I'll be watching to see how things continue to develop for the platform. They have the Unity integration, so I expect to see some innovations happening there. And like I said, without explicit controllers, I'm very curious to see what folks are able to do, and I'll be looking to some of the prototypers at Apple and at Unity, as well as different folks out there who are trying to figure out this next paradigm.
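To give a flavor of what "your eyes are the pointer, a pinch is the click" means for app code, here's a minimal sketch using standard SwiftUI on visionOS, based on my understanding of Apple's session material: apps never see raw gaze data, the system draws a hover highlight on whatever control you're looking at, and an indirect pinch is delivered as an ordinary tap. It also shows the existing iOS accessibility modifiers carrying over, as mentioned above. This is a sketch under those assumptions, not something verified on shipping hardware.

```swift
import SwiftUI

// A minimal sketch of gaze-plus-pinch input on visionOS. The app never sees
// where the user is looking: the system privately resolves gaze, shows a
// hover highlight on the focused control, and translates an indirect pinch
// (hands resting in your lap) into a normal tap on that control.
struct PinchCounterView: View {
    @State private var count = 0

    var body: some View {
        VStack(spacing: 24) {
            Text("Pinched \(count) times")

            Button("Pinch me") {
                count += 1  // fires when the user pinches while gazing at the button
            }
            // Standard controls get gaze hover feedback automatically; the
            // modifier is written out here just to make the behavior visible.
            .hoverEffect(.highlight)
            // The accessibility modifiers developers already use on iOS carry over.
            .accessibilityLabel("Increment counter")
            .accessibilityHint("Pinch to add one to the count")
        }
        .padding(40)
    }
}
```

The notable design choice, as I understand it, is that eye input is private by default: the hover highlight is rendered by the system outside the app's process, so an app only learns which control you were looking at once you actually pinch.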
I do think this combination of eye tracking and hands is a bold move forward. Maybe we've been relying on the crutch of hand-tracked controllers; there are certainly going to be gaming applications that absolutely require those controllers, but we're heading into a whole other paradigm that's exploring interactivity, productivity, and interacting with people, and I think it's going to open up a lot of new use cases and possibilities.

Anyway, that's all that I have for today, and I just wanted to thank you for listening to the Voices of VR podcast. If you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a necessary part of the podcast, and I really do rely upon donations from people like yourself to continue to bring you this coverage. You can become a member and donate today at patreon.com slash Voices of VR. Thanks for listening.
