#1145: Wrap-up of Meta Connect with 4 Immersive Journalists: Meta Quest Pro Impressions & Meta Reality Labs Research Demos

Meta Connect happened this past week on October 11th, where Meta announced the Meta Quest Pro and also released quite a lot of interesting information about their future strategies. I participated in a panel discussion with three other immersive journalists covering this space on the No Proscenium podcast episode #363, hosted by Noah Nelson (@noahjnelson), along with CNET's Scott Stein (@jetscott) and LA Times games & themed entertainment journalist Todd Martens (@toddmartens). This Voices of VR episode is a cross-over rebroadcast of No Proscenium podcast #363, but with some additional context and thoughts that I've added at the beginning and end.

[Be sure to check out the Denver Immersive Gathering Nov 4-6, immersive theater and VRChat world listings at EverythingImmersive.com, and Support the No Proscenium Patreon and Voices of VR Patreon]

I have not had a chance to go hands-on with the Meta Quest Pro or the new controllers yet, and so I scoured the web to try to gather all of the hands-on reviews, new interviews, and announcements that were made and synthesized it all into this Twitter thread here.

Stein did a whole hands-on review of the Meta Quest Pro, but also had a chance to try out a bunch of Reality Labs Research demos on a special invite-only field trip to Redmond, Washington. He shares a lot of his personal impressions, including that the Meta Quest Pro reminds him more of the HoloLens 2 or Magic Leap 2 with its emphasis on mixed reality, using the stereo, color passthrough cameras to bootstrap AR experiences.

At a steep price point of $1,499, the Meta Quest Pro introduces a new enterprise-focused product line with eye tracking, face tracking, and new controllers to complement the consumer Quest 2. Meta also has plans to release a Quest 3 (likely by Meta Connect 2023, as reported by SadlyItsBradley here and here).

Meta CEO Mark Zuckerberg has been emphasizing a commitment to cultivating an open & interoperable XR ecosystem (both in their keynote and in interviews here and here) that is in direct contrast to Apple's more closed & vertically-integrated ecosystem. Meta has not always lived into the full potential of cultivating an open ecosystem, especially in its early Facebook/Oculus era, where they showed much more evidence of wanting to replicate Apple's closed, walled-garden ecosystem, in direct contrast to Valve's more interoperable approach with SteamVR. But they are showing a lot more positive signs of living into their aspirations of an open ecosystem: their latest partnership with Microsoft announced during Meta Connect, their ongoing collaboration with Qualcomm on future generations of XR2 chip design, OpenXR work with the Khronos Group, participation in the Metaverse Standards Forum, and leading-edge work implementing WebXR in their browser, with more progressive web apps showing up.

The limits of that interoperability will likely be reached when enabling interoperability conflicts with their own first-party app aspirations, as they've shown in the past with examples such as BigScreen VR for watching movies, Virtual Desktop for game streaming, and YUR Fit for fitness tracking.

Meta still has a disproportionate amount of power in deciding which apps are on their Quest app store (around 440 of them at my last count), and which of the more than 2,000 apps are relegated to the App Lab store, where they are hidden from official search results.

So I'll be paying close attention to whether or not Meta allows other systems to have interoperable avatars, or if they only allow their own avatars to be interoperable.

But be sure to tune into this podcast to get four different perspectives from immersive critics on all of the major Meta Connect announcements, the pivot back into enterprise that Meta is aspiring to make, impressions of the latest hardware, and their latest strategies for moving us into the Metaverse as we approach the one-year anniversary of Facebook's Meta rebrand.

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. So this past week on Tuesday, October 11th, 2022 was Meta's annual conference, Meta Connect, where they announced the Meta Quest Pro. In the era of the pandemic, they have not had their physical gathering. As a journalist, I was not able to have any hands-on with the Meta Quest Pro. There were some journalists that got sent to different locations to get some hands-on, and there was also a smaller group of journalists that were sent up to Redmond, Washington to go to Meta Reality Labs Research to be able to get both some demos of the Meta Quest Pro, but also to see some of the forward-looking frontier research for the future of spatial computing. Because I didn't have any hands-on experience, I was devouring all the available news media, watched all the different reviews, dug through all the press releases and watched the videos, and just was scouring as much information as I could. I did a whole Twitter thread where I tried to point people to all the different information, because there's essentially a giant dump of new information that happens each year during Meta Connect, and it's an opportunity to get a sense of what the latest strategy for Meta is, what they're thinking, where they're at, where they're going. You end up seeing a lot of new information, and so I've just been really digesting that information and trying to get a sense of where things are going. I try not to attach to any specific narrative as to what's happening and what's unfolding. I take a little bit more of an empirical approach of just watching what is happening and what is going to be emerging. So the Meta Quest Pro is going to be launching on October 25th, and that's when there are going to be hands-on reviews. I'm waiting, honestly, to see what are some of the different use cases that are going to make it interesting for me. At this point, I'm not even going to buy it. It's $1,500 and it's out of my price range. Yeah, I guess that's a good segue to: if you'd like to support me and the podcast, please do go to patreon.com slash Voices of VR. I'd love to get to a much more sustainable level to really not even have to think twice about getting things like this. I think I'm at the point now where I really want to see some hands-on reviews and just get a sense of what the use cases are. In terms of how good the resolution is to be able to do on-screen work: is it at a higher resolution? It's a slight bump in terms of different optics and different clarity, and the text, according to some of the people that were using it, was easier to read. But is it easier to, say, sit in front of a computer for many hours in a day and to do work sessions? And there's also the battery life issue. It's only around one to two hours, which is not necessarily a great amount of time. But if it's tethered, is it comfortable enough to be working as a tethered experience, and are you able to have more extended work sessions in that way? I think the face and emotion tracking is going to bring some new capabilities for social communication. A big reason why I don't do a lot of podcasts in social VR is because there's a lack of emotional expression, a lack of eye contact. You kind of get a lot of that deadness, and it doesn't feel as real. I'll be really curious as things go forward.
Is the social connection that you get from social presence, does having emotional expressions on top of the eye tracking, does that take it up to the next level for social presence, opening up all these new realms for remote work or allowing people to feel like they actually have this sense of co-presence with other people in a social dimension? Some of those things that I'm seeing more in these enterprise plays, as Meta is pivoting into these more open ecosystems and collaborating with Microsoft: is that something that is going to be really driving the success in the future of their company by having this dual track of the consumer Quest and then the Quest Pro line that is going to be much more on the enterprise track? That allows them to get out new technologies and prototype and push forward the overall industry in terms of where it's going in the future. So I had a chance to kind of digest and unpack it with a number of other immersive critics, one of which was Scott Stein from CNET, who actually had a chance to get some hands-on with the Meta Quest Pro and to go into Meta Reality Labs and get all these demos that he has the opportunity to share in this conversation, as well as Todd Martens, who is a journalist for the LA Times covering games and themed entertainment. And this whole conversation was actually facilitated and moderated by No Proscenium's Noah Nelson. The No Pro podcast follows immersive theater and a lot of the storytelling applications at this cross-section of immersive theater and these immersive technologies. And so I highly recommend going to check out the No Pro podcast, where Noah is moderating this conversation, but I also recommend listening to the introduction that he has there, because he's talking about a gathering that he's having, the Denver Immersive Gathering from November 4th to 6th. Also, he has Everything Immersive, which, if you're a fan of VRChat... I know Jen Davis-Wilson and some other folks in the VRChat community have been doing these monthly updates in terms of, like, the best new worlds that are in VRChat, so definitely check out Everything Immersive for some world-hopping recommendations from folks like Jen Davis-Wilson, a.k.a. Fiona, and others within the VRChat community. And Noah has his own Patreon for the No Proscenium podcast, so definitely support him over there at patreon.com slash No Proscenium. And like I said, I'm also Patreon supported, and I'd love to get my Patreon support up another level. I've been thinking a lot about doing some more in-depth breakdowns of different immersive experiences, and I'm over at patreon.com slash Voices of VR. So that's what we're covering on today's episode of the Voices of VR podcast, which is actually a rebroadcast of the No Proscenium podcast. So this interview with Todd Martens, Scott Stein, and myself, moderated by Noah Nelson, happened on Thursday, October 13th, 2022. So with that, let's go ahead and dive right in.

[00:05:50.015] Noah Nelson: As you know from the top of the show, this week is all about MetaConnect. Specifically, we're going to be diving deep on what was announced at the keynote. And to do that, we've brought together some of my favorite journalists in the space to share their notes, both from the keynote itself and from what they've been seeing behind the scenes. Some of them have gotten to play with some of this stuff, which is very exciting. Joining us today are

[00:06:18.736] Scott Stein: Scott Stein, CNET.

[00:06:20.938] Kent Bye: My name is Kent Bye. I do the Voices of VR podcast.

[00:06:24.341] Todd Martens: And I'm Todd Martens. I write about video games and interactive entertainment for the Los Angeles Times.

[00:06:29.623] Noah Nelson: Alright, we're going to follow sort of the arc of Mark Zuckerberg's keynote, because why not? And the first thing I wanted to note, and I think this is probably why I'm following the arc of the keynote, was we saw at the beginning of this event, both the keynote and by extension the whole event, that there was a little title card at the very beginning that was basically saying, hey, we're going to talk about stuff that are not wild projections, but it's speculative. So, if perchance you're a shareholder, don't make any bets on this. That's the subtext of what they were doing. And I thought that was really interesting, because it sort of framed everything that was to come as, this was a presentation for shareholders, as opposed to being, this is a presentation for consumers. And yet, of course, because it's all being done in public, because it's being streamed, a lot of it sort of hit everyone as a consumer show. That's how people tend to interpret it. Not everyone's bringing their stockholder game. But it really colored how I was viewing everything. And I don't know, did it do the same for you guys? Before we crack into the details, what was your overall impression of this MetaConnect this time out? Scott, let's kick it off with you.

[00:07:44.100] Scott Stein: Yeah, well, it's weird because I got to do a field trip where I got to see some things. And that was interesting. Then when I actually watched the presentation, it felt very dull. And I thought, oh, based on what I saw, I wouldn't have told the story this way. But I don't know if part of it was a reaction. Last year, when the metaverse goals were unveiled more, there was so much pie-in-the-sky animated stuff. What I took away was there were a lot of human beings this time. It was a lot of full-body people walking down hallways. It felt very deliberate to me. For whatever reason, maybe they wanted to get more real, kind of show themselves, but it felt like a total flip. I think it went on too long in that regard, where it kind of deadened the feel of it.

[00:08:39.965] Noah Nelson: Todd, how about you? What was your vibe off of it?

[00:08:42.673] Todd Martens: Yeah. I mean, I think I sort of struggled trying to figure out who exactly they wanted the audience to be for this, because it wasn't a very heavy sort of consumer play. They did start out with games, but very quickly sort of moved on from games and sort of made the pitch to businesses, while giving us a little bit of a glimpse of some future tech that I would love to hear those who have demoed it talk about. But I think, just from somebody who covers interactive entertainment, you know, if you were looking to the Meta Quest as a games device, an entertainment device, there wasn't a lot for you to sort of chew on here.

[00:09:18.095] Noah Nelson: We're going to drill down into that just a little bit in a second. But before we do that, Kent, what was your general vibe off of the keynote?

[00:09:26.059] Kent Bye: Well, this is my ninth event, starting with the Oculus Connect. And so this has kind of evolved to now the Meta Connect. And so this year was the most enterprise-focused of all the previous Connects that I've seen so far. And, you know, there's Simon Wardley. He talks about this evolution through different phases where there's a genesis of an idea. Then you go into a custom bespoke enterprise application, and then you go into the consumer market, and then it gets into mass ubiquity. But in some sense, Meta has kind of skipped over the enterprise section for many, many, many years. They've ignored it, and they had something for a while, then they actually killed off their Quest for Business. So they've kind of been trying to do this pure skip straight to the consumer play by dumping billions of dollars into creating the market. And I think I see them taking a step back and trying to actually shore up their business and to actually engage with enterprise business. My impression is that they've kind of failed in their previous iterations. And now they're humbled in a way of trying to take a more open approach by perhaps collaborating with other big partners like Microsoft, which was announced. So that was my take: they're maybe taking a step back, and their stock price is getting crushed. Maybe they're trying to shore up their business by not trying to skip straight to the consumer market, but actually have a viable enterprise market. And I think I have questions over how well they're going to do with that based upon their past behaviors, but I think it's yet to be seen.

[00:10:48.491] Noah Nelson: I'm on a similar page to you there. I really like this idea of: they tried to leap forward, now they're stepping back. They're do-si-doing a lot here. Let's follow the arc of this, and let's start with games. Todd, I'm going to let you take point here. I was sort of shocked at how little they showed in games, because over the past few years, we've kind of been conditioned to expect some significant announcements as part of this particular show. What was your take?

[00:11:21.170] Todd Martens: Yeah, I was definitely surprised. I mean, especially since we've heard that some of these games and some of these apps have been doing pretty well. I was kind of expecting a little bit more of a lean in on that area. I think most people in sort of the general consumer audience consider the Quest, you know, a gaming device, primarily. So I was expecting, if not games, even a bigger presence on health and fitness, which they did touch on too, very briefly. And that's honestly primarily what I use my Quest for these days, more than games even. So it was a very kind of quick overview of, like, here's a couple of things coming, here's a couple of updates to some things that you already know about. They announced the release date for Among Us VR, I believe that was early November, November 10th, I think they said. And I think that looks like a very fun experience, sort of being at eye level in that cartoonish, light, sort of inviting world. I'm looking forward to that. I'm looking forward to competing in that. I think it'll be an approachable sort of game for a lot of people. And then they also touched on a little bit

[00:12:26.018] Noah Nelson: of Iron Man VR coming to the Quest. I think that was what they led with, but that game has already been out for the PlayStation VR. Yeah, that's a port, right? A port, correct. Like, it felt like the more significant thing was them saying, like, hey, we bought Camouflaj, correct, by the way? Right, yes.

[00:12:43.134] Todd Martens: So yeah, the announcement basically was that they acquired this studio and another team. So yeah, but not necessarily hinting at anything they're working on or looking forward to. The Iron Man experience, it was well done. I haven't used my PlayStation VR in a long time, since I upgraded to the PS5. So that was one thing they announced. And then we got some updates to some other stuff. A quick, very teaser-y look at a game called Behemoth from the Walking Dead: Saints & Sinners team, I believe, but not really any information on that. Just sort of a mysterious, misty, foresty sort of game. And I think that's sort of a survival sort of VR game, was my impression. But yeah, very little. I mean, that was kind of it, unless somebody thought there was a big game preview that I'm missing, beyond the announcement later that we'll get to, that they'll be bringing Xbox cloud gaming to the Meta sort of world.

[00:13:37.796] Noah Nelson: Yeah, I felt, when it comes to gaming as a whole, as a gamer, that the most significant thing that they announced was like, hey, if you're an Xbox Cloud person, congratulations. Although I'll give it a try, I don't know if that will ever be the primary way that I play my Xbox stuff, for a variety of reasons. It was sort of just a dull checkmark of, like, oh, and here's another device that Xbox Cloud is on. That feels like a more significant thing for Microsoft than it does for Meta.

[00:14:12.524] Kent Bye: I was just going to say that they did mention that Population: One was going to have some user-generated maps, and they did at the top say that the multiplayer social apps were the top apps, and they actually gave a direct shout-out to VRChat, which I was surprised by, because they don't usually talk about VRChat or Rec Room too much. They really try to focus on their own social metaverse apps. But they did say that the most popular apps were the multiplayer apps. And so they also gave a shout-out to YouTube VR as having more multiplayer capabilities coming in the future.

[00:14:42.012] Noah Nelson: There's definitely some positioning there going on, but the whole thing felt, to me, because they were talking about how much money was being made, I was more in a mode of, like, this feels like a WWDC Apple event and not an E3 Gamescom consumer-focused event. This didn't feel like a product launch. This felt like, we're going to talk to developers now about why you should be on our platform. And, I felt like they just didn't set up what this event was really going to be for everybody. So, we all kind of came in going like, where's the new stuff? And it was much more future-focused.

[00:15:19.368] Scott Stein: I think to that point, like you said about speaking to developers – and I'm not a developer, so developers correct me if I'm wrong on this – but I expected maybe they would talk more about new tools for developers in gaming. I thought maybe, is there something new and interesting that games would be able to do, especially if they're aiming for more social? In that sense, too, I felt disappointed. I felt it would have been better off maybe not to announce games at this at all, because I think by announcing them and having them feel lackluster, I was more aware of them. Because there are regularly interesting games coming out on Quest 2. All the time. Yeah, but this made me feel like, oh, wow, that wasn't interesting. Maybe just don't talk about it. I'd rather hear about what they're doing to make future games more meaningfully different rather than repeating forms and genres.

[00:16:07.346] Noah Nelson: We'll move on from games in a second to speed bump on fitness, but do you guys feel like by having it be lackluster, by also not punting, by not saying, we're going to have a games event in November or December or January, come back around in January, we're going to have a much more robust event as part of this keynote, that they've opened a door for Sony, who have put a lot of headsets into people's hands over the years, to maybe start running with the crown?

[00:16:37.516] Kent Bye: They have had other gaming-focused events where they do all their gaming announcements, where they've announced things like Grand Theft Auto, as an example, that wasn't mentioned here. So they actually do have a separate event where they focus more on the games than they did in this event.

[00:16:52.182] Noah Nelson: But they didn't billboard that here, which, from a messaging standpoint, from a marketing standpoint, I thought was weird, to not say, oh, hey, we'll be back around. You know, put a pin in it. Here's a couple of notes, but we'll come back around in a few weeks. I mean, I think even last year, expressly last year, correct me if I'm wrong, in the keynote they showed a few things, but then said, hey, we're going to have a games event later in November. And it just feels like, with Sony's headset coming for PS5, it just feels like a missed opportunity.

[00:17:24.319] Scott Stein: I felt pretty surprised, too. I got to try PlayStation VR2 as well in September. And I was surprised when I demoed that, for all that I really liked it, that the games that I got to demo were mostly ports or updates. And it's interesting, because either there's a chess game going on here, or maybe getting the ball rolling for new content is just being a little strange right now. Because I would expect there are great opportunities for them to push the platform, a lot more graphics, and more, we can get into this later, but more consistent use of eye tracking, because it's interesting what Meta had said about eye tracking on the Quest Pro. But I think that, yeah, I do feel like there's a big opportunity, but then I guess it's a matter of what developers find worthwhile, with the amount of headsets that the Quest has versus the PlayStation base, but how much of it is worth bringing into VR? I don't know. That's an interesting question.

[00:18:19.639] Noah Nelson: All right, let's slide over real quick to fitness. They did a fitness bump. Fitness has been a big thing on the Quest. I admit, I'm not doing Supernatural as much as I should be, meaning I don't think I've done it in about four weeks, and I feel bad about that. But they did announce that knee strikes are coming, so I'm excited about that. I believe that acquisition's still held up by the FTC, or the SEC, or whichever one of those covers that. They don't own Within yet, if memory serves, but they're on their way to maybe buying it. How did the fitness section strike any of you in particular? Was there anything that stood out? I got a note, but I'll leave it to you guys first.

[00:18:59.682] Kent Bye: Well, they have the Active Pack, which is like a peripheral that they're going to be selling. And for me, the fact that they have knee strikes means that there could be the first viable way of getting full-body tracking in a way that is not occluded. And so I think there's a larger story there in terms of the AI and the prediction, for how some of this stuff with the knee strikes is going to tie into their more full-body tracking as they move forward.

[00:19:23.107] Todd Martens: I was mainly wondering about the knee strikes, if that was something I was going to be able to do, because they didn't really say how that was working and the tech behind it, and so whether that was going to require another peripheral or if that was something that was just going to work in the headset by sort of guessing our own sort of movement.

[00:19:41.584] Kent Bye: There's some AI papers that are out there from Reality Labs Research, which basically show them being able to do full-body pose estimation with just the cameras on the Quest. And so I think the AI part of that is on the way. It's just doing it in a way that is able to kind of match people. They said during the presentation that it's going to be coming first to Horizon Worlds, but it's probably actually coming first to some of these different fitness apps to bring in more full-body interactions, which I think is probably going to be one of the more compelling use cases at the beginning, and then eventually they'll try to do some higher-end stuff once some of that AI research that's been happening matures to the point where it's able to be deployed as a product.

[00:20:24.468] Noah Nelson: It's going to be interesting to watch people figure out like which way they need to tilt their head in order to simulate a knee strike. You know, someone will figure it out real quick, I imagine.

[00:20:35.526] Scott Stein: I was expecting a lot more in this particular area, because it's just been such a big part of what Meta has had success on with this. I think it's been a very unique landscape. People have really gotten into the idea of fitness on this, but not everybody. I think it needs work. I think the integration with Apple Health was super interesting earlier this year. I expected more of a Fitbit approach. I thought, are they going to really run with this? What else can they do? What else can they enable, thinking of it as a fitness and health platform, whether or not people want that, how they can work other partners into that, other types of data? I don't know what the answer is to why. It just seemed more like, yes, we do fitness and health. It was not much more. I mean, the FitPack felt just like something you could already get, but they're making it.

[00:21:29.635] Noah Nelson: Hey, you know PowerA makes this, and now we're going to make it too. Right. It's like, got to love it when you're following PowerA. Nothing against PowerA. You guys do solid groundbreaking work all the time.

[00:21:44.152] Scott Stein: That's true. And then, I don't know, extending to, like, comments that Mark Zuckerberg has made over the past year to CNET about fitness and health.

[00:21:53.781] Noah Nelson: We didn't get a fencing demo. We didn't get a fencing demo. All that hype two weeks ago, and we didn't get a fencing demo.

[00:21:58.925] Scott Stein: I didn't get a fencing demo either. No, no fencing. I didn't see any fencing. And I just thought there'd be ways that new sensors or other things or some sort of a tracker. I know there's no word of a smartwatch, but there's nothing even like Vive trackers. There's nothing like that. Pico even announced that they were going to be making these tracker accessories for fitness. That's more interesting. So I really felt like, why isn't that here?

[00:22:22.709] Noah Nelson: I mean, maybe some of it has to do with the fact that they haven't closed the acquisition on Within yet, for Supernatural. I figure once they've done that, Supernatural is so good that having that be the flagship... I mean, it was already sort of in there as the flagship in this section of the event, but I thought it was also weird that you had something like Creed: Rise to Glory, which was a launch title with Quest 1, being centered at times in the sizzle reel. It made the platform feel musty to me. Like, why are we showing off something from... I mean, I haven't been keeping up with what's been being added to Creed, maybe I'm wrong, but that just felt really weird to me, that of all the things to show off, it was, hey, here's a game from, what is it, four years ago now, and it's still important to this whole fitness thing we've got going on. It was just, it was odd.

[00:23:16.480] Scott Stein: One confusing thing to me, and this is just me speculating, is that Beat Saber is such a key universal thing for them, and they keep going to it. They showed the mixed reality demo. I'm still surprised they haven't built off of Beat Saber and added more fitness modes and other things to that. That seems like the platform to build off of. I know they're acquiring lots of other companies, but they could also very well be building out Beat Saber to be doing that, making that their franchise for a lot of different things. It's, again, that question mark I have of: you have that market, you have people who are using that beyond new music packs. How about thinking outside the box with that a little bit?

[00:23:54.740] Noah Nelson: Let's close off fitness here and open up what, for me, was kind of surprising in certain dimensions, which was the productivity chunk. We know Meta's been chasing VR for work. They're really excited about it. I don't know how excited I am about it, but they're really excited about it. But it never felt as serious as when the CEO of Microsoft showed up. Like, for me, the entire keynote turned on a dime when Satya showed up, because I was like, oh, this is real now. Like, when they started talking about enterprise solutions and locking down the devices, I was like, oh, this is the kind of thing that you see when a company gets very, very serious about pursuing enterprise. Kent, what was your vibe on this whole section? And do you think people will be doing a lot of Excel in VR in the near future?

[00:24:46.078] Kent Bye: Well, I was surprised to see Satya from Microsoft there. And I think it's also a bit of a welcome move, because Boz did a bit of a Q&A after the keynote and someone asked about enterprise, and one of the things he said is that they need to collaborate more with independent software vendors, the ISVs. And I feel like the thing that was confusing to me is that, first of all, they started late on enterprise, just historically, and then they had it and then they killed it off, because a lot of their focus pivoted towards, like, privacy policies that are not really all that great for enterprise. And so you end up seeing that, like, no one that's using VR for medical use cases, for example, is using a Quest, because it's just not HIPAA compliant. Very few education applications, because it's not FERPA compliant. So then all of a sudden they're announcing all this stuff, but they haven't even really launched their full relaunch of their Quest for Business section. So I think they said that's coming, but, you know, they're kind of launching this Meta Quest Pro in the absence of having all these other things in place. But, you know, one of the other things that Zuckerberg said, both at the end of this, where he named moving more towards this open ecosystem as one of their key values, but also in the interview that he did with Alex Heath, was this question as to whether or not open or closed ecosystems are going to win. And in the interview with Alex Heath, he said that with the mobile market, you can see that iOS is a clear winner, and it's a pretty closed platform, and that Zuckerberg wants to have an open platform. And so he's saying a lot of the rhetoric of wanting to have an open platform. But the thing that I keep coming back to is that, you know, there's kind of this conflict of interest that Meta constantly has between cultivating an open ecosystem and developing their own first-party applications. So they'll always be sort of talking about the multiplayer social apps, and then they'll always talk about Horizon Worlds. Or they'll be talking about all these sort of open platforms that they're trying to cultivate for all these enterprise platforms, and then they'll say, okay, here's Horizon Workrooms that they are working on. So there's this tension between, like, Meta wanting to own and control their first-party apps, but also enabling the third-party applications by having open ecosystems. So this collaboration with Microsoft is interesting, because it is a bit of a conceding of some of that power over to other entities, like maybe Accenture taking the lead on some of these things. And hopefully they'll live into that.

[00:27:00.827] Noah Nelson: Or even there's a partnership with Zoom, right? And you'll be able to have your avatar in Zoom, right? It's almost like, are avatars everywhere? There feels like a core play there of Facebook being like, oh, Facebook is how you manifest your digital identity in all spaces online, which kind of runs alongside login, right? Your avatar is your login, is your Facebook, right?

[00:27:25.452] Kent Bye: Yeah, it's things like that. Like, do they want to try to own the avatar platform? It's a little unclear as to what their strategy is. And maybe that's part of my takeaway, that it's really kind of a muddled strategy. And that's why it's unclear, because they aren't even sure themselves what their strategy is. Like, if they were to really have this as an enterprise launch, I would expect it to have, like, more stuff ready to actually launch rather than saying, oh, we're going to be launching this next year. You know, our Quest for Business is going to be launching next year, but we're announcing all these partnerships right now.

[00:27:52.231] Noah Nelson: Scott, you got to see a lot on your trip. From what you saw, does the strategy feel as muddled as what Kent was reading from the keynote and everything after?

[00:28:03.298] Scott Stein: Well, it definitely seems like, you know, when I got to try the Quest Pro, the overwhelming feeling I got was more akin to HoloLens than VR. And I think it's very much a sign of how the demos were constructed, but also where I think this product is aiming. It seems very much headed for the HoloLens and Magic Leap landscape, which is a landscape of AR headsets that nearly nobody has tried, compared to where people have used VR. And possibly, is it a more achievable landscape for them? Although it's interesting, online there are all sorts of comments going back and forth about the dangers of using passthrough in the field or other things. There are drawbacks. But going back to what you said, I think that then, with Microsoft, connects a lot with me. And Microsoft seems to be moving into a consumer glasses partnership with Qualcomm. Is HoloLens kind of being sunsetted? I have no idea. But Microsoft wanted to be multi-platform with Mesh. They talked about that last year. And so that seems to be very much not a new move for Microsoft, but another shoe dropping. Meta is accepting that and going to what Kent said. I think the challenge is going to be, I think all companies want to do work with all the things that you have in this world, only on blank. I think Apple does that too. This is the great question of open versus closed, which is, it's a yes-and, I think, with all these companies: some things will work until they don't work. Where interfaces... I don't know how to say this, but it's like where interface meets identity is where things are getting weird. Because you used to work on a device and just kind of open a bunch of apps and not think so much about your identity, but you are manifesting in things now. That becomes a part of the interface much more. And I don't think companies have really thought that through as much as it needs to be.

[00:30:10.276] Noah Nelson: I think that's a really great point. Privacy issues are tied up into that. Authenticity issues are tied up into that. Do you have control of this document or not? Is someone impersonating you? We see that on the social apps when someone hacks someone's account. Can we really trust it? Are there verified accounts in the first place? All of these things are tied up together. Todd, I'm going to slide over to you for a second, because I want to get your sense on this whole VR for business, VR for productivity as a whole. I've got a touch of a cynical read on the open versus closed pivot that's happening right now with Meta, which is, I feel less like it's a deep commitment on a value level, and more like, hey, you know what story everyone always tells about Apple versus everybody else? It's open versus closed. We know Apple is sniffing around and has spent a lot of money to enter this market. Whether they do or not is a big question. I couldn't imagine a world where Apple just passes on making a headset after all that R&D, because they're the richest company that's ever been, and they were just like, we don't want to fire up the factories. So they're positioning themselves as the open one, because we know Apple is going to be the closed one, even as Meta has acquired all of these content studios, all of these game studios, to the point where I'm almost like, are there any real heavy-hitter VR game studios that aren't in-house at Meta at this point? There's probably a handful of them. So I don't know if I buy it as a great philosophical quest, and more as a, here's our marketing story about why we're different from Apple, as a preventative measure. Maybe I'm being too cynical. I don't know.

[00:32:07.500] Scott Stein: I think there's something I see in the landscape, and I see companies like Apple doing this a lot, where they'll partner with somebody to do something, and then later on, they'll make it themselves. Oh, classic. Yeah. I wonder, too, with Meta, is that part of it, too? Sometimes you don't have the solution yet, so you're partnering with other companies. Then, once you do have the solution, you're not. Or maybe you trumpet your own. I don't know. I just think it's like... But that goes back to the interplay of... What is it? I agree. I'm similarly cynical. It's like, you have partnerships, but then companies always want to kind of have their own control over it or their own thing. It's always a strange game.

[00:32:49.758] Noah Nelson: Todd, before we bury productivity and get into the Pro, and we're really following the keynote now, because it's like a half hour in and we haven't even talked about what everyone wants to talk about. Congratulations. Uh, do you think we're going to be doing a lot of office work business in the... and I think this really plays into, like, the use case for the Pro as well. Like, is VR the future of work, Todd?

[00:33:12.315] Todd Martens: If it is the future of work, I think it's probably a little further out than even Meta is projecting. I think as a journalist, as a writer, it's probably not the future of my work, but that doesn't necessarily mean there aren't use cases for it and applications for it. I write a lot about theme parks and theme park design, and I could definitely see aspects of that kind of design work and that architectural work and that sort of experiential work. On the flip side of that, I'm sure we'll talk about the price point, but at $1,500, it's not something that I see individual employees buying. It would have to be something that would probably be bought by the company itself to give to its staffers. So I think, at least for a while, it's going to be industries where having a virtual or a 3D model is vital to doing your job. So I don't necessarily see a mass jump to it in the near future.

[00:34:12.545] Noah Nelson: All right. Let's start talking about this hardware, the Quest Pro. Scott, you got hands-on time with it. Tell us a story. How was it? Does it feel good? Is it interesting? Is it fun? What about those little cameras on the controllers? Should people upgrade to those?

[00:34:30.473] Scott Stein: There's a lot. There's a lot to break down. I saw six demos.

[00:34:34.736] Noah Nelson: It's like the Gillette of cameras, too. It's like five. Jump into five. I know, it is.

[00:34:41.102] Scott Stein: Even the controller with the cameras kind of looks like a razor or something when you hold it from the side. It was very set up. To tell you what it was like in that space, I went to the headquarters, or went to Meta Reality Labs, in Redmond, where the research group is, a couple of nondescript buildings. And we got a very controlled tour, which I was trying to talk about. The first part was in a set of a little mini E3 type of thing, where there were six demo spaces that were all set up. And we got run through each one of them. And then I got to look at the headset on a table. But I basically spent time, you know, an hour and a half or so, or two hours, like, in the headset doing those things. And first of all, the fit was, like I said, very HoloLens, and also, I'm used to other ones, but with the battery in the back, it fit over my glasses, so it just kind of... It didn't lift up like a HoloLens, which would have actually been nice. You tighten it, and it went on very easily. It was surprisingly open at the sides, which at first I was like, what's going on there? But it does come with these little side blocker things, and you can buy a larger blocker.

[00:35:46.553] Noah Nelson: That was the main thing. When I first saw a picture of it, I was so confused and concerned by. I was just like, what are they doing? And now it makes sense, and thank goodness for the magnetic light blockers.

[00:35:58.353] Scott Stein: Yes. And I don't know, to that point, how good it is as a VR headset, because I didn't really have a lot of demos that were like that. And that might be telling, and it also might be where they're positioning this. So it's hard to tell until we do the review. But the demos all seemed to capitalize on the light exposure, in that what I thought was really interesting is that you've got this passthrough camera technology, which I've seen on other types of tech. I got to spend time with the Varjo XR-3, which is very hard to set up at home, and it's a super expensive headset. But that had a really high-resolution scan of your outside world with cameras and would put things in it. This achieves a similar type of effect, but with lower-resolution color cameras. So it's better than the Quest 2, but definitely worse than your eyesight. But you kind of adjust to it. And then you drop things in. So it's depth-scanning the space like an AR headset. And I couldn't tell to what effect over a Quest 2, but it definitely seemed better. In a painting VR demo, I walked over and there was an easel in the corner of the room. And you could do this with Quest 2 with passthrough, but it looked better here. Grabbing things and putting them on a wall, so recognizing space structure and things like that. And I think what was interesting is that the field of view, with the exposed area on the side, made it feel like I was lowering a lens. It made it feel more like it was in the space. It kind of created this aura of reality in my peripheral vision. So I thought it was kind of a clever trick that it did.

[00:37:28.051] Noah Nelson: So how does it compare to, say, the feeling of having a HoloLens or a Magic Leap on, where the objects – because there's no occlusion in terms of the actual reality. Things are being hologrammed in front of you, as it were, on the device. But does it feel like the same, or maybe even a fuller version of that?

[00:37:51.103] Scott Stein: I think it's a trade-off. I think you get advantages of one versus the other. The one thing is that you're not seeing the actual real, crisp world around you. But the thing about AR that's always bugged me is that you get this kind of Ghostbusters thing, where it's like the Haunted Mansion, where objects are glowing, right? Not only are they translucent, but they definitely seem to glow.

[00:38:13.277] Noah Nelson: Everything's a Pepper's Ghost. Everything in those things is a Pepper's Ghost.

[00:38:16.847] Scott Stein: Yes, everything's a Pepper's Ghost, and you sort of have to deal with that. It's certainly never going to seem photo real. What's interesting about the pass-through is not only do you get opaque things, but the Magic Leap 2 can do a bit of opacity, but it's kind of like an optical trick that they're doing. But you get opacity, and things feel a little more fluid to me as a result of that. But you're trading off a slightly fuzzier background. But it depends, I guess, on what you're looking for. I guess if you want heads-up stuff in a real environment where you have to do mission-critical stuff, you probably want the Pepper's Ghost hovering and guiding you. But if you're trying to do some 3D modeling work, and you just don't want to trip over things, I would think that the pass-through is a far better way to go.

[00:39:02.174] Noah Nelson: To Todd's point earlier, about doing theme park 3D modeling, I was just thinking, I could see people on the volume where they shoot The Mandalorian wearing one of these, so you see what's going on in the StageCraft setup before they roll it out to the rest of the room, and then they roll it out to the rest of the room and everything matches up. Right, because you don't need mechanical levels of precision, right? Like, that's... Kent, that's some of the chatter that we've seen online, is like, oh, for safety reasons you wouldn't want to use this, right? That's what some people have been saying.

[00:39:38.133] Kent Bye: Yeah, well, when I take a look at it, I didn't have a chance to do hands-on. So I was basically scouring the web to watch all the reviews and read everything, and I got access to the press kit and was reading through it. I think it's worth mentioning that The Information did a report a while back in May where they said this year was going to be the Quest Pro. Next year is going to be the Quest 3. That's the Stinson VR. 2024 is actually going to have a refresh of both of those. You're going to see a Quest Pro 2 and a Quest 4 coming out. And I think coming up in the next couple of months, Qualcomm's going to be announcing their new chips. So the chip that's behind the Quest Pro is not even like a full upgrade. It's like a plus. They kind of made some adjustments so you have a little bit more RAM, but it's not really a full upgrade to the next generation of the XR2 from Qualcomm. Yeah, it's a half step. Yeah, it's like a half step. And so it's not a full upgrade. I have had a chance to try the Lynx R-1, which is one of the mixed reality passthrough headsets where you do have this kind of open periphery where you can see out in your peripheral vision. And my experience was that, like Scott was saying, it does feel a little bit more like the HoloLens or the Magic Leap, where it's an augmented reality. But because they're doing the passthrough, they have a much wider field of view. And if you compare the $1,500 price of the Quest Pro to, like, a HoloLens 2 or a Magic Leap 2, those are like twice the price. They're in the $3,000 range. And so in terms of, like, thinking about the long term of developing augmented reality, Meta's approach is to develop from this passthrough mode, you know, using the VR and the computer vision and AI to do the correcting, because you have a distortion from where the cameras are. Right? So, from my experience, and I don't know what your experience of this was, Scott, when you have the cameras that are offset, then you have to kind of do some corrections so that when you see your hands, it doesn't look completely distorted. Like when I did the Varjo, you have this disembodiment of the proprioception of where you expect your body to be versus what you see your body as. There's a disconnect there because of the offset of where the cameras are.

[00:41:38.059] Noah Nelson: That's a really good question. Scott, did it have that feeling to it? Is there this disconnect between your astral form and your physical form as it were?

[00:41:46.845] Scott Stein: No, it felt fine. I mean, I've used enough of these that I might have accommodations where I'm sort of like, it's like, I think you get used to VR sickness over time, or at least for me. So there are some things I kind of accommodate for, but no, it seemed usable. I don't know how much time I'd want to spend using it. That's the interesting question. Like I used it 15 minutes at a time in each of these demos.

[00:42:06.375] Kent Bye: Well, you can only use it for up to two hours apparently, because the battery life doesn't last for much longer than that. So that's the other big thing.

[00:42:13.448] Scott Stein: Yeah. Up to two hours at best, it sounds like. The one-to-two-hour estimate that they gave makes me wonder when the one kicks in. I would think mixed reality is maybe a part of that. Yeah. Then the battery life on the controllers is maybe four to five hours, although it's still not exactly clear from the conversations that Boz and others had on Twitter.

[00:42:32.655] Noah Nelson: Yeah. They're being really cagey about that in a way that does not feel good.

[00:42:37.397] Scott Stein: They obviously know the battery life of the thing. I mean, there's a known battery life for these controllers as a product that's being released. I would assume it is four to five hours. Maybe it's damage control.

[00:42:48.104] Kent Bye: I think... yeah, I don't know. I think there's a part of it, legitimately, like, let's say if you're playing Beat Saber versus if you're doing something that's a lot less intensive. I mean, the controllers are computers now, so they actually have all the computer vision inside of them. So there's less occlusion, but you trade off with having less battery life. Now, what is interesting is... And the haptics too. Yes. And there's new haptics. They are selling those separately, and so if you want to actually try them out, you can buy them for like $300 and use them on the Quest 2. So they are backwards compatible. So I'll be curious to see... like, I'm not at the point where I want to buy a Quest Pro. I think it's a lot of money, and I don't know if it's going to be significant, but I would consider buying the controllers just to have the experience of some of the different applications that are coming out that may be taking full advantage of them. Things like ShapesXR, they're using the stylus to be able to do design stuff. Gravity Sketch was another one that they were showing for design. So there are some upgrades for these new capabilities that are coming in the controllers. But you could potentially get some access to that just by buying the controllers without having to spend $1,500 for the Quest Pro.

[00:43:50.580] Scott Stein: You could, at the price of a Quest 2, yes, practically. You know, the controllers, I liked that they feel compact. The haptics are good. I wouldn't say that they're as good as the PlayStation VR2, but they definitely had, like, a rippling type of sensation. If I was painting, the drawing had kind of a scratchy responsiveness, which was nice. I felt like it was actually helpful for air drawing. They have stylus tips on the bottom now, which is trippy. They already have the ability to do that in Workrooms, to draw with the stylus on the back of the controller. I'm curious how useful that stylus tip will end up being in practice, but it's definitely a sign that they're serious, I think, to some degree, about using these as work tools. There's also a really clever thing that they added on the side. The angled edge is a pressure-sensitive area for pinching. They had this thing about picking up darts and throwing them or squeezing a little stress toy. And the more I squeezed it, I was able to have different impacts. And I found that really useful versus the kind of crab-claw grip for darts in VR that I normally use. The pressure pinch felt a lot more functional. And I thought that was a very interesting stealth thing.

[00:44:55.488] Noah Nelson: Is that down by the bottom, so you'd be holding the controller in the stylus mode?

[00:45:00.390] Scott Stein: Is that- No, you're pinching between the trigger and the angled side. Oh, weird. So it's like you're in a side pinch. It's weird. I don't know how it would feel all the time, but it's intriguing for pinch to pick up.

[00:45:12.555] Noah Nelson: Yeah.

[00:45:13.915] Scott Stein: One weird thing about the cameras, and I have no idea if this will be implemented, but they use a Snapdragon 662. Theoretically, Qualcomm had mentioned that you could use those cameras for mixed reality purposes. I don't think they're doing that at this point, but a chip-enabled peripheral controller with cameras is interesting. Magic Leap 2 has a camera-enabled controller as well. I don't know more specifics on how that one works. But it reminded me of the Magic Leap 2 controller, because I'd seen that earlier this year. But yeah, that's my question, is how useful the controllers will be if you're a Quest 2 user. They do promise more accurate tracking. I feel like the whole package, though, going back to the Quest Pro... it felt to me, and I didn't even get into face tracking, which felt OK, but there weren't a lot of demos to show that off. It felt like this platform is also about Meta needing to advance certain technologies that it doesn't have another way to advance yet. So eye and face tracking and AR, because there's no AR headset yet.

[00:46:19.497] Noah Nelson: Given how much we've watched them iterate on some of the software UI of the Quest platform across Quest 1 and 2 and really advance it, like, we got hands, we got refresh rates up, we got all these things. These devices feel like they evolve as they're out in the world. It makes me wonder if part of their R&D is just the sheer amount of data they're pulling as they perfect their AI models, as they keep on teaching the software how humans behave, and are just going off a larger... I've got one clarifying question, and then I want to check in with Todd about how he's viewing the Quest Pro as a current tool for play, or maybe a potential tool for play. And that's this, Scott: that Qualcomm chip that's inside the controller, can it play Doom?

[00:47:13.974] Scott Stein: Yes. If you break it open like a chocolate bar, it's in there.

[00:47:17.255] Noah Nelson: Okay. So we can play Doom just with the controller alone. All right. Well, if nothing else.

[00:47:22.736] Kent Bye: I wanted to, uh, add just one quick note about the face tracking, which is that part of the reason why it's open underneath is because of the cameras that need to be able to see the face for face tracking. And John Carmack did say during his talk that it will be very much similar to the hand tracking, where they're going to be deploying more software updates, and so it will improve over time. But there is an additional light blocker for $50 as an accessory if you want to have more of an immersive VR experience. But from all the gamers that I saw review it, they're like, this is not a gaming unit. Don't get it for gaming, because you're going to be disappointed. It's only two hours. And just from the controllers and everything, it wasn't really optimized for gaming. Like Scott said, it's much more like an augmented reality prototype development device, and also productivity.

[00:48:05.485] Scott Stein: Plus, if I could add one thing about that, there's also a question of how many gamer apps will even update for it. When you have something that's so specialized and will have a much smaller footprint, unless Meta is bankrolling those apps to be updated, I wouldn't think it's in the interest of any games to really update for Quest Pro.

[00:48:23.395] Kent Bye: Well, they'll be backwards compatible, so you'll be able to play the games. You can play them all. But to take advantage of the new features, yeah, that's another question. The biggest thing that I saw was being able to bring up browsers inline. If you want to open up a browser while you're working, that's the big thing that you can't do on a Quest 2, but you could do on a Quest Pro.

[00:48:41.101] Noah Nelson: I was going to want to get Todd in here. So Todd, from what you've heard and from what you've seen so far, would you concur with the guys and what the reviewers are saying that this doesn't really feel like it's going to be a device for gaming? Or do you see anything in these technologies that they're rolling out that could, in the long run, affect the way we play?

[00:48:58.549] Todd Martens: Yeah, I mean, I would agree with the sentiments that have already been stated. And certainly as somebody who covers gaming, I was like, oh my God, I want this new device. But then, once hearing that there is going to be an update to the Quest next year, at least as planned, that seems like it's going to be geared more toward the consumer pitch, the gaming pitch, with more gaming updates. All that being said, I am incredibly excited for passthrough, light augmented reality gaming experiences and cubic gaming experiences. I think there's a lot of potential in that space. So that was what excited me about the Pro. But I think if you're purely using this for entertainment or gaming, you're better off sitting it out.

[00:49:42.647] Noah Nelson: So one of the things that I've started thinking about when it comes to the role of the Pro in the immersive ecosystem is some of these immersive theater in VR experiences that have been going around the festival circuit for a while, or things like Alien Rescue, which have been open to the public. And I could see a near-term future where the producers of these events, once that face tracking is opened up to other platforms, I think right now, like VRChat, for instance, I think I saw something in your feed, like they can't access it at the moment, but I assume... Yeah, there's early unconfirmed reports that a lot of the features like eye tracking and face tracking will not be available through Oculus Link to PC VR.

[00:50:33.200] Kent Bye: It'll only be available to standalone, which basically means that a lot of the features that are on there in terms of hardware won't even be opened up for VRChat or Neos. That's what's been reported so far, though it's still unconfirmed.

[00:50:43.852] Noah Nelson: Yeah. And if that remains the policy and remains completely true, that feels like cutting off their nose to spite their face. Because in a world where you could have performers using the Pro to get a more fully fleshed out performance in an asymmetric experience, you'd have players using a Quest, no face tracking, but they're interacting with live performers who are getting a broader range of expression, particularly as the avatars improve. We'll open up the avatar discussion now. There's so much potential there for that kind of asymmetric production between game master characters, non-player characters, and player characters to create a more robustly immersive environment, to create more magical experiences and a better sense of play, because you can have face rigging on the characters people encounter in real time. And given what they're doing with some of the near-future stuff they showed when they went up to Abrash's laboratory, there's some real potential there.

[00:51:55.195] Scott Stein: I think it's the most exciting part of this, and I think that's the pro part of it. I think there is a real value for this, like you said, for people who said, I want to use this more for certain things, and hopefully it'll be opened up for that. I thought about performances, certainly. I also thought about the flip side when I was using it. They had a TribeXR demo for this DJ kit where you could then see outside with passthrough. I started to think about, could these be used, maybe not in the factory per se, but could you use it essentially as a stage manager in an immersive theater experience? Could you be using a virtual light board, or could you be using these as live control sets when you're in physical spaces? I just think there's so much stuff to explore with both eye and face tracking and with augmented reality on a headset, and there are so few devices that have been doing that, that I think this could open up a lot of possibilities with the ecosystem that Meta does have. I mean, I think about the fact that Niantic had a partnership with Microsoft to look at Pokemon Go, or Pokemon on HoloLens, because there are just so few AR headsets, or Snapchat's Spectacles that only have like a 15 to 30 minute battery life. So I think anything that you can use to start prototyping and field testing, basically I think of the Quest Pro as a dev kit where people can start building experiences for those future devices to come, maybe not even to build the algorithms necessarily, but also just to get your sea legs and understand what you could do.

[00:53:27.172] Noah Nelson: Speaking of legs, avatars, Kent, I know you're excited to talk about where the avatars are going, both in the near and long term. So take us down a lengthy hallway that we can actually walk through now that we're going to be going to the legs soon enough.

[00:53:46.588] Kent Bye: I think the avatars are one of the areas where you'll see Meta starting to bring some of the technology that they're developing for VR into their other contexts, so for Instagram and for these other kind of Bitmoji-type characters that you'll see on Instagram. What they were saying was basically having this vision of an ability to buy avatar goods that can go across all the different platforms. But again, I think this goes back to this challenge of the stuff they're building internally versus also facilitating open ecosystems. Just as an example with the face tracking and eye tracking, I imagine that's going to be turned on first in Horizon Worlds, which is their own first-party app that they have control over. And they were talking a lot more about privacy. I've been covering privacy a lot, and I'm wondering if there might be some ethics washing happening, in terms of telling a story of privacy when the functional outcome is that only their own first-party apps are going to have access to some of the technology. So I'll be very curious to see if, say, Rec Room or VRChat have access to some of the different eye tracking or facial tracking technologies, or if they'll try to say, well, it's going to be locked down because of various privacy concerns. But just in general, a lot of the stuff they're saying around avatars is, on the one hand, the forward-looking future of the Codec Avatars, which are super photorealistic and kind of creepy and scary to some degree. And then on the other end is where we're at now, which is more of a cartoon-stylized, Pixar-style avatar with these high-fidelity emotional expressions and eye tracking. So it's kind of like they're giving us the far future, but also where we're at right now. And I just want to point out that the avatar area is the one area where Meta is going to be able to potentially leverage some of this work across some of the other platforms that they already have.

[00:55:28.361] Noah Nelson: Yeah. Let's dive into Codec Avatars for a second, but Scott, you had something.

[00:55:31.983] Scott Stein: Yeah, I was going to say that they are opening up to other apps. The way they talked about it was that it's off by default on the Quest Pro, which is interesting; you turn it on for each app that asks permission. But they did kind of punt when they made a big statement about everything being encrypted on device and then data being erased over time. We asked questions about, what about an enterprise, or some other app, that wants to tap into those permissions? And I guess it's similar to data from any sort of tracker you might have: how would that be used? It's a little unclear. Even if it is encrypted on device, what other things could pass through, or how could it be utilized? The other interesting thing I want to note about the eye and face tracking, which surprised me, is where they did seem to stop, maybe not on the privacy part, but on the effectiveness of it. There's a lot of foveated rendering going on with the PlayStation VR 2, on almost every game I demoed, but no foveated rendering on the Quest Pro. And per Zuckerberg or Boz and others, they said that it was kind of not the slam dunk you think it would be, because of the trade-offs in battery life and performance, which makes me wonder, would that only be turned on when it's tethered? And that goes back to your question about Link. Why not optimize those eye tracking features when you have Link or more powerful tethering capabilities? Because it sounds like those may be few and farther between than we might be expecting.

[00:56:53.067] Noah Nelson: There's a little bit where you feel like we keep on hitting up against the physics problems here, just that balance of energy use, heat, power, weight, all of that stuff. These are much bigger boulders to push than everyone wants them to be. And the drive to just get cool stuff onto people's faces is maybe driving everyone a little nutty, because there are so many good experiences you can have; you don't need to be constantly pushing the envelope. But in a world where the major target for your communications is not the consumers but the stock market, and you want to try and drive that stock market up, they only care about potential down the road. It feels like a constant tightrope being walked while an actual market needs to be developed if this stuff is going to truly be profitable.

[00:57:47.792] Scott Stein: I did try those avatars, by the way, the Codec Avatars at Meta's Reality Labs.

[00:57:51.788] Noah Nelson: Who did you get to drive? Did they make one for you or did you get to drive somebody else?

[00:57:55.329] Scott Stein: No, I got to talk across the void to others. So they had a little avatar among some of the future research demos, which are ones they've already announced for the most part, a quartet of demos that I wrote about on the site. There's a video about it, but Codec Avatars 2.0, I got to actually try that out. It was similar to the YouTube video. A guy named Jason at Pittsburgh led the conversation with me. I would equate it to a candlelit, eerily real PlayStation 5 cutscene type of thing. It's very compelling, and it started to feel very real, but it also had an element of the uncanny. I got very close and I felt like I was in this intimate conversation, because it's lit in a void is how it feels. You feel like you're in this intimate little closet space with the person. And then I tried Instant Codec Avatars, which was the phone-scan one that they talked about on stage, where you get a phone scan to kind of shortcut it and make it a little more powerful. It looked good, but it's frozen in space; the head being kind of glued to a pedestal is what it felt like. And again, a haunted mansion feel, where the face was very expressive, but it was more uncanny, and if I were to talk to someone like that for a while, I'd go, why isn't your head moving that much? What's going on there? Were you feeling okay?

[00:59:11.291] Noah Nelson: There's always room for one more, Scott.

[00:59:13.371] Scott Stein: Yeah. And then they had that 3D room-scanned full body. It was really a scan of an actor who had been scanned ahead of time, and they just showed some of the digital clothing being draped on that, which looked good, but it was a non-interactive demo.

[00:59:27.342] Kent Bye: I have two quick points I want to make. One is that, again, a lot of this hardware stuff, as long as it's being delivered on the platform, they're going to be able to improve things over time via software. So things that may not be turned on now, they may turn on later, or they may not turn on because of the privacy argument that they're making. And the other thing, just in terms of the privacy, is that by saying that it's encrypted, that's really focused on making sure that people don't have access to the data to know what your identity is. But there's this whole other realm of inferring information. And as far as I can tell, as long as they do real-time inferences, then that's a whole other layer of being able to derive utility and information out of that data from both the eye tracking and the facial tracking, to be able to do real-time sentiment analysis and track what you're paying attention to. Those kinds of inferences are something that's not really covered in privacy law right now. And so some of the arguments they make don't necessarily fully cover the full privacy implications of these new technologies.

[01:00:21.534] Scott Stein: Right. Like a company seeing if you're paying attention, or looking at a heat map of your gaze in a meeting or something.

[01:00:27.823] Noah Nelson: It feels like, aside from the fact that we just need savvier legal models to deal with these issues, that at the end of the day, it's going to be a lot about retention of that information. Like, what can a company hold onto after it's had an interaction with you, and how long can they hold it, or which things are they allowed to hold? And maybe that even spills into reality. If you were to walk into a store, walk into a bank, if I walk into an Amazon Fresh or whatever they're calling it, Amazon Go, and they've got the cameras on me, they can track all that information as well. How long can they hold onto it? Right now, I imagine forever; they can know exactly what I'm interested in and then suddenly start sending me ads. It's like, you were lingering in the Star Wars toy aisle. Of course I was. It's me. Way to go, guys. You already knew that. But it's like, oh, you were really checking out the kumquats today. And it's like, yeah, okay, I'm kumquat curious. We're just about at the end of the hour, and I promised you guys only about an hour of having to deal with me. One final question. It's not about the price per se, but given that this is their big tentpole event for at least the next few months, and this is a major product rollout that they've got going on, do you think they've positioned themselves and the Pro well, in terms of who this is for and where it fits in their overall plan? Or have you seen, and do you feel, that there's more ambiguity and more questions than answers now that this is out in the world? Todd, we'll take it to you, and then Kent and Scott.

[01:02:08.237] Todd Martens: Yeah, I don't want to overstate it, because I haven't had a hands-on experience with it, so I'm just going on the presentation. But I do think they presented it well as a productivity sort of device, a work-oriented device. I think my question is whether that market exists.

[01:02:27.172] Noah Nelson: Yeah, straight up, straight up. Does that market exist? Kent, how about you? Do you feel like they've done what they needed to do this week?

[01:02:35.816] Kent Bye: Well, I feel like it's an open question as to whether or not the Quest Pro is going to find enough utility. I feel like there are a lot of exciting things with eye tracking and facial tracking and the mixed reality modes, and in terms of the future of developing augmented reality devices, this is probably going to be one of the best pieces of hardware out there to do that. In terms of finding a market to solve problems in the enterprise, I think the partnerships with Accenture and Microsoft will probably drive a lot of that, and I don't have much insight into how that's going to go. But I guess in terms of a final thought, the one thing I wanted to point out is that they also showed the CTRL-labs neural interface work, which I think is personally some of the most exciting work in terms of being able to basically control technology just by thinking about an intention to move, being able to isolate down to an individual motor neuron. I had a chance to talk to one of the directors of the neuromotor interfaces, Thomas Reardon from CTRL-labs, and from what he told me, and from the demos they started to show, that's kind of the future of where this is all going: a whole new paradigm where you're just thinking about moving, and that thought is able to be detected by these CTRL-labs EMG devices. That's so mind-blowing. So for me, I see this in the long term of the next five to ten years, and this is just an incremental step. People may not know what that full trajectory is, but looking at some of those technologies shows where things are going. Moving from 2D to 3D is such a huge shift, and this is one incremental step on that larger path. In the short term, how much success they'll find, for me, the battery life is probably the biggest question, but other than that, it'll be up to the market and the use cases to decide. So we'll kind of wait and see.

[01:04:14.586] Noah Nelson: Scott, before giving you the did-they-market-this-right question, did you get to see any of the EMG wristband action while you were on your tours?

[01:04:22.607] Scott Stein: Yeah, several times. So I saw a lot of stuff. You should watch the video, read the article, I've definitely seen it, and I can spend a lot of time talking about it. The one thing I didn't get to do is actually use them myself, and I think that was telling, because all the other demos we did get to experience. I got to try really realistic spatial audio that they had talked about back in 2020. You have the avatars, and a new 3D scanning type of technology, which looked really good. They scanned my shoe, and they also showed some pre-scans; per the presentation, I got to look at the fuzzy teddy bear and the cacti, which looked really good for 3D scanned objects, although they were pre-prepared. And the EMG: when I went to Reality Labs Research, Thomas Reardon led us on the demos. Michael Abrash, I got to speak to. Mark Zuckerberg was there, and Boz. And it started with Mark Zuckerberg demoing the wristband for a few minutes in front of us, and we were sitting there watching him do the version of what he was doing in the presentation. Having talked to Abrash about this before and seen this, it kind of looks like little thumb movements, little pinches or little things, kind of like an air mouse or something. And he was joking about how it almost looked nearly invisible. The other demo that we saw was two researchers who had spent more time practicing with this and were playing the game that they showed. This is what Michael Abrash was showing in the presentation, that they were able to eventually train down and move their hands less and less, so that they were only activating apparently one motor neuron. As we watched them sitting at a table, like a science fair, we saw basically no movement. I tried, not this, but another neural input wristband after one CES, where somebody let me try a little bit of some of the gestures and motions, enough to get, I think, a sense of how these types of inputs work. But it's fascinating. What's interesting to me, and I asked about this as well, is that it seems to be positioned as a whole new interface for everything, ideally. First for smart glasses, and then maybe for VR, maybe for all things in the world, in a timeframe of about five to six years from now, when one will finally start seeing it. And my question was, are we going to be ramping up through other algorithms to other interfaces that we recognize? Or are we learning a whole new thing at that point? Because they talked about co-adaptation. Co-adaptive learning was what Michael Abrash was talking about, the idea that machines and people will coexist. And that's already happening with a lot of other AI; a lot of interfaces, I think, use elements of that to some degree. But how much is this an intuitive process to learn? They made it sound like it learns from the way you learn or use things, and it gets better and knows how you move. But to get to the point where that's intuitive, that's a big leap. That's everything. I need to see that proven, or I need to use it to know that.

[01:07:14.152] Noah Nelson: I started thinking about playing a game where, in time, the game learns how I do a particular combination and then can just read the impulse based off what I'm thinking at my hands. That sounds fascinating and cool, but is it even possible? There's always also the question, does it actually solve any problems? Does it actually make things better, or is it just a cool thing to do? Sometimes with this stuff, it's just a cool thing to do.

[01:07:48.870] Scott Stein: I tried NextMind, which was that one that got acquired by Snap. It sits on the back of your head and looks at kind of coded areas in VR, so it's using your optical recognition, but recognizing where neurons are firing in the back of your head. And these types of technologies are fascinating. I think with the wrist stuff, yeah, I mean, I think about things like the Apple Watch or other watches, and how is this stuff going to start feeling like an interface, and when is Meta going to make a watch that does this? It intuitively makes so much sense for the future of where things are going, but I think there's such a leap to come with how you get from the controllers to that, that I'm really curious to see when that starts.

[01:08:29.536] Kent Bye: Well, Alex Heath did a report in April saying that 2022 was going to be Meta smartwatch one, and then 2023 smartwatch two, and then 2024 was going to be smartwatch three with the EMG XR input. But we're already into 2022 and we obviously didn't see a watch, so that may have slipped for a variety of reasons. They were basically saying that within their third iteration of their smartwatches, what they were showing now is what Alex Heath reported back in April would start to be in a consumer device. But it would take three iterations to get from what they showed at Connect this past week into an actual consumer device. And so it's pretty far out, even if they were to take the optimized path, because they're imagining this as an input for AR. And in order to do that, they have their AR roadmap and then their smartwatch roadmap as well. So, yeah, we'll see.

[01:09:19.085] Noah Nelson: Well, the predictions whiffed, and maybe the structure of the roadmap is still there, but the timing just isn't happening.

[01:09:25.249] Scott Stein: But Scott, to bring it back to... I was going to say, one response to that, too. I think what's interesting is that you think about the arrival of neural inputs as one thing, but I think the path is more of a slow melt. The Quest 2 adopted some AR elements, and you're going to see the Quest Pro have them. The point is, I would imagine by the time you get to such a device, you've kind of already arrived. And I wonder if that'll be the case for them, between other types of ways that AI and other types of inputs could start to approach those types of gestures. I mean, some things like the Apple Watch have some accessibility-related gestures already. It doesn't have neural inputs, but you could get to some of those things, approach them, and then by the time you get to the neural input wristbands, you have a familiarity, but it just becomes better. I feel like that's the way it needs to go.

[01:10:14.856] Noah Nelson: I think you're dead-on right about that. Let's snap to the final question, though. All the future stuff being what it is, and it's all very exciting, but for this Quest Pro and for this year ahead for them, do you think they did what they needed to do this week in terms of making the current lineup a success? Or is this really a looking-to-the-future kind of moment for them?

[01:10:35.268] Scott Stein: feels like a half step. And I think that literally to the XR2+, it's a half step. I think the problem with the presentation is that it's a needed step. I think it's an important step. But for all the people that have Quest 2s, I had people who were asking me, are my kids going to be able to play games on this? Or does it do any other types of games? There's a disconnect. There's an existing platform out there that does not resemble this, the way they talked about it. And so I wonder if in the spring, maybe they need to address Quest 3. Like we just said before, the chipset thing is kind of like a refinement at the end of the line. It feels like there are bigger changes to come. certainly by the time Apple enters the space and then others, it's going to be rapidly moving. And so it'd be interesting to see how they address that, because this is a half-step pro version with big changes, but two years after the Quest 2, and it's an unusual staggering there.

[01:11:32.697] Kent Bye: Yeah, I would just go back to what The Information reported back on May 2nd, 2022: the Quest Pro this year, and then what comes next. Brad Lynch, SadlyItsBradley on YouTube, has been reporting leaks, and he was already estimating next year's Connect as being the Stinson, the Quest 3, and then the next year, 2024, potentially a refresh of both, the Quest Pro 2 and the Quest 4. So we're seeing at this moment a fracturing into the high end, and then going back to the consumer device, and then kind of refreshing both. They're coming at it from two different angles. So we'll see if it is successful, and if it's not, then they'll probably just stick with the consumer end.

[01:12:12.162] Scott Stein: I was going to say, in one presentation they showed us, they showed the two lines continuing and the dots were staggered. I don't know if that's an indication that that's the way the roadmap goes, but I'm curious: when they do the Pro and the consumer version, will it be kind of like the iPad and iPad Pro or something? How are they going to talk about those, and will there be two different events for them?

[01:12:34.042] Noah Nelson: This is the thing at the end of the day, and this is where I think we'll leave it off. I don't think that the strategy is a bad one, to have this pro device aimed at enterprise, aimed at creators, testing out new technology for the super enthusiasts who just have to have the latest and the greatest. But they didn't communicate a reminder that, oh yes, the next consumer headset, the 3, will be out next year, that that is still going to be a thing. The fact that they didn't communicate that very clearly in this week's event shows that, yes, they don't want to cannibalize any of their sales. They don't want to prevent someone from buying a Pro who might otherwise hold off because, oh, I'll wait for the 3. But I think they may have undermined themselves a little bit by not making it very clear who these things are for and differentiating them, because I think it leaves a lot of people confused as to why there aren't more games, and whether this is something to purchase for my kids or for myself if I'm not a developer. That's something they can't keep doing going forward if they want to make both of these devices successful.

[01:13:45.352] Scott Stein: Ben Thompson interviewed Mark Zuckerberg and Satya Nadella, and the story went up yesterday. That was where Zuckerberg actually mentioned the Quest 3 and said that the price is going to be somewhere around $300, $400, $500, which was helpful. I would have liked that on stage. It would have been good. Exactly, because it's helpful going into the holiday season; that's why we're trying to guide people. It's like, you might want to wait, because it does sound like the price is going to be aimed to be similar. That's kind of a subsidized price that they've been getting at that point, which is similar to ByteDance and the Pico 4. That's maybe not the true price of what such a device might cost, but it won't be $1,500 either.

[01:14:29.073] Noah Nelson: Well, gentlemen, thank you all for hopping on the pod today. Always good to talk with you, and I know I'm going to be touching base with you all before the year ends; we'll pull ourselves back together in some configuration to chat some more. Going down the line: Scott, where can folks find you on the Internet?

[01:14:45.138] Scott Stein: You can find me at Jetscott on Twitter and also at CNET.

[01:14:49.419] Kent Bye: Kent, where can they find you? I'm at VoicesofVR.com, where you can find the podcast, as well as at @kentbye on Twitter. And Todd?

[01:14:59.422] Todd Martens: over at the LATimes.com for the game stuff. And then Twitter is just Todd Martins.

[01:15:05.405] Noah Nelson: All right. Once again, thank you all. And always a pleasure to talk with you guys.

[01:15:10.247] Kent Bye: So that was Noah Nelson of the No Proscenium podcast, who was moderating and hosting this conversation. This is actually a rebroadcast of the No Pro podcast, so definitely go check out the No Proscenium podcast and all the great stuff that Noah is doing over there, along with Scott Stein of CNET, Todd Martens from the L.A. Times, and myself from the Voices of VR podcast. As I listen back to this conversation, I notice how I'm more and more resistant to wanting to fix on a narrative as to what is happening and what's going to happen in the future. I've just become a skeptic; I really don't have enough information to know what's going to happen in the future. I have some stories about what's happened in the past, but as things move forward, I think it's really going to be: look to the artists, look to the creators, and look to the other people who are doing the hands-on reviews. What I would love to see is whether it's going to be able to do what I do, which is a lot of editing of podcasts. In terms of coding also, I've been wanting to do more; I just recently relaunched my website and there's a lot of other stuff that I want to do and expand out. The chipset is not a full upgrade, and we're going to be learning more information about the next iteration, the XR2 Gen 2 from Qualcomm, at their next press conference coming up in November. I recommend checking out SadlyItsBradley, Brad Lynch, who's been doing a lot of leaks on lots of different stuff. I actually did an interview with Brad a year ago, and I'll be revisiting some of his predictions. He's evolved quite a bit as someone who's taking a lot of different information and leaks, but also doing deep dives into patents and actually developing a lot of sources within the supply chain and people from within these companies to get a sneak peek as to what's coming. He actually had 17 out of 24 of his predictions correct on the Meta Quest Pro, so he's a hardware analyst who's been really spot on. He had some early indications that the Quest 3 is probably going to be, as far as his sources are telling him, at the next Meta Connect, so a year from now we'll see a refresh of the consumer Quest in time for Christmas, which is what we've been seeing in terms of these hardware releases coming ahead of the holiday season. You tend to get a huge bump for each of these commercial technologies, so expect to see a lot of other software releases and other stuff leading up to the holiday season. And Guy Godin of Virtual Desktop, he's somebody who takes PC streaming and streams it into the headset. There's a new upgrade with the Qualcomm XR2+ chip, which has a little bit more headroom for RAM, about 12 gigabytes, which means Guy said he's going to be able to do a better streaming experience on the Meta Quest Pro. So there actually may be a significant bump when it comes to having a better streaming experience. So definitely look to Virtual Desktop, look to Immersed, which people use to do these virtual screens. Is this going to be good enough to be able to do work? If you do coding or other types of work and you want a lot of screen real estate, then that's one of the questions that I would love to hear from other reviewers. Some of the other things, I guess, are the eye tracking and the face tracking.
I think what Noah is saying is that there are probably going to be a lot of performative aspects, with people wanting to have more expressivity and emotion as they're communicating, especially in the context of these theatrical pieces. I know that Kiira Benzing is going to be doing a production within Horizon Worlds that's going to be showing at Raindance. I'm on the jury for a lot of the different performances that are happening there. And she's been on the frontiers of not only working with High Fidelity, but then VRChat, with one of the first immersive theater pieces that was in VRChat, Pandora X, and now is doing a new piece, Skits and Giggles, that's going to be within Horizon Worlds. So I'll be really curious to see what her production is like, and as we move forward, is she going to be using things like the Meta Quest Pro to be able to use that kind of expressivity and emotion? In terms of social presence, as more and more people are doing remote work and you want to have these meetings but still have this sense of being with people, having that expressivity with emotions and eye tracking is going to be a big bump. The question I have is whether that $1,500 price point is going to be reasonable for people in the enterprise, on top of whatever other deals they have for Quest for Business. I think there's an untold story as to what the heck happened with Quest for Business; they just kind of shuttered it. I think there was this pivot towards Meta wanting to go all in on the consumer and just ignore the enterprise, and I think in the long run that's just going to be a mistake. They got in late; they were kind of slow getting into the enterprise. I think HTC really swept in, and the lighthouse tracking was just more solid for a lot of the different types of enterprise use cases. But now the new controllers have more precise tracking: if you're trying to draw and make stuff, occlusion was a big issue, where it was still a much better experience to have external tracking. So now that the tracking is self-contained, will platforms like Gravity Sketch or ShapesXR be able to do this kind of design work, and are the controllers going to make enough of a difference to make some of those design functions easier? That's going to be a big question as we move forward. And I think the thing that Scott said, and the thing that I come back to, is that the Meta Quest Pro reminds him more of the Magic Leap 2 or the HoloLens 2. And I totally agree, in the sense that this is more or less a developer kit for the next generation of augmented reality applications to do this type of mixed reality. Using the passthrough of VR, it's able to use the technologies available to do the best kind of wide field of view, interactive, hand-tracked experiences, and to really push the edge of what's possible in blending these immersive spatial computing technologies in the context of site-specific locations that are away from your normal room-scale space. It's a standalone device that's going to allow people to be immersed within a spatial context and allow them to do specific types of work. The two-hour max battery life may be a limitation, given the absence of swappable batteries and the like.
And I do think that this is an opportunity to get the next generation of hardware, with HD haptics in the controllers. SadlyItsBradley actually does a deep dive into the internals, probably in way more detail than even Meta is providing themselves in terms of the guts of what's happening within these controllers, to enable more fine-grained haptic interactions. The pinching is one, and the stylus is something that I know ShapesXR was demoing in some early videos as well. And then there's the facial and eye tracking technology. Obviously, as someone who's been tracking a lot of the different privacy issues, I'm skeptical that they're going far enough, because they're still potentially going to be deriving biometrically inferred information from a lot of this data: sentiment analysis and taking that emotionality and tying it into all their AI algorithms that are tracking us from a surveillance capitalism perspective. Meta is in this weird position where they are saying the rhetoric about open platforms. As I listened to this Ben Thompson interview on Stratechery, Mark said that they've always been about the open platform. That's just not true. If you look at the dialectic between how Valve has been trying to cultivate an open ecosystem for a long time, Facebook and Oculus at the time were really trying to shut down a lot of those interoperable aspects. They really wanted to create a closed walled-garden destination; for a long time, their strategy was a total closed-platform strategy. I think over time they've been forced to become more and more open. With their collaborations with Qualcomm, that's certainly an area where a lot of the work that they've been doing in terms of R&D and their collaboration on those XR2 chips, Gen 1 and the future Gen 2, is going to benefit the entire XR industry, aside from Apple, who is designing their own silicon. The collaboration with Qualcomm, and the WebXR work, really does embody that open ecosystem. Being able to pull up a web browser while you're in another application is a thing that you can't do in a Quest 2, but you'll be able to do in a Quest Pro. So that's one of the things: if you want to do design work and multitask, you can bring up a web browser and eventually other progressive web apps, which they're going to be able to potentially overlay within the browser to have other functionality on top of this. And you start to see this future of having these object-oriented modes where you're in a spatial context and you're able to bring in different applications. That's really exciting in terms of the future of work as well, but that's really driven by the progressive web apps and WebXR. So there is a lot of pushing forward of this open metaverse vision. But at the same time, there is this fundamental conflict of interest between Meta being a platform provider cultivating those open ecosystems and Meta developing their own first-party applications. And the real rub, from what I've seen at least, comes in when someone is developing a third-party application that is in competition with some of the things that Meta wants to build themselves. I started to see a little bit of a shift with that during this last keynote, where they were talking a little bit more about VRChat. They don't usually call out their competitors for social VR applications.
They really haven't given Rec Room their due in terms of how big they are. But the social VR applications are the biggest ones. And so, in the context of this conversation, people are wanting more games. But I think things like Among Us are going to be huge in terms of social experiences, as well as just what's happening on the social platforms and other social games like Walkabout Mini Golf and other things that have more social dynamics. So in terms of Meta and their values, they really want to focus on that social connectivity, the human connection, the sense of social presence. But going back to the thing that I'm looking at: this conflict of interest, and at what point their cultivation of that open ecosystem competes against their own first-party applications. Mark Zuckerberg said to the journalists at Meta Reality Labs that there are really four areas they're focusing on. They're focusing on the Quest VR headsets; on AR, which we really didn't talk about too much, and they didn't actually talk about it much at all in their keynote either, the Project Nazare and Hypernova and their more frontier augmented reality. The only thing that they really said was that they're starting with the glasses first, with the Ray-Ban Stories, doing a very minimalist approach of adding the technology to the existing frames in a non-invasive way, and then coming from the other end, which is essentially the Meta Quest Pro, a VR headset, eventually getting it down to more of an augmented reality glasses form factor; more bulky at first, and then down into something that's in the sweet spot of a form factor we all want to see eventually. They're continuing to do a lot of that work on AR; they just don't have much to say about it, and the VR has to come first. Then there's the CTRL-labs neural interfaces, which I think is really exciting; we had a whole long conversation about that a little bit afterwards. There's a whole question around onboarding with these different types of technologies. They are at least three iterations of the smartwatch away from the original plan that Alex Heath reported on, but because they didn't even announce their watch, it's a bit of a question how they go from where things are at into a consumer play. I think Meta should really consider how this gets into the enterprise first, to be able to prototype it there, rather than trying to push it out into a consumer product. That's an advantage that Apple has: they already have the phones, they already have a watch ecosystem, they already have computers. Whatever comes out from Apple, if they come out with a device that's a super compelling monitor, with high resolution, able to be tethered to your phone or maybe connected to your computer, or something that is integrated with the watch to be able to do different types of watch interfaces, these devices aren't displacing those other devices; they're probably going to be working together in some ways as we move forward. Meta, in a lot of ways, doesn't have a phone, and they don't have a watch ecosystem. By not launching or announcing their watch yet, with the recession and everything else that's happened, they've had to cut back on some of those things.
With the changing economic conditions and their own health as a company, they have to slowly build towards these visions based on where the market is at and whatever the next logical step is that they're able to make. But the last, the fourth thing that they've said they're really focusing on, aside from their VR efforts, their AR hardware efforts, and their CTRL-labs neural interface efforts, is Horizon Worlds and their own first-party services. They have to really deliver on the avatar platform, which I think is going to be a part of them taking care of identity, integrating ways to, say, take a picture of yourself and have an avatar representation that you feel happy with, because as we move forward, a lot of these metaverse apps are going to need to have some sort of avatar representation. And when they talk about interoperability, I think that's what they mean: they want to have their avatars interoperable with all these different devices. If you buy fashion clothes for your Meta avatar, those can go across all these different applications. Now, the big question around that interoperability is whether they're going to allow other people to have their interoperable systems on Meta's platforms, whether they're going to allow third parties to have their own avatar systems. For me, that's the real question of interoperability, rather than having interoperability on their end while preventing other people from having interoperable avatars. But I do think that identity and avatars are going to be a big part of their strategy moving forward, at least from what I've seen, as well as their Horizon Worlds, which is, I'd say, fledgling at this point. They did actually have some of the best design that I've seen, in that their keynote actually had a whole other spatial component that I saw some clips of, but I didn't get to see live. They only showed it once, but they had a way that you could watch the screen, and they had all this stuff kind of morphing around it. And when Mark came out to do the legs demonstration, within the last couple of days Ian from UploadVR has reported that that was actually done with motion capture, so there was a bit of smoke and mirrors there. Even John Carmack was saying that there were ways in which that wasn't really a demonstration of where the technology was at. That all aside, the whole issue around legs and the future of legs, it's sort of a weird case of marketing getting ahead of engineering, or just a weird decision to do that kind of deceptive play. But anyway, there was a moment in the keynote where that was a spatial experience, where people who were watching it in VR were able to see more of an avatar representation that was motion captured. So that was something that had a spatial dimension. But just in general, with Meta Horizon Worlds, they have a lot of content that they're putting out, and their Venues, and they're really focusing on these shared social experiences of people watching events, different concerts. UFC just had a Fight Pass where they've had one championship. They've had different music videos and other stuff. I've been popping in just to see what curation of things they're showing and just seeing the different social experiences. So I think for a certain population, some of that is starting to take off.
But I think, relative to what's happening in VRChat or Rec Room, it's still pretty far behind on certain aspects and on really building up a critical mass for the social network. So we'll see. I mean, that's been a big priority, and it's been fledgling, I'd say. I'll be continuing to check it out and hopefully be able to unpack it and do an interview with someone from the Horizon teams at some point, just to get a sense. There's a lot of stuff that's included within Horizon Worlds. They do have a top 100 worlds that you can check out to get a sense, and there are first-party Horizon Worlds: they have 141 worlds that the Meta Horizon team has put out. So they've been creating lots of different content, but it's difficult to even get a full listing of all the different worlds that they've created, just because a lot of the stuff is still kind of hidden. It's difficult to even see all the different worlds that one person has created. Hopefully they'll be making some of that more available, and I'll continue to dive in and hopefully start to unpack that a little bit more as well. So I just wanted to unpack some of the additional thoughts I've been digesting, trying to get a sense of what the story is. For me, it's a lot of wait and see: paying attention to what Accenture is doing, and what the Microsoft partnership does and how that plays out. Microsoft tends to announce a lot of things before they're actually ready to launch, so I'd love to see some working demos of some of this stuff. Have they got it ready and working? At what point are we going to see Accenture start their collaboration and get these into the hands of different companies that are out there? What are the different use cases and the different productivity applications, and is there going to be a compelling use case? But like Scott said, my big takeaway is that this is kind of like an augmented reality development kit, to be able to prototype and develop some of the next frontier of augmented reality applications. And yeah, we'll wait and see what the larger market says. I'll also be really looking towards the creators and the makers and what kind of innovations they're pushing forward in terms of how to move this forward. I have an interview with ShapesXR that I did on their launch day at Augmented World Expo 2021; I'll hopefully get that out and touch base. I'm considering getting the controllers just to be able to test some of the next frontiers of interactions, and I'm still kind of debating as to whether or not I'll go in and get a headset to be able to try it out. Anyway, that's all that I have for today. I just wanted to thank you for listening to the Voices of VR podcast, and if you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and I do rely upon donations from people like yourself in order to continue to bring you this coverage. So you can become a member and donate today at patreon.com/voicesofvr. And again, thanks to Noah Nelson of No Proscenium for allowing me to run this as a podcast. Check out the No Proscenium podcast and Everything Immersive; they have an event coming up, the Denver Immersive Gathering, from November 4th to 6th. And he's also got a Patreon at patreon.com/noproscenium.
And again, thank you to my patrons at patreon.com/voicesofvr. Thanks for listening.
