#1227: Journey of Making XR Accessible with XR Access COO Dylan Fox

Dylan Fox is the Director of Operations for the XR Access Initiative, where he focuses on outreach and advocacy for making virtual and augmented reality accessible for people who have disabilities. I previously interviewed Fox for the white paper on Accessibility that he co-wrote for the IEEE Global Initiative on the Ethics of Extended Reality.

I had a chance to catch up with Fox at the XR Access Symposium to get an overview of the event. We covered the latest excitement about the accessibility features on the Apple Vision Pro, the XR Accessibility resources page from the XR Association and XR Access, the Virtual Experience Research Accelerator, the potential of AI for assistive technologies like the phone-based computer vision app Be My Eyes, the perils of over-relying on AI to fix all accessibility problems, and the potential of moving computing beyond rectangles and into naturalistic interfaces, where data becomes physical, lives in our environment, and can be engaged with like objects in our spaces. There's still a lot of work yet to be done to make XR completely accessible, and the XR Access Initiative is on the bleeding edge of doing the preliminary research, participating in interdisciplinary projects, forging relationships with industry partners, and bringing the community together to talk about these issues.

The XR Access Initiative collaborated with the XR Association on Chapter 3 of XRA's XR Developer's Guide series, released on October 27, 2020, which focused on Accessibility & Inclusive Design in Immersive Experiences. The following table is a quick reference guide that cross-references the accessibility techniques with the spectrum of disabilities, including sight disabilities, auditory disabilities, non-speaking / speech impairments, mobility disabilities, and cognitive disabilities. You can get the full Accessibility Developer's Guide report here (email required).

A table from the XR Association's XR Accessibility and Inclusive Design Quick Reference Guide summarizes the following accessibility techniques:

Removing or Reducing Background Details and Audio
Undo/Redo Functions
Reducing Speed and Setting Up Action Sequences
Bypass Functions
Save Progress
Altering the Size of Objects, Elements and Text
Audio Augmentation and Text-to-Speech
Color Filters and Symbols
Scrim or Scrim-Like Overlays
Captioning Audio Features
Using Icons to Identify Audio Features
Sign Language
Mono Audio
Settings and Menu Options
Dynamic Foveated Rendering and Eye Tracking
Controller-Free Hand-Tracking
Explore World Option

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. It's a podcast that looks at the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So this is episode 6 of 15 of my series on XR accessibility. Today's episode is with Dylan Fox. He's the Director of Operations of XR Access. I wanted to just get a bit of an overview of the XR Access conference, but also what's happening with XR Access in terms of these different initiatives and some of the open questions around accessibility. I've previously had an interview with Dylan Fox back in episode 1090. That's when I was interviewing both Dylan and Isabelle Gannat Thornton about the IEEE Global Initiative on the Ethics of Extended Reality report. They wrote up a whole chapter on extended reality ethics and diversity, inclusion, and accessibility. So Dylan's in charge of outreach and advocacy in making VR and AR accessible for people who have disabilities. It's a long road, and we talk about some of the progress that's been made over the years, but also what else has to be done in terms of the excitement around the Apple Vision Pro, as well as the work yet to be done in order to really make these 3D spatial environments more and more accessible. So, we're covering all that and more on today's episode of the Voices of VR podcast. This interview with Dylan happened on Thursday, June 15th, 2023 at the XR Access Symposium in New York City, New York. So, with that, let's go ahead and dive right in.

[00:01:39.950] Dylan Fox: Hi, I'm Dylan Fox. I am the Director of Operations for XR Access. I tend to herd a lot of cats, today especially, on the day of the symposium. But for the most part, I do a lot of outreach and advocacy with the goal of making virtual and augmented reality accessible for people with disabilities.

[00:02:00.067] Kent Bye: Great. And maybe you could give a bit more context as to your background and your journey into this space.

[00:02:04.522] Dylan Fox: Yeah, so I started off really coming to things from the engineering side, actually. I started off in mechanical engineering at Berkeley, then got into more cognitive engineering, user interfaces, and UX design. And at a certain point, when I was going back to get my master's, I really figured that for all that UX design is supposed to be about user needs, there's a whole swath of people whose needs are often just ignored, and that is people with disabilities. So I started getting into accessibility, and I've always been very interested in the UX aspect of virtual and augmented reality, because there's just so much to explore there. That's kind of led me to the spot I'm in now of talking to all of the XR folks and teaching them about accessibility, and talking to all the accessibility folks and teaching them about XR.

[00:02:49.636] Kent Bye: And so are you still a professor, or are you doing XR access full time?

[00:02:53.940] Dylan Fox: No, I'm not a professor. I don't actually have a doctorate. "D.R." is just my initials. But I do work for XR Access full-time. I started off as a volunteer, was the head of community and outreach for a while, and now I'm the Director of Operations.

[00:03:09.468] Kent Bye: And so, yeah, maybe you could give a little bit of your view of where the industry is at when it comes to accessibility, because it's been a number of years since you started XR Access as an initiative. Shiri was saying that 2019 was the first gathering, and now we're in 2023, and we just had a presentation today from Owlchemy Labs talking about all the different accessibility features they've built into Cosmonious High. So there's certainly been some progress and broader awareness, but I think there's certainly still a long way to go. I'd love to hear your assessment of what's happening with accessibility in XR.

[00:03:42.272] Dylan Fox: Yeah, I think we've definitely seen a lot of progress, certainly in understanding how to make XR accessible, right? There are a lot of challenges to face, a lot of novel challenges. You know, even things like captions are pretty well solved in 2D, but come with a whole host of new aspects in 3D. And then there are things like visual accessibility and screen readers in VR that we're still kind of figuring out now. But I've seen a lot more prototypes and proofs of concept and understanding of how to make XR accessible now than I ever have before. That being said, it's still split very unevenly. And I'm actually very happy to see the Apple Vision Pro debut and their video on accessibility, because what they've done that a lot of the other manufacturers haven't is just take all of the basic accessibility features we've come to expect in smartphones and apply them to VR. You know, they included things like screen readers and things like switch control, at least as I'm led to believe by their accessibility video. And while I'm sure that's not necessarily enough to fully capture accessibility in XR, it's a lot more than what we've seen from some of the other platforms that really seem to have forgotten what we already know about 2D accessibility in their initial offerings. So I'm really excited for that. I'm really excited to see more companies and more platforms start to implement accessibility, especially as we start to use VR and AR much more for education and health care and training, things that are legally required to be accessible. I think we're still going to see a lot more people wanting to learn and understand how they can make sure that their applications work for themselves and their employees and protect themselves from any legal risk.

[00:05:26.413] Kent Bye: I think one of the challenges I've seen in the XR industry is that you have programs like Unity or Unreal Engine that are a little bit more of a black box of, you know, here's an experience. But at the operating system level, most times it's actually Android, with the Meta Quest 2 and then the Quest Pro, so it doesn't necessarily have a lot of those Android accessibility features baked into the core of the user interface, at least not yet. With the Apple Vision Pro you have potentially more of a seamless integration. However, Christian Vogler of Gallaudet University was warning that we have to be cautious of just porting a lot of the 2D accessibility features and thinking that it's going to be enough within the 3D. So I'd love to hear any reflections on both the translation of 2D to 3D, but also the challenges of things like Unreal Engine and Unity with what's happening with accessibility on these platforms.

[00:06:17.382] Dylan Fox: Right. I think it is certainly true that we have to be careful of not just assuming that all the 2D things are going to solve all our problems, but it's still much better to have these 2D features than to not, and to be encouraging your developers to bake those into their applications as well and hook into those interfaces. I think there is a big challenge when it comes to working with all these different platforms, because there's kind of a fundamental difference between web accessibility and VR accessibility, in that with the web, pages are built out of HTML, and you can basically just parse through that HTML and kind of understand what's going on to a certain extent. But in XR, you have to go further upstream. If somebody is using Unity or if somebody is using Unreal, the semantics of those platforms are going to be very different, right? If I'm in HTML, I look for h1, h2, h3, and I can start to build an understanding of what the layout of the application is, and that's going to be the same on every web page I visit. But with XR, there isn't really an equivalent of that. If it's a Unity app, I might have to go into the Unity tree, which may not even be available to me depending on where I'm developing, and then try to find ways to make sense of that. And so what I think we really need to see is the platforms like Unity and Unreal baking accessibility into their applications, encouraging and prompting developers to make things accessible, and bringing in more people with disabilities to create on their platforms. Because I think the platforms themselves are not accessible right now, and that makes this cycle of exclusion that can be really damaging in the long run. So I think we need more buy-in from leadership and just kind of across the board in all of these different organizations that are creating the tools for others to make XR experiences.
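
To make that web-versus-XR contrast concrete, here's a minimal sketch (my own illustration, not something from XR Access) of how a tool can recover an outline from HTML headings using only Python's standard library. The point is that there's no equivalent, universally exposed structure to walk in a Unity or Unreal scene:

```python
# Assistive tech can recover page structure from HTML because headings
# (h1/h2/h3) form a predictable outline on every web page.
# There is no equivalent, universally exposed structure in an XR scene graph.
from html.parser import HTMLParser

class OutlineParser(HTMLParser):
    """Collects (level, text) pairs for every heading tag encountered."""
    def __init__(self):
        super().__init__()
        self.headings = []
        self._level = None

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3"):
            self._level = int(tag[1])

    def handle_data(self, data):
        if self._level is not None and data.strip():
            self.headings.append((self._level, data.strip()))

    def handle_endtag(self, tag):
        if tag in ("h1", "h2", "h3"):
            self._level = None

# A toy page; any page on the web exposes the same heading semantics.
page = "<h1>Settings</h1><h2>Audio</h2><h2>Captions</h2>"
parser = OutlineParser()
parser.feed(page)
for level, text in parser.headings:
    print("  " * (level - 1) + text)
```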

[00:08:08.067] Kent Bye: Yeah, have you or anyone you know in the accessibility community been in contact with Apple and the Apple Vision Pro?

[00:08:14.347] Dylan Fox: I have reached out, certainly, as soon as I saw the Vision Pro information. Apple has been, as I'm sure everyone listening to this podcast is aware, very tight-lipped about their headset until now. I'm definitely looking forward to reaching out to them, to understanding if the resources we've provided were valuable and how we can update those resources to make sure that everybody developing for the Vision Pro knows how to make their applications accessible. Because Apple did a great service in making things like caption settings part of their home system, revealing whether captions are on, and being able to extend that into applications. They gave the hooks for developers to latch onto to make their applications accessible. And so we're certainly going to be trying our hardest to make sure that all apps released on the Vision Pro take advantage of those.

[00:09:01.863] Kent Bye: So you've reached out, but you haven't heard back yet, is what you're implying?

[00:09:04.985] Dylan Fox: I've been a bit in the throes of preparing for the symposium as well, so I haven't had a ton of time. But I definitely look forward to talking to some of them as soon as this storm has passed.

[00:09:14.230] Kent Bye: You said that you passed along some resources. I know that you, along with the XR Association, have provided a number of different best-practice guidelines, and there's a GitHub. So what are some of the resources that you were passing along to Apple?

[00:09:27.044] Dylan Fox: Absolutely. I mean, really, we've been trying to gather up everything that people have released about making XR applications accessible. That includes a wide variety of material from each of the platforms themselves about how to make their own apps accessible, but of course also design guidelines, starting with those put out by the XR Association. It includes Nielsen Norman's 10 UX principles applied to XR. Really just a whole ton of different things that people from all corners of the industry have created with a focus on accessibility. Because we're still, I think, working on the cohesive guidelines, the one guidelines to rule them all. But until we have those, we want to make sure that what people have already created is put to good use.

[00:10:14.561] Kent Bye: Yeah. And where's the best place for people to go to find some of this information?

[00:10:18.505] Dylan Fox: Well, you can definitely check out our guidelines at bit.ly slash xraccess dash github. You can also access that from the XR Access website at xraccess.org. And if people have any questions about it, definitely reach out to us on Slack. We have a really big community of people with disabilities, of advocates, of programmers. You can join our Slack at bit.ly slash xraccess dash slack, and we'd be happy to talk to anybody and point them in the right direction for resources related to their particular products.

[00:10:52.611] Kent Bye: Yeah, I wanted to also ask about the explosion of artificial intelligence and machine learning. I know that, as an example, OpenAI has released Whisper, which does speech-to-text, and which I've found to be really quite robust in getting transcripts from my own podcast, which is over 1,200 episodes, using WhisperX with diarization. There's always this question of how far the technology will progress, but there's always going to be a certain level of manual work. And so at what point do I say, OK, now I'm going to stop waiting for the technology to progress and actually do all that manual work to clean it up? But I'd love to hear, from your perspective, what you see happening with some of this stuff, with generative AI, ChatGPT, all these tools that are becoming available that seem to potentially have different applications when it comes to accessibility features.
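
For anyone curious what that Whisper workflow looks like in practice, here's a minimal sketch (my own illustration; the file name and model size are placeholders, and WhisperX's alignment and diarization layer is a separate package not shown here):

```python
# A minimal transcription sketch with OpenAI's open-source Whisper package.
# Assumes: pip install openai-whisper, with ffmpeg available on the PATH.
import whisper

# Smaller models are faster; larger ones are more accurate.
model = whisper.load_model("base")

# "episode.mp3" is a placeholder file name.
result = model.transcribe("episode.mp3")

# The result includes the full text plus timestamped segments,
# which is what makes caption generation and transcript cleanup feasible.
print(result["text"])
for segment in result["segments"]:
    print(f"[{segment['start']:7.2f}s] {segment['text'].strip()}")
```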

[00:11:40.393] Dylan Fox: Absolutely. I mean, I'm certainly really excited about some of the potential that they have. I'm also a little bit worried that this might be a double-edged sword. Because on the one hand, there's a lot of stuff out there right now that is just not accessible. I mean, think about this just in terms of 2D: there are a lot of images that have zero alt text, and so the alt text that you're going to get from mousing over it or pointing your screen reader at it is going to be something like "image.135jkl," blah blah blah, which is absolutely useless. And we've seen AI that's capable of potentially identifying what's in that image and adding that as alt text, and that is certainly better than nothing, and it's a capability that's continuing to improve. But the thing about good alt text is that it is written intentionally, and it's based on the context of where that image is, right? If I'm writing an article about the first president of the United States, then it might have an alt text of "George Washington." If I'm writing a dentistry article, then it might say "man with wooden teeth." That alt text needs to be context-sensitive, and it needs to have that intentionality. So hopefully what we'll see is AI that works with the person, that maybe suggests things and then smooths over that process. But I certainly want to make sure that we don't just assume AI will solve all our accessibility problems for us and we don't have to do anything about it. The other aspect of AI that's, I think, really interesting to me is this idea of being able to create and program 3D scenes without having to master an engine, right? We have right now, again, Unity, Unreal, and a lot of these 3D engines that are very complex to work with. And it can be very formidable for, again, screen reader users, for people who are neurodiverse. Anybody with a disability might not be able to do the hundreds and thousands of clicks and lines of code that it takes to make these XR applications. But if we can have just a text or a voice-based interface to interact with an AI assistant that can help do that, that potentially opens up a lot of space for disabled creators. But again, I don't want to use that as an excuse for these engines not having to be accessible in the first place.
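
As a rough sketch of the "AI suggests, human smooths it over" workflow Dylan describes, here's what a minimal human-in-the-loop alt text tool could look like. Note that generate_caption is a hypothetical stand-in for whatever image-captioning model you'd actually use:

```python
# A sketch of machine-drafted, human-reviewed alt text.
# The point: a human editor adds the context the model can't know.

def generate_caption(image_path: str) -> str:
    # Hypothetical: a real implementation would call a captioning model here.
    return "a man with wooden teeth"

def suggest_alt_text(image_path: str, article_context: str) -> str:
    """Return alt text: drafted by a model, reviewed by a human for context."""
    draft = generate_caption(image_path)
    print(f"Article context: {article_context}")
    print(f"Suggested alt text: {draft}")
    revision = input("Press Enter to accept, or type a context-aware revision: ")
    return revision or draft

# The same image warrants different alt text in different articles.
alt = suggest_alt_text("portrait.jpg", "profile of the first U.S. president")
print(f'<img src="portrait.jpg" alt="{alt}">')
```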

[00:13:48.162] Kent Bye: Yeah, that was an interesting dynamic that I saw in one of the presentations this morning, in terms of to what degree can the technology make folks who have disabilities more independent and autonomous in the world. Some participants felt like the ultimate guide within virtual reality, as one person said, would be an AI. But on the other hand, AI is always going to be limited or biased, and so there's value in having a person there to both observe and interpret and more fully contextualize whatever may be happening. So I see this trade-off where AI is going to be limited, but yet, for some folks, it may actually make them feel like they're more autonomous as they go into these virtual worlds with these AI agents. And so, yeah, I don't know if you have any comments on some of these different trade-offs between this need for sovereignty and independence versus having that full context and maybe more robust perspectives that an AI can't capture at this point.

[00:14:40.312] Dylan Fox: Absolutely. I mean, I think it's never an either-or, right? I think we'll have both. We'll have both human guides and AI guides. That's because there's always going to be things that humans are better at, and there's always going to be things that AIs are better at, right? And I think a great example of this is Be My Eyes. Human guides are obviously great, and historically, if you look back five or 10 years, have clearly outperformed AI guides. Whether that's going to change, who knows? But there's also things that you don't necessarily want a human guide for, right? It's one thing to say, hey friend, lead me down to the local ice cream parlor. It's another to say, hey friend, take a look at this rash I've got and tell me what it looks like. Because there's some things you don't want to share with a human. That opens up another question about privacy. And we absolutely have to make sure our AI respects that. But I think there's definitely going to be room for both. And I think it'd be very interesting to see if we can interweave the two more seamlessly, right? To have AI guides that can operate independently. And then when you need a human guide to just be able to seamlessly pop somebody in and have them help supervise the AI or just talk to the person that needs guidance directly, it's definitely going to be mix and match. And I think that learning process of what is best served by what type of agent is one that we're all engaged in right now.

[00:16:01.130] Kent Bye: What were some of the highlights for you from this first half-day of presentations here at the XR Access Symposium 2023?

[00:16:08.481] Dylan Fox: Man, I mean, it's thrilling to me just to see all these people in the same room. You know, 2019 was the last time we had an in-person symposium, and that was before I joined XR Access. So we've had three years of conferences online, and that's a very different vibe. I think it's fantastic to have people here in person. That said, unfortunately a lot of people weren't able to join in person, so I do hope to have a better hybrid experience for people that are remote. At some point, I want a VR symposium, but I don't know if the accessibility is there quite yet. But I think we've had some fantastic talks. I mean, we just had Owlchemy's talk on the visual accessibility update to Cosmonious High, which I think is one of the first real high-profile, low-vision accessibility modes that we've seen in VR, which is fantastic. We had a great talk about user research and the whole VERA project, the Virtual Experience Research Accelerator, that should hopefully help to make sure that the people that are testing VR are not all just young white male college students. We had some amazing talks on the guide projects. Just so many that I'm really excited to see it all happen.

[00:17:13.552] Kent Bye: And I had a chance to speak to a number of different poster presenters during the poster session and saw a number of different demos. And it does seem like accessibility is at the bleeding edge of some of the hardest problems that still need to be solved when it comes to user experience in 3D UI. So it feels like, for a lot of the graduate students and PhD students, this is a topic that is ripe for innovation and exploration. I'm not sure if you're seeing that above and beyond what's being shown here, if there's a lot more research that you see happening within the broader academic XR research community.

[00:17:46.013] Dylan Fox: Absolutely. I think one of the most powerful things about XR Access is that we take the research that's happening in academia and bring it to industry, where it can be put to use. Academia is oftentimes years ahead of the curve on experimenting with new interfaces and new assistive technologies, but the problem is there's a lack of tech transfer. These PhD students or professors will create these kind of one-off prototypes that are oftentimes really amazing. They'll gather some data with it, they'll publish a paper, and then that just kind of stays floating in the academic circles. And so I really see it as part of our mission to take all of this amazing research that's happening and make sure that we get it into a place where the public can use it and where the fruits of that research are applied. And I think we're seeing a lot of that here. And I do very much hope that everybody in industry and content creators are paying attention to what is happening in academia, and vice versa, that folks in academia are making sure that their research is being exposed in an easy-to-understand, not completely dry and academic way to all the people that might benefit from it. Because we need better communication on both sides of that, so that the right hand of XR creation knows what the left hand is doing.

[00:19:01.336] Kent Bye: Yeah, I saw a representative from HTC, which is encouraging, but I'm wondering if you've had other industry representation here when it comes to some of the big platform providers.

[00:19:14.865] Dylan Fox: Yeah, I mean, I think we have people here from Apple, from Meta, from Google, a lot of the big content creators. I think part of the challenge, though, is that we've had plenty of contact with people at the big orgs. It's just that it's so easy for things to get siloed, right? We may have a really good conversation going with somebody in one group, but it'll never make it to another group where the change needs to happen. That's one of the reasons we keep pushing for accessibility at the highest levels, right? It really needs to come from the top down, from the CEO out to everybody else, so that these accessibility efforts are united throughout the company and not just appearing in one area or another. You know, we talked about Apple being able to bring in the 2D accessibility parts from the start, and that's, I think, an example of accessibility being a priority throughout. You know, I love Microsoft for what they've done with a lot of things, like the adaptive controller, for example. But you can't use the adaptive controller with HoloLens. It's not compatible. And so that's the kind of difference that we're looking at here. We want to make sure that accessibility is spread out and omnipresent.

[00:20:22.759] Kent Bye: Yeah, and Christian Vogler was talking about how he would prefer to see a lot of accessibility functionality be transferred into the haptic experience for folks. And for the Apple Vision Pro, there are no haptics; it's just your hands and eye tracking. So with something that may seem like a more seamless user interface, they may actually be abandoning some of the affordances of XR, like haptics, if they don't have any type of controller. So that was something that I think is worth pointing out as well.

[00:20:48.929] Dylan Fox: Yeah, and I would agree with that. But at the same time, what gives me hope for the Vision Pro is that I believe they do offer Bluetooth connection to, for example, switch devices. At least they showed a demo of switch control being used to control VR apps. And I'm hoping that if they have switch control, if you can connect via Bluetooth to switches or keyboards, that you can also potentially connect it to haptics devices. And I think that's something that we all need to keep in mind: everybody has a different setup, right? Some people are going to be happy just using those gestures and hand controls. But other people will want to use a keyboard. Other people will want to use a mouth stick. Other people maybe want some type of haptic bracelets or belt or something else. And if we can create applications and platforms that are customizable and modular and are able to interact with these different things, and encourage developers to take advantage of these capabilities, then we can have experiences that can be customized to everybody. Because no two people are the same, no two disabilities are the same, and the ways that people find to use technology are all different. And so we need to respect that.

[00:21:56.437] Kent Bye: Yeah, the only caveat I would have there is that, according to some of the WebGPU meeting notes from December of 2019, Apple and the Khronos Group have this dispute, which means that they're not necessarily going to be doing anything with OpenXR. So anything that does have OpenXR integrations is not going to be compatible with Apple; people will have to rewrite everything. So rather than having one API to be compatible with everything, Apple is taking the stance that you basically have to write it for their custom approach. So yes, that may be possible, but there's going to have to be a lot of work on the back end, I suspect, if Apple even allows some of these peripherals. So that's my caveat, and I guess frustration, that they're fracturing the XR ecosystem because they're Apple. And maybe they're trying to do things from the ground up with their operating system, like we were talking about, and accessibility features are going to be more fully baked than anything else that's out there, but at the cost of not integrating with open standards that have pretty much been industry standards for creating these interoperable APIs. So I'm hoping that I'm wrong and that'll change, but we'll see if it creates this fracturing within the industry.

[00:23:03.936] Dylan Fox: Yeah, and you've just hit the nail on the head for one of my biggest gripes with Apple: when everybody else had just USB-C everything, Apple said, no, no, no, you'll buy our cords, and you'll like it. And they tend to do something similar with this kind of thing. And I recognize that, in some ways, that's one of the reasons that a lot of blind folks I know, for example, prefer Apple: because it is a standardized experience, because they can know what their screen reader is going to do on a given page. And that is partially because Apple has this iron vice grip and says, everybody, you use our most up-to-date thing, or you don't get to show up on our platform. And it is unfortunate that that's what it sometimes seems to take to make some of these functionalities work. I really look forward to a day when a developer only has to make something once, and it can be usable on different platforms and accessible on different devices. But welcome to capitalism and competition, I guess.

[00:24:03.167] Kent Bye: So one of the other things I saw in the course of the different presentations was a number of intermediary groups that are helping to recruit and facilitate and foster relationships with disabled folks who are able to help with user testing in some capacity. Whether that's VR Oxygen or the VERA project, having a repository of lots of different folks with VR headsets lets social science researchers or other disability and accessibility researchers have a group of people that they can prototype and test things with and get feedback from in more of a seamless fashion, so they're not recruiting new folks all the time. This distributed network was referred to as the Mechanical Turk of user experience testers within the context of VR. But I'd love to hear any other commentary on some of the other groups that you're aware of that are helping to reduce the friction here. If folks are interested in doing usability testing with people who have disabilities, what are some resources they can turn to, in terms of either people that have already been recruited or tips for how to recruit folks to test out some of their XR apps?

[00:25:07.600] Dylan Fox: Yeah, absolutely. I think one of the big challenges that we face with XR user testing is that, you know, when you're user testing phone apps or desktop apps, well, most people with disabilities have a phone or have a computer of some sort. But most people with disabilities don't have an XR headset, because they're by and large not accessible. So why would they have a headset? And so what that turns into is, when we're doing user testing with disabled people, we have to bring them to our labs, bring them to research bases where those headsets or those other devices exist, and then oftentimes support equipping those devices, because even if you were to mail a blind person a headset, they may not be able to use it, for example. And so I think we've definitely seen some organizations that have expressed interest in this, at the very least. VERA is obviously a big one. That's V-E-R-A, the Virtual Experience Research Accelerator that we just had our session on. I know Fable does lots of user testing with folks with disabilities. Last time I checked with them, they said they hadn't received a ton of requests for VR testing, but I do hope that changes in the near future. Always a shout out to AbleGamers for the work they've done in the past, and Open Inclusion in the UK. Again, a lot of these groups have experience working with disabled people, and have disabled people running them or operating within them. And yeah, I think we need more folks to step up and do that. I think we need the organizations that do exist to get more equipped with XR equipment and with the experience of how to set that up. I'd encourage anybody who's listening to read one of the items we have in our XR Access GitHub, which is, again, bit.ly slash xraccess dash github: the barriers browser by Jamie Knight from when he was working at the BBC. Because he made a fantastic description of all of the things, in terms of vision and mobility, that made it impossible to even try to do basic user testing with people with certain disabilities. So it's an uphill battle, but I think it's one that we will definitely be getting better at as time goes on.

[00:27:12.047] Kent Bye: Great. And I'm curious what you're really looking forward to in terms of the next steps, for where this is going to continue to progress with XR Access and with increasing the amount of accessibility in the XR industry.

[00:27:25.076] Dylan Fox: Absolutely. I'm really looking forward to certainly connecting with Apple about the upgrades in accessibility they've brought to the Vision Pro, and hopefully using that as a standard to get some of the other platforms to up their game. And, of course, seeing how that holds up in fully immersive VR experiences. I've heard Rec Room VR is supposed to launch on the Pro, and I don't know how that's going to work with the accessibility features they've shown. But with XR Access itself, I definitely look forward to the videos from our symposium, which will be going up on our YouTube channel very soon. We're looking forward to more partnerships with organizations, both for-profit and non-profit, where we can work directly with these organizations on making their products more accessible. And I'm really looking forward to hopefully getting more people, both abled and disabled, involved in creating accessible XR content. We've had some really amazing programs, like our Stories project and our Prototype for the People project, where we are trying to make sure that it is as easy and as straightforward as possible to make XR and to make XR accessible. So I'm really looking forward to seeing what folks can come up with.

[00:28:29.438] Kent Bye: Yeah, one caveat about Rec Room and Apple is that they announced at the WWDC keynote, when they were announcing the Apple Vision Pro, that they were also having an integration with Unity. So there's one path where you use all the custom APIs, which is more the equivalent of using Swift code and building an iOS app, versus the Unity path, which is shortcutting some of that but still has some integrations. It's still a little bit unclear to me; I didn't watch all of the different sessions yet. There could be a way that there's still kind of a black box of Unity that doesn't necessarily leverage all those accessibility APIs just yet. So that's all yet to be seen. But there is at least confirmation that Rec Room is not just the 2D version, but the full spatialized version. So yeah, that's all still yet to be determined as we get more information, and I'm waiting to hear more from both Rec Room and Apple on that. I guess as we start to wrap up, I'm curious what you think the ultimate potential of XR, VR, AR, and spatial computing with accessibility in mind might be, and what it might be able to enable.

[00:29:31.039] Dylan Fox: Absolutely. I mean, I think the amazing thing about XR is that, you know, up until now, we've done a lot of our computing in rectangles, right? It's like the data lives in the cloud or on the hard drive, and we go there in order to work with it. We do all our interactions with it on a phone or a desktop or a calculator screen. And I think what's really exciting for me about XR is that it's data that is physical. It's in our environment. We can interact with it in ways that we interact with normal objects in our everyday lives, right? And in my mind, it's kind of the natural progression: we went from punch card computers to command-line interfaces to GUIs. And as we start getting into these more naturalistic interfaces with touch, and especially with VR, where you can just kind of reach out and grab objects and interact with them like you would anything in real life, I really see it as a democratization of technology, where it won't just be the people that were raised learning how to touch screens and click buttons, but anybody, anywhere, that can use the power of computing to improve their life. So once we can get that to a place where it really is anybody, and not just anybody that's able-bodied, you know, it's a vision I look forward to.

[00:30:48.795] Kent Bye: Yeah, and in talking to Shiri, I was just reflecting on how we're prototyping some of these assistive technologies in VR, but at some point, they might be able to be translated into and deployed out into the physical world with AR. So I don't know if you have any comments or reflections on that.

[00:31:02.923] Dylan Fox: Yeah, absolutely. I mean, I think a big part of what I was just talking about is things like capturing text that's around you and turning it into speech, right? Being able to see all of the words printed in the real world, convert those into data, and convert that data into a form that's accessible to somebody. Or vice versa: if there's noise, if there are spoken words, to convert that into data and then convert that into captions that you can read off your glasses, right? There's so much. We don't even think about it as data, but it is data. All the words written on the subway track, that is data. It's just data that is currently static. And AR and VR offer us the potential to turn that into dynamic data that can be converted into whatever people need to be able to register and use that data. And so that potential, I think, can't be overstated.
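
As a concrete illustration of that static-text-to-dynamic-data pipeline, here's a minimal sketch assuming the pytesseract OCR library and the pyttsx3 text-to-speech engine; any OCR and TTS stack would do, and the image file name is a placeholder:

```python
# OCR the words printed in the environment, then re-render them as speech.
# Assumes: pip install pytesseract pillow pyttsx3, plus the Tesseract binary.
import pytesseract
import pyttsx3
from PIL import Image

# 1. Capture: read the printed words out of a camera frame.
frame = Image.open("subway_sign.jpg")
text = pytesseract.image_to_string(frame)

# 2. The static sign is now data, so it can be transformed however needed.
print(f"Recognized text: {text.strip()!r}")

# 3. Render: convert the data into a form accessible to this user (speech).
engine = pyttsx3.init()
engine.say(text)
engine.runAndWait()
```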

[00:31:50.858] Kent Bye: Have you been tracking any of the progress of what's happening in VRChat communities? Like folks who are mute using text-to-speech to speak, or folks who are using speech-to-text for translation, to be able to speak in Japanese and have it translated into English. Have you been tracking any of the developments of things that are happening in platforms like VRChat?

[00:32:09.775] Dylan Fox: Not closely, but I really need to. I talk about Altspace's captions-to-translation all the time as evidence of the fact that when you put the time investment into making something accessible to people, you're also making it accessible to machines, right? The work that you do to make accessibility features is almost always the foundation of the work that opens up all kinds of new features. So I'll certainly have to take a look at what's going on in VRChat and see what they've come up with.

[00:32:34.487] Kent Bye: Yeah, I think that's part of the impetus of what you were just saying there, with OpenAI's Whisper creating techniques to get all the text information that's already in these videos and audio. So yeah, definitely worth checking that out. And I just wanted to give you an opportunity to say if there's anything else left unsaid that you'd like to say to the broader immersive community.

[00:32:53.230] Dylan Fox: I would really just say to the immersive community: look around you and see if there are disabled people in your circles. And if not, ask yourself, is it because they just didn't want to be there, or is it because they were excluded? And I will bet you 99 times out of 100, it is the latter. So just because you don't see disabled people there, that doesn't mean they don't want to be there. And really try to think about what you can do to make your own community, your own application more inclusive, so that you can benefit from the thousands of amazing people with amazing ideas that currently can't interact with your space.

[00:33:36.205] Kent Bye: Awesome. Well, Dylan, thanks so much for helping to organize this whole event and for taking the time to help break down your journey into this space, but also reflecting on all the work that's being done and yet to be done in the realm of accessibility and XR. So thanks for joining me.

[00:33:49.351] Dylan Fox: Absolutely. Thanks, Kent.

[00:33:50.752] Kent Bye: So that was Dylan Fox. He's the Director of Operations of XR Access, in charge of outreach and advocacy in making VR and AR accessible for people who have disabilities. So I have a number of takeaways from this interview. First of all, we're still really in the early days of XR accessibility, and there's a lot of work that still needs to be done, both on the research side as well as companies just implementing different solutions for accessibility. There was a lot of mention of looking at the existing 2D methods for accessibility, and there's a whole resource page that XR Access did in collaboration with the XRA on a GitHub that lists all sorts of different recommendations for what's happening with XR accessibility. And there are a lot of different W3C user requirements: XR accessibility user requirements, synchronization accessibility user requirements, natural language interface accessibility user requirements, RTC accessibility user requirements, accessibility of remote meetings, collaboration tools accessibility user requirements, media accessibility user requirements, core accessibility API mappings, and the graphics accessibility API. So there are lots of different recommendations and requirements being put out that are all being fused together with XR, with even more work in trying to figure out how to deal with the 3D nature of XR, both for captions as well as for how to use screen readers with these 3D spatial scene graphs, where there's still yet to be a uniform, generalized solution that works for everybody, and whether or not all the metadata is even there to provide the appropriate context. So there's lots of excitement for the Apple Vision Pro in terms of what that's going to be able to enable. And yeah, he's looking forward to being in closer contact with Apple in order to share some of the research that has been done by XR Access, and also just to see the different ways they can start to use the 3D affordances without just copying over all the 2D accessibility tools from iOS; Dylan says that's going to be a lot better than having nothing. And at the same time, a topic that came up again and again is how to expand out into multimodal feedback, from spatial audio into haptics and using different visual cues and whatnot. So there's still a lot of guidance and best practices that have to be figured out. So yeah, that's a general overview of the XR Access Symposium. The videos should be going live within the next week or two, so you can go back and see all the original presentations. One presentation I did want to call out, which came up again and again in conversations, was the very first one, about using guidance within a social VR experience, meaning using human guides within the context of social VR. So having someone who is sighted along with someone who has low vision or blindness or other visual impairments, and having them be the intermediary who gives contextual information about what's happening in these different virtual environments, just to help describe and explain different things that are happening, just like you would have a guide in physical reality for someone who might be blind or have low vision. So the question as we start to move forward is to what degree we are going to have artificial intelligence start to do some of these different tasks.
On the one hand, the people in the study really wanted to have an AI guide so that they didn't have to feel like they were putting any extra burdens on another human being. However, Dylan's cautioning that we shouldn't rely upon artificial intelligence to fill the gap, because there's always going to be additional information and context that comes from the creators themselves. AI is always going to try to fill in the gaps where it doesn't have that full context, and usually the human-entered metadata is going to be a lot more reliable, with context and information that's richer and more accurate. So he was referring to these AI tools as a bit of a double-edged sword. And yeah, again, I'd point to the resource page that XR Access has put together, which links to lots of other best practices and guidance. He specifically called out the Nielsen Norman Group's top 10 usability heuristics applied to virtual reality. And yeah, you can check out some of our previous conversations from the poster session, and my conversation with Regine Gilbert, to get into a little bit more of those heuristics. Also check out the XRA Developer's Guide that was done in collaboration with XR Access: Chapter 3 covers accessibility and inclusive design in immersive experiences, going through sight disabilities, auditory disabilities, non-speaking / speech impairments, mobility disabilities, and cognitive disabilities, with a whole broad list of specific heuristics and things to take into consideration as you're designing these different experiences. So that's definitely worth checking out. There's a table that I'll include in the write-up that you can see, but I highly recommend going through the full developer's guide to see some of these different heuristics. And you can get more information on that GitHub page that points to lots of these different resources. So, that's all that I have for today, and I just wanted to thank you for listening to the Voices of VR podcast. And if you enjoyed the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and so I do rely upon donations from people like yourself in order to continue to bring you this coverage. So you can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.
