Is AI/AR a Paradigm-Shift for Patients?

Australian research showcases the power of AR and AI solutions, highlighting broader workplace/consumer benefits


Published: November 6, 2023


Rory Greener

Smart glasses vendors increasingly market AR devices as paradigm-shifting hardware for business and consumer clients. 

Leading AR vendors like Meta, RealWear, and Qualcomm are highlighting how lightweight immersive hardware can streamline work tasks with assistive reality visuals. 

Moreover, immersive healthcare solutions partially drive the industry forward, thanks to new opportunities presented by XR training, telehealth, and more.  

While AR solutions can significantly benefit healthcare professionals, XR also has the potential to revolutionize the patient’s side.  

Recently, Australian researchers at the University of Technology Sydney and the University of Sydney reached a significant milestone by producing an “acoustic touch” headset.  

The researchers created the device alongside local start-up ARIA Research to leverage AR/spatial computing hardware to assist low-vision individuals with navigating spaces and interacting with objects. 

The “acoustic touch” devices detect a user’s environment with spatial tracking cameras and then send informational audio feedback to a low-vision user. 

The research division cites World Health Organisation data noting that roughly 39 million people worldwide are blind and 246 million have low vision – with AR/AI solutions, this group can gain new opportunities to interact with the world around them.

The research team is working to assist the massive demographic with an immersive solution to help with everyday tasks. 

Commonly, smart glasses are presented as a solution that can assist with tasks such as cooking or navigation by providing immersive visual information panels. The Sydney-based research shows that the technology can continue to lead in accessibility, allowing a wide range of users to benefit as XR becomes more ubiquitous in daily work and personal life.    

A New Way to Interact  

Spatial computing allows users to interact with digital content in new ways, thanks to the 3D surrounding immersive environment. Moreover, with AR visuals that overlay on a user’s environment, spatial computing also changes how users interact with the real world.  

University researchers are using spatial computing, and the way the technology scans a user’s environment, to develop their “acoustic touch” smart glasses.  

The “acoustic touch” uses smart glasses’ outward-facing cameras to detect real-world environment and object data, relaying this as audio feedback to assist a low-vision individual with navigation. 

Chin-Teng Lin, a global leader in brain-computer interface research from the University of Technology Sydney, added: 

Smart glasses typically use computer vision and other sensory information to translate the wearer’s surrounding into computer-synthesized speech. However, acoustic touch technology sonifies objects, creating unique sound representations as they enter the device’s field of view. For example, the sound of rustling leaves might signify a plant, or a buzzing sound might represent a mobile phone. 
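The sonification idea Lin describes can be illustrated with a minimal sketch: each recognized object class maps to a distinct sound cue, which plays only while the object sits inside the device's field of view. All names, cue files, and thresholds below are illustrative assumptions; the actual ARIA Research pipeline is not public.

```python
# Illustrative sketch of object sonification: detected object classes map
# to unique sound cues, emitted only for objects in the field of view.
# The class names, cue files, and field-of-view value are assumptions.
from dataclasses import dataclass

# Hypothetical mapping from recognized object classes to audio cues,
# echoing the article's examples (rustling leaves for a plant, a buzz
# for a phone).
SOUND_CUES = {
    "plant": "rustling_leaves.wav",
    "phone": "buzzing.wav",
    "cup": "soft_chime.wav",
}

FIELD_OF_VIEW_DEG = 60.0  # assumed horizontal camera field of view


@dataclass
class Detection:
    label: str          # object class from the vision model
    azimuth_deg: float  # horizontal angle relative to the wearer's gaze


def cues_for_frame(detections):
    """Return the sound cues to play for objects inside the field of view."""
    cues = []
    for det in detections:
        in_view = abs(det.azimuth_deg) <= FIELD_OF_VIEW_DEG / 2
        if in_view and det.label in SOUND_CUES:
            cues.append(SOUND_CUES[det.label])
    return cues


# Example: a plant ahead of the wearer and a phone behind them.
frame = [Detection("plant", 10.0), Detection("phone", 150.0)]
print(cues_for_frame(frame))  # only the in-view plant is sonified
```

The key design point, per the researchers, is that sounds replace synthesized speech: the cue itself identifies the object, so the wearer hears the scene rather than a narration of it.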

Accessible AR for Low Vision Shows Positive Outcomes 

Sydney researchers are developing the device alongside research into the effectiveness of acoustic touch technology across low-vision individuals. 

Dr Howe Zhu from the University of Technology Sydney recently published a research paper in the PLOS ONE journal, which studied the outcomes of acoustic touch technology across 14 participants, some with low vision. 

During the study, the University’s researchers tested their acoustic touch headset on seven individuals with blindness or low vision. Meanwhile, a control group of seven blindfolded sighted individuals took part. 

Dr Howe Zhu concluded that the acoustic touch device assisted the seven individuals with blindness or low vision in recognizing and reaching for objects, using the device’s pipeline from spatial information to audio assistance.  

The researchers also found that identifying and finding objects with the device did not place a negative mental strain on the seven individuals with blindness or low vision. 

Dr Howe Zhu explained: 

The auditory feedback empowers users to identify and reach for objects with remarkable accuracy. Our findings indicate that acoustic touch has the potential to offer a wearable and effective method of sensory augmentation for the visually impaired community. 

The researchers noted that assistive reality devices and their integrated technologies can help low-vision individuals improve their quality of life through daily assistance. 

Nreal, Microsoft Support Blind Children with AR 

XR is still very much emerging, and healthcare research within it remains immature. However, the benefits are arriving quickly.  

Healthcare and medicine are quickly picking up in the XR space to improve quality of life standards for patients and those under care.  

As research continues, more results, positive or negative, will emerge. Backing up Sydney’s recent positive developments, Microsoft and smart glasses vendor Xreal – then Nreal – partnered last year to develop an AR/assistive reality solution to improve social interaction among blind children.  

Like Sydney’s acoustic touch device, Microsoft and Xreal also created a research technology that assists low-vision children with locating and identifying others by leveraging spatial computing and audio feedback. 

The 2022 partnership resulted in PeopleLens, a head-mounted device (HMD) that sits just above a wearer’s eyes and uses spatial detection to locate people in a room, then provides an audio cue which reads the names aloud of individuals when a child looks at them. 

Using spatial audio, the HMD locates individuals in a space to create a mental map of a child’s surroundings, which Microsoft called a People Map. 
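The People Map concept can be sketched in a few lines: the headset tracks where each person stands relative to the wearer and announces a name when the wearer's gaze lines up with a tracked position. The names, coordinates, and tolerance below are assumptions for illustration; Microsoft has not published PeopleLens at code level.

```python
# Illustrative sketch of the "People Map" idea: announce a person's name
# when the wearer's gaze aligns with that person's tracked position.
# All names, positions, and the gaze tolerance are assumed values.
import math

# Hypothetical map of nearby people to positions (x, y) in metres,
# relative to the wearer at the origin, facing along the +y axis.
people_map = {
    "Alice": (1.0, 2.0),
    "Ben": (-2.0, 1.0),
}

GAZE_TOLERANCE_DEG = 10.0  # how closely the gaze must align with a person


def announce(gaze_deg):
    """Return the name to read aloud for the person the wearer looks at,
    or None if no tracked person is within the gaze tolerance."""
    for name, (x, y) in people_map.items():
        bearing = math.degrees(math.atan2(x, y))  # 0 deg = straight ahead
        if abs(bearing - gaze_deg) <= GAZE_TOLERANCE_DEG:
            return name
    return None


# Looking slightly right (about 27 deg) lines up with Alice at (1, 2).
print(announce(27.0))
```

In the real system, the positions in the map would be refreshed continuously by the computer vision pipeline, and the announcement would be rendered as spatial audio from the person's direction rather than a flat readout.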

Xreal provided its Light AR smart glasses at the time. The product leveraged a digital line of sight to detect the locations of individuals and relayed them back to the user, enabling blind children, or those with low vision, to understand a group’s relative positions and distances. 

PeopleLens was also ahead of the curve, utilizing an AI platform to track the wearer’s surroundings in real-time. 

Microsoft researchers are using four state-of-the-art computer vision algorithms to understand the surroundings of a child with low vision, where the AI platform can locate, identify, track, and capture the eye-gaze of nearby peers. 

The AI platform also stitches together a digital world map based on recorded real-world data, where the headset translates real-world data to the wearer via spatial audio notifications to create the PeopleMap. 

The study produced positive outcomes for Microsoft. Following this, the firm launched a recruiting drive to find children in the United Kingdom between the ages of 5 and 11 to enhance the product as part of a multistage research study led by the University of Bristol.  

AR: The Eyes and Ears of AI 

The research into AR smart glasses to assist low-vision individuals comes as AR entangles itself alongside AI to help a wide range of end-users with daily tasks. 

Last month, at Snapdragon Summit 2023, Meta took to Qualcomm’s stage to discuss the ongoing integration of XR into daily life. 

The Head of Hardware Partnerships at Meta, Jenniffer Hom, covered Meta’s Ray-Ban partnership, which has resulted in the ongoing development of an AR smart glasses portfolio.  

The Ray-Ban smart glasses aim to improve daily life by providing an AI assistant that helps with daily tasks.  

Using AR detection via outward-facing cameras, the device can organically respond to the user’s surroundings, much like contemporary devices such as the Sydney acoustic touch headset. 

Jenniffer Hom noted: 

Smartglasses will become the eyes and ears for virtual AI assistants, allowing people to experience AI in the most human form: with our senses. Guiding you through the preparation of your next family meal, helping older parents read a menu with small print, or removing barriers between two strangers by translating their conversation in real time. I am super excited to announce we have multiple OEM partners developing smart glasses based on Snapdragon AR1. These glasses will not only feature camera, audio, and AI capabilities but will also feature a monocular or binocular display. We expect to see these designs by these customers launching next year. 

Moreover, Hom said that “very soon, you will be able to ask your Meta AI a question, and the AI will answer you with the context of seeing and hearing through your smart glasses.” 

Microsoft has shared a similar sentiment.  

In August this year, Lili Cheng, Corporate Vice President of Business Applications and Platforms at Microsoft, wrote on the keys to MR success for frontline workers, highlighting how AI is a core technology behind XR’s workplace success.  

Smart glasses and easily accessible XR solutions provide an immersive toolset suitable for frontline work. Moreover, due to a lighter form factor, smart glasses and assistive reality devices are gaining massive ground in maintenance and other frontline roles thanks to hands-free data visualization. 

Cheng explained that in industrial frontline environments, valuable insights are “confined to individuals, groups, or departments”; however, AR/AI has the power to “shift this dynamic.”  

Cheng also added: 

Workers can share real-time, situational video of their environment, allowing others to experience it firsthand—regardless of location. Expert guidance, troubleshooting, or step-by-step instructions are immediate. This eliminates the need to travel, while minimizing downtime and production disruptions. 

The Microsoft exec mentioned the importance and roles of integrated technologies such as digital twins, explaining that RT3D visualization and data allow frontline workers to “receive a better understanding of the machines and processes at hand.” 

Cheng also added that access to related materials, such as digital twins, can be “further enhanced by AI,” saying: 

Mixed reality is the eyes and ears of AI. Delving into operation nuances for personalized, in-depth learning becomes easier. When integrated, mixed reality and AI accelerate worker training, shortening steps and supplying users with the working knowledge they need for the task at hand. 

Whether in consumer, enterprise, or healthcare markets, AI-powered AR smart glasses appear to be an emerging trend.  

Leading XR technology vendors appear to be merging AR and AI technology to build better products. The joint investment also supports the background AI projects that Meta, Microsoft, and Apple are working on.  
