
From Brain-Computer Interfaces To Digital Humans: How These Technologies Are Bringing Us Closer To The Metaverse

Just how far away is full-dive VR?

Technology can augment the world around us: it can enhance the human experience and our capabilities, and extend our reality into digital and virtual worlds. As people flock online during quarantine, we find ourselves experimenting with new platforms, pushing immersive technologies to their limits, and collaborating in new ways, from eye-tracking and facial tracking to biometrics and brain-computer interfaces. But just how far are we from becoming one with the metaverse, and what can we learn about ourselves through sensory technologies?

Brain-Computer Interfaces

Brain-computer interface systems like the one behind Neurable’s VR game Awakening use an electrode-laden headband, connected to an HTC Vive HMD, to track brain activity. The software analyzes that data to determine what should happen in the game. In 2017, MIT Technology Review wrote that the technology could go mainstream within a year, picked up by VR arcade companies. That estimate proved optimistic: consumers and hardware alike have needed a longer transition and adoption phase.
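
To make the idea concrete, here is a minimal sketch of how an EEG-driven pipeline like this might turn headband readings into a game command. The sampling rate, frequency bands, and stand-in classifier are illustrative assumptions, not Neurable’s actual method.

```python
import numpy as np
from scipy.signal import welch

# Toy EEG-to-command pipeline. Channel count, bands, and the
# stand-in "classifier" are illustrative assumptions only.

SAMPLE_RATE_HZ = 256
BANDS = {"alpha": (8, 12), "beta": (13, 30)}

def band_powers(eeg_window: np.ndarray) -> np.ndarray:
    """Mean spectral power per band, per channel.

    eeg_window has shape (channels, samples), e.g. one second of data.
    """
    freqs, psd = welch(eeg_window, fs=SAMPLE_RATE_HZ, axis=-1)
    features = []
    for low, high in BANDS.values():
        mask = (freqs >= low) & (freqs <= high)
        features.append(psd[:, mask].mean(axis=-1))
    return np.concatenate(features)

def to_game_command(features: np.ndarray) -> str:
    """Stand-in for a classifier trained on labeled calibration data."""
    return "select" if features.mean() > features.std() else "idle"

# Exercise the pipeline on one second of simulated 8-channel EEG.
window = np.random.randn(8, SAMPLE_RATE_HZ)
print(to_game_command(band_powers(window)))
```

A real system would replace the threshold with a model fit on per-user calibration sessions, but the flow, raw signal to spectral features to discrete game event, is the same.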

CTRL-Labs takes another approach to HCI (human-computer interfaces), using a simple electrode-studded wristband to read neural signals from the muscles of the arm. In June 2019, CTRL-Labs acquired the patents behind Myo, a similar wearable created by North that enabled control of robotics and PCs via gestures and motion. Shortly after, in September 2019, CTRL-Labs was itself acquired by Facebook, joining the Facebook Reality Labs team to develop the technology as a consumer product. With Oculus having launched hand tracking on the Quest this past December, I wonder how far we are from consumer-ready mind-control systems.
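
The wristband approach rests on surface electromyography (EMG): muscle activity produces electrical signals that can be enveloped and classified. Below is a toy sketch of that general technique; the window size, threshold, and gesture labels are assumptions for demonstration, not CTRL-Labs’ implementation.

```python
import numpy as np

# Toy surface-EMG gesture detector. Window size, threshold, and
# gesture labels are assumptions for demonstration.

def rms_envelope(emg: np.ndarray, window: int = 64) -> np.ndarray:
    """Root-mean-square envelope of a 1-D EMG signal."""
    kernel = np.ones(window) / window
    return np.sqrt(np.convolve(emg.astype(float) ** 2, kernel, mode="same"))

def detect_gesture(envelope: np.ndarray, threshold: float = 0.5) -> str:
    """Map the activation envelope to a coarse gesture label."""
    return "fist_clench" if envelope.max() > threshold else "rest"

emg = 0.05 * np.random.randn(2048)        # quiet baseline signal
emg[1000:1300] += np.random.randn(300)    # burst of muscle activity
print(detect_gesture(rms_envelope(emg)))  # -> "fist_clench"
```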

Perhaps the most widely known mind-control system is Elon Musk’s Neuralink. It is by far the most invasive technique, involving the insertion of 96 flexible threads into the brain with micron precision. It’s a risky business, but no longer science fiction. The paper “An integrated brain-machine interface platform with thousands of channels,” by Elon Musk and Neuralink, offers great insight into how such a system could replace typing, clicking, or even talking as a form of digital telepathy.

Haptics

Moving from mind to body, the Teslasuit is a non-intrusive HCI system centered on a suit material that uses a piezoelectric EAP (electroactive polymer) to record the electrical signals generated by a player’s muscles and animate their avatar in VR. As haptic and biometric sensing technologies become more refined, more sophisticated interfaces will begin to emerge.

Low-profile tactile, haptic, and thermal displays for a variety of applications are already in development. At the Cutaneous Sensory Lab in MIT’s Department of Mechanical Engineering, researchers are conducting psychophysical studies to determine how users perceive these forms of stimulation. The 2019 study “Closed-Loop Haptic Feedback Control Using a Self-Sensing Soft Pneumatic Actuator Skin,” published in Soft Robotics, presented another wearable solution: a soft, flexible artificial skin made of silicone and electrodes that monitors its own deformation to deliver accurate haptic feedback to the user’s body.
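
The “closed-loop” part of that study is classic feedback control: the actuator senses its own state and corrects toward a target. Here is a deliberately simple sketch of that loop, with an assumed first-order plant model and PI gains that are illustrative only, not the study’s actual controller.

```python
# Toy PI control loop: a self-sensing actuator reads its own pressure
# and adjusts its drive signal toward a target. Gains and plant model
# are illustrative assumptions.

def simulate_closed_loop(target_kpa: float, steps: int = 50) -> float:
    kp, ki = 0.4, 0.05   # assumed proportional and integral gains
    pressure = 0.0       # actuator state, read back via self-sensing
    integral = 0.0
    for _ in range(steps):
        error = target_kpa - pressure        # sensed tracking error
        integral += error
        drive = kp * error + ki * integral   # PI control signal
        pressure += 0.5 * drive              # crude actuator response
    return pressure

print(f"settled pressure: {simulate_closed_loop(10.0):.2f} kPa")
```

The self-sensing skin removes the need for a separate sensor layer: the same soft structure both actuates and reports its state, closing the loop on the wearer’s body.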

Among products on the market today, the HaptX Gloves, with their acute level of tactile feedback, have remained an all-time favorite of mine for realistic touch in VR. They let users feel the shape, texture, and motion of virtual objects through 130 points of force feedback. Size, weight, impact, and even temperature (with a bulkier version of the standard enterprise glove) are possible, and truly astounding. In January 2020, HaptX announced a $12M Series A financing round.

AR Lenses 

AR glasses like Nreal and MAD Gaze signal a shift away from clunky VR HMDs toward a viable consumer option for day-to-day, public, and collaborative use. Nreal’s partnership with Clay AIR, announced in March 2020 to integrate hand tracking and gesture recognition, marks another greatly anticipated move toward intuitive interactivity that blends the real and virtual worlds beyond the Oculus ecosystem.

Just as with the miniaturization of haptic products, AR headsets are seeing their next-generation counterparts enter the consumer market in the form of AR contact lenses. These include InWith Corporation’s blink-powered lenses and Mojo Vision’s contact lenses; Mojo Vision raised $58 million in a Series B investment round in March 2019.

While VR can be traced back to the mid-20th century, it was only in 2012 that Palmer Luckey launched the Oculus Rift Kickstarter. Less than a decade later, we have a healthy immersive ecosystem in which AR is quickly becoming ubiquitous.

Eye-Tracking 

Recently, Antony Vitillo of The Ghost Howls spoke with Lars Bergstrom, Mozilla’s Director of Engineering for Mixed Reality, on the subject.

“Eye-tracking has the potential to expose an individual’s intrinsic characteristics, such as race, age, gender, and sexual preferences, as well as revealing sensitive health characteristics such as whether they have autism or disorders like anxiety and depression,” said Bergstrom. “We have very little conscious control over our eyes, so the idea of providing unfettered access to data that can reveal all of this information (as well as provide unique user fingerprinting) is antithetical to our values of treating privacy as a first-class citizen. Instead of providing access to raw gaze data, the web browser can act as an opinionated user agent — an intermediary between the application and the device sensors and resources. In this example, perhaps we would only reveal that a user had looked at an element on a page if they’ve dwelled on it for a certain period of time.”
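
Bergstrom’s dwell-time example can be expressed as a small gating layer between the sensor and the application. The sketch below stands in for that intermediary; the 500 ms threshold and the event shape are assumptions, not a real browser API.

```python
# Toy "opinionated user agent" for gaze data: raw samples stay inside
# the gate; only a dwell event crosses to the application. The 500 ms
# threshold and event format are assumptions.

DWELL_SECONDS = 0.5

class GazeGate:
    def __init__(self):
        self.current = None   # element currently under the gaze
        self.since = None     # time the gaze landed on it

    def on_sample(self, element_id, now):
        """Consume a raw gaze sample; emit an event only on dwell."""
        if element_id != self.current:
            self.current, self.since = element_id, now
            return None                        # raw data never leaves
        if element_id and now - self.since >= DWELL_SECONDS:
            self.since = float("inf")          # fire once per fixation
            return {"type": "gaze_dwell", "target": element_id}
        return None

gate = GazeGate()
for t in [0.0, 0.2, 0.4, 0.6]:                 # simulated sample times
    event = gate.on_sample("buy_button", now=t)
    if event:
        print(event)                           # fires at t = 0.6 s
```

The design point is that the application never sees raw gaze coordinates, only the coarse fact that a dwell occurred, which is far harder to fingerprint.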

How will this type of data affect employment, healthcare, relationships, and broader life matters? What is your immediate reaction to such data being in the hands of private corporations and individuals? And what is the solution to data privacy?

Digital Humans 

As we stroll confidently through pixelated grassy fields, feeling, seeing, hearing, and interacting with environments as if they were real, we are beginning to encounter more advanced NPCs and bots that could be either human or machine. In January 2020, Soul Machines announced it had raised $40M for its AI-powered, customer-facing digital avatar technology, described on its site as “a Digital Brain that provides Digital Humans with the ability to sense, learn and adapt.” We are already encountering these digital humans worldwide, and witnessing major funding moves in this vertical during the newly declared pandemic. I include digital humans in this list of immersive and multisensory technologies because the way we interact with them, whether we know them to be digital or not, is as telling of our character as the way we interact with one another.

Synesthesia, Data & Dopamine 

In The Academy of the Senses, Frans Evers describes synesthesia as the ability for our senses to be “rerouted” via neuroplasticity: the brain’s capacity to change and adapt in response to new situations. This means we can feel through sound, hear through touch, and see through taste. Our senses are a fascinating playground for experiences that inform our sense of reality, and even if we don’t use all of the above technologies concurrently, there are endless possibilities for how our perception can be altered and enhanced.

As sensory and immersive technology moves us toward a far more personalized online experience, we will soon be able to craft profoundly compelling human experiences. And while we are largely desensitized to media consumption today, multisensory immersive technology will forever change how we measure the success of online campaigns and how we engage with the digital world.

Building Our Future 

As communities around the world experiment more heavily with online worlds and virtual meet-ups, we are laying the groundwork for the next era of digital humans: ourselves.

A shoutout to all those who have been actively collaborating in VR across the many WhatsApp XR chat groups.

About the Scout

Anne McKinnon

Anne McKinnon is an independent XR consultant and writer. She is actively engaged in innovation at the intersection of music, the arts, gaming and tech. Anne is US project lead for the band Miro Shot.
