Facebook Reveals Latest Wrist-worn Prototypes for XR Input & Haptics


Facebook today revealed some of its latest research and vision for a wrist-worn input device that the company expects to form the basis of AR and VR interactions and haptics in the future. The device, still in a research prototype phase, senses electrical signals in the user’s arm to detect intentional inputs. In addition to functioning as a simple ‘button’ input, the company says the device can even enable accurate, keyboard-less typing, and more.

In a media briefing this week, researchers from Facebook Reality Labs Research shared some of their latest work in developing new input technology which the team believes will form the foundation of interactions for XR devices of the future. The group shared a concept video of what it believes will be possible with the technology.

Beyond the concept video, the researchers also discussed the work happening to bring it to fruition.

Input on the Wrist

Image courtesy Facebook

The Facebook researchers seem increasingly convinced that a wrist-worn controller is their best bet as an ‘always on’ wearable input device that can enable “ultra low friction” interactions for XR experiences.

Facebook has continued to build atop the wrist-wearable input technology it picked up in an acquisition of CTRL-Labs in 2019.

At the heart of the wrist controller are electromyography (EMG) sensors that detect the electrical signals which control the muscles in your hands. Rather than just coarse movements, the researchers say that EMG can be used to sense individual finger movements with precision down to one millimeter. In a video shared by the company, Facebook says the hand movements it shows are detected entirely with EMG.

While the near-term use-case of this kind of technology could be an ‘always available button’ which you can use to confirm choices presented to you through contextually relevant AR systems, the researchers say that further out it will be used for manipulating virtual interfaces and objects, and even for typing without a keyboard.
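
Assuming a simplified setup (the channel count, sampling rate, and thresholds below are invented, and the real CTRL-Labs approach relies on learned models rather than fixed thresholds), a minimal sketch of how an intentional ‘click’ could be detected from wrist EMG might look like this:

```python
# Illustrative sketch only: a toy "intelligent click" detector that windows
# multi-channel EMG samples and flags an intentional pinch when the smoothed
# signal energy crosses per-channel thresholds. The channel count, window
# size, and thresholds are assumptions, not Facebook/CTRL-Labs parameters.
import numpy as np

SAMPLE_RATE_HZ = 1000   # assumed EMG sampling rate
WINDOW_MS = 50          # analysis window length
N_CHANNELS = 16         # assumed electrode count around the wrist

def rms_envelope(window: np.ndarray) -> np.ndarray:
    """Root-mean-square energy per channel for one window of samples.
    window: shape (n_samples, n_channels) -> returns shape (n_channels,)"""
    return np.sqrt(np.mean(np.square(window), axis=0))

def detect_click(window: np.ndarray, thresholds: np.ndarray) -> bool:
    """Flag an intentional 'click' when several channels exceed their
    calibrated resting thresholds at once."""
    active = rms_envelope(window) > thresholds
    return int(active.sum()) >= 3   # require agreement across a few channels

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = SAMPLE_RATE_HZ * WINDOW_MS // 1000
    rest = rng.normal(0.0, 0.05, size=(n, N_CHANNELS))         # resting noise
    pinch = rest + rng.normal(0.0, 0.5, size=(n, N_CHANNELS))  # simulated muscle burst
    thresholds = np.full(N_CHANNELS, 0.15)                     # from a calibration pass
    print("rest:", detect_click(rest, thresholds))    # False
    print("pinch:", detect_click(pinch, thresholds))  # True
```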


“It’s highly likely that ultimately you’ll be able to type at high speed with EMG on a table or your lap — maybe even at higher speed than is possible with a keyboard today. Initial research is promising,” the company writes. “In fact, since joining FRL in 2019, the CTRL-labs team has made important progress on personalized models, reducing the time it takes to train custom keyboard models that adapt to an individual’s typing speed and technique.”

The researchers shared what is purported to be a live demo of this personalized keyboard model in action, using the wearable prototype to enable reasonably fast typing without a physical keyboard.
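
As a loose illustration of what a ‘personalized model’ could mean in practice (the feature representation, key set, and blending factor below are invented, not Facebook’s published method), a per-user calibration step might blend generic per-key EMG templates toward samples from the individual wearer:

```python
import numpy as np

class PersonalizedKeyDecoder:
    """Toy per-key decoder: classify a keystroke's EMG feature vector by the
    nearest centroid, after blending generic centroids toward the user's own
    calibration samples. Purely illustrative."""

    def __init__(self, base_centroids: dict[str, np.ndarray], blend: float = 0.5):
        # base_centroids: key -> generic EMG feature vector for that keystroke
        self.centroids = {k: v.astype(float).copy() for k, v in base_centroids.items()}
        self.blend = blend  # how strongly user calibration overrides the base model

    def calibrate(self, user_samples: dict[str, np.ndarray]) -> None:
        """Blend each key's generic centroid toward the mean of the user's own
        calibration samples (shape: n_samples x n_features)."""
        for key, samples in user_samples.items():
            user_mean = samples.mean(axis=0)
            self.centroids[key] = (1 - self.blend) * self.centroids[key] + self.blend * user_mean

    def decode(self, features: np.ndarray) -> str:
        """Return the key whose personalized centroid is nearest."""
        return min(self.centroids, key=lambda k: np.linalg.norm(features - self.centroids[k]))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    base = {"a": np.array([1.0, 0.0]), "b": np.array([0.0, 1.0])}  # 2-D toy features
    decoder = PersonalizedKeyDecoder(base, blend=0.5)
    # Calibration: this user's 'a' keystrokes look a bit different from the generic model.
    decoder.calibrate({"a": rng.normal([1.4, 0.2], 0.05, size=(20, 2))})
    print(decoder.decode(np.array([1.2, 0.1])))  # expected: "a"
```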

Beyond typing, the researchers say that being able to read finger movements from the wrist could also allow users to manipulate virtual objects. Another purportedly real demo shows this in action.

Further in the future, the team suggests, users may be able to train themselves to issue some of these commands without any physical movement at all.

Although the company says its wrist-worn device is a “neural” input device, it draws a distinction between neural input and “mind reading.”

“This is not akin to mind reading. Think of it like this: You take many photos and choose to share only some of them. Similarly, you have many thoughts and you choose to act on only some of them. When that happens, your brain sends signals to your hands and fingers telling them to move in specific ways in order to perform actions like typing and swiping,” the company writes. “This is about decoding those signals at the wrist—the actions you’ve already decided to perform—and translating them into digital commands for your device.”


Always-on Haptics

Image courtesy Facebook

As a device that could be comfortably worn all day, the researchers say, a wrist-wearable is also a great place to deliver haptics.

To that end, the company has been experimenting with different types of haptic technologies for the wrist.

One prototype, called ‘Bellowband’, lines the inside of the device’s wristband with quarter-sized bladders which can lie flat or inflate to put pressure on the user’s wrist. Different haptic effects can be achieved by inflating different combinations of the bladders or by pulsing them at different rates.
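
As a toy illustration of that idea (the bladder count, pressures, and timings below are assumptions, not Bellowband’s real parameters), distinct effects could be described as sequences of inflation steps:

```python
# Hypothetical sketch of composing haptic effects on a Bellowband-style strap:
# each effect is a list of steps naming which bladders inflate, to what
# normalized pressure, and for how long.
from dataclasses import dataclass

N_BLADDERS = 8  # assumed number of bladders lining the strap

@dataclass
class PulseStep:
    bladders: tuple[int, ...]  # which bladders inflate during this step
    pressure: float            # normalized 0.0 (flat) .. 1.0 (fully inflated)
    duration_ms: int

# Two example effects: a quick double tap and a slow sweep around the wrist.
DOUBLE_TAP = [
    PulseStep((0, 4), 0.8, 80), PulseStep((), 0.0, 80),
    PulseStep((0, 4), 0.8, 80), PulseStep((), 0.0, 200),
]
SWEEP = [PulseStep((i,), 0.6, 120) for i in range(N_BLADDERS)]

def play(effect: list[PulseStep]) -> None:
    """Print the pressure frame each step would send to a (hypothetical)
    pneumatic driver; real hardware would replace the print with I/O."""
    for step in effect:
        frame = [step.pressure if i in step.bladders else 0.0 for i in range(N_BLADDERS)]
        print(f"{step.duration_ms:4d} ms -> {frame}")

if __name__ == "__main__":
    play(DOUBLE_TAP)
    play(SWEEP)
```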

Another prototype, called ‘Tasbi’ (short for Tactile and Squeeze Bracelet Interface), uses six vibrating actuators around the wrist, along with a tension-based squeezing mechanism that can dynamically tighten and put pressure on the user’s wrist.
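
A similarly hypothetical sketch of a Tasbi-style effect combines a squeeze ramp with a vibration burst that travels around the band; the envelope shapes and values below are invented for illustration:

```python
# Invented example of a combined squeeze + vibration cue, not Facebook's
# published Tasbi parameters.
import math

N_ACTUATORS = 6  # Tasbi uses six vibrating actuators around the wrist

def grasp_effect(t: float) -> tuple[float, list[float]]:
    """Return (squeeze_force, per-actuator vibration amplitudes) at time t
    seconds for a simple 'virtual object grasp' cue: tension ramps up while
    a brief vibration burst travels around the band."""
    squeeze = min(1.0, t / 0.3)            # ramp to full squeeze over 300 ms
    active = int(t / 0.05) % N_ACTUATORS   # advance to the next actuator every 50 ms
    burst = math.exp(-t / 0.4)             # vibration fades as the squeeze settles
    amps = [burst if i == active else 0.0 for i in range(N_ACTUATORS)]
    return squeeze, amps

if __name__ == "__main__":
    for step in range(6):
        t = step * 0.1
        squeeze, amps = grasp_effect(t)
        print(f"t={t:.1f}s squeeze={squeeze:.2f} amps={[round(a, 2) for a in amps]}")
```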

The researchers say that prototypes like these help the company find out which kinds of haptic feedback technology may be worth pursuing.

Contextual AI

A major part of Facebook’s vision for AR and its “ultra low friction” input approach hinges on AI that can deeply understand the user’s context.

“The underlying AI has some understanding of what you might want to do in the future. Perhaps you head outside for a jog and, based on your past behavior, the system thinks you’re most likely to want to listen to your running playlist. It then presents that option to you on the display: ‘Play running playlist?’ That’s the adaptive interface at work,” writes FRL Research Science Manager Tanya Jonker. “Then you can simply confirm or change that suggestion using a microgesture. The intelligent click gives you the ability to take these highly contextual actions in a very low-friction manner because the interface surfaces something that’s relevant based on your personal history and choices, and it allows you to do that with minimal input gestures.”
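
For illustration only, the flow Jonker describes can be sketched as: map the current context to a candidate action, surface it as a suggestion, and commit it only when the user confirms with a microgesture. The context fields, rules, and actions below are hypothetical; a real system would learn them from personal history rather than hard-code them.

```python
# Hypothetical "adaptive interface" sketch: context -> suggestion -> microgesture confirm.
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Context:
    activity: str        # e.g. "jogging", "cooking", "commuting"
    hour: int            # local hour of day
    headphones_on: bool

# Ordered rules: first match wins. A real system would learn these from the
# user's personal history instead of hard-coding them.
RULES: list[tuple[Callable[[Context], bool], str]] = [
    (lambda c: c.activity == "jogging" and c.headphones_on, "Play running playlist?"),
    (lambda c: c.activity == "commuting" and c.hour < 10, "Read morning headlines?"),
]

def suggest(context: Context) -> Optional[str]:
    """Return the suggestion to surface on the display, if any rule matches."""
    for matches, action in RULES:
        if matches(context):
            return action
    return None

def on_microgesture(context: Context, user_confirms: bool) -> str:
    """The 'intelligent click': act on the surfaced suggestion only if the
    wearer confirms it with an EMG microgesture."""
    suggestion = suggest(context)
    if suggestion is None:
        return "no suggestion"
    return f"executing: {suggestion}" if user_confirms else "dismissed"

if __name__ == "__main__":
    ctx = Context(activity="jogging", hour=7, headphones_on=True)
    print(suggest(ctx))                # "Play running playlist?"
    print(on_microgesture(ctx, True))  # "executing: Play running playlist?"
```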


That’s largely conceptual for now. While today’s smartphones and smartwatches can leverage clues like time, location, and connected accessories to infer which actions might be relevant to you, the sort of contextual AI suggestions Facebook is envisioning will require both advances in AI and sensor-laden peripherals that can build a real-time understanding of the user’s immediate environment.

More Questions Than Answers on Privacy

Facebook says its goal for the far future of XR is to build technologies where “the human is the absolute center of the entire experience.”

Achieving the company’s vision will require hardware and software with an intimate understanding of both the user and their environment.

Facebook maintains that it’s committed to transparency throughout the development of these technologies, but admits that it isn’t equipped to assess the broader ethical questions on its own.

“Understanding and solving the full extent of ethical issues [raised by these technologies] requires society-level engagement,” says FRL Research Science Director Sean Keller. “We simply won’t get there by ourselves, so we aren’t attempting to do so. As we invent new technologies, we are committed to sharing our learnings with the community and engaging in open discussion to address concerns.”

Indeed, the company says that a major reason why it’s sharing this information today is to engage the broader tech community on these questions before it moves to take the technology out of the lab and into the market.

“[…] we support and encourage our researchers to publish their work in peer-reviewed journals—and [that’s] why we’re telling this story today. We believe that far before any of this technology ever becomes part of a consumer product, there are many discussions to have openly and transparently about what the future of HCI can and should look like.”



  • MeowMix

    wow

  • Cragheart

    Looks like the future. I’m concerned about privacy. And I don’t like monopolies. And current technology works poorly. Firstly, I couldn’t write this because the keyboard wouldn’t open, and then voice dictation was rubbish.


  • TechPassion

    The only company which cares about pushing VR/AR forward. The others are totally unserious.

  • wheeler

    Is this the Jedi Controller?

  • mellott124

    This is all cool work but it’s way way out. Like one person mentions, “…when it works…”, which currently is not most of the time. You can try some devices already out there. They have a long way to go to make it stable and consumer friendly. But it’s cool research.

  • Daryle Henry

    I’ll believe it when I see it working properly. VR is awesome, but I’d be lying if I said it lives up to what I always envisioned in the decades I was waiting for it to come to homes. I also have too many memories of getting pumped for some piece of tech that never arrived or when it did was not very good. Things like Sega Genesis VR, Virtual Boy, Kinect, Wii Remotes, Google Glass, Hololens, Magic Leap, et al. If it works I’m pumped but until then I’ll stay off the hype train.

    • calactyte

      I think that’s a pretty reasonable outlook. Sounds like you’ve been burned one too many times. I’m with you on everything you specifically mentioned. Magic Leap was the worst offender. I do feel that PSVR, Vive, and Valve Index have lived up to the hype. Half-Life: Alyx shows what’s possible. Now we just need more of that.

  • alxslr

    That’s cool for when you are out in the street, but for all the moments when you can be in front of a table, the mouse is still the most precise and friction-free input device that exists for interacting with information. You don’t have to fight gravity. You have millimetric precision, partly because your hand is resting and not engaged in that fight. And you have the best action input: a low-effort button in contact with the tip of your finger.

    Also, VR is 3D but interfaces and the display of information will mostly be 2D, due to the occlusion problem. So you can make a “Z-adaptable XY pointer” like the one in Windows Mixed Reality Home, which covers most cases of navigation and interaction with information.

  • alvaro verna

    awesome

  • Something to finally make the hand interface more reliable. I don’t see the tactile feedback working out, but just tracking the hands dependably is enough to roll these things out today.

    Right now, the hand interface on the Quest is more of a novelty than a useful tool. Just the simple action of pulling a trigger is terribly unreliable. At most it’s good for poking things, and even that only works most of the time. But I could build games around this device.