Eye-tracking Glasses Give Glimpse into Professional Pianist’s Perception


For immersive technologies like AR and VR to be effective, they need to understand not only our world, but also us. Yet we don’t understand ourselves all that well either; as we capture more biometric information on our quest to lose ourselves in augmented and virtual realities, we’re learning more about ourselves too.

Last month we wrote about a Disney Research project which used virtual reality to study human perception. The task—catching a ball—was simple, but the reality behind how the brain and body coordinate to get a simple task done can often confound our intuition.

Now Function has teamed up with eye-tracking specialists Tobii to equip a professional pianist with eye-tracking glasses and see what happens (watch the video heading this article). Pianist Daniel Beliavsky sets about playing some complex arrangements, and the results are quite interesting, revealing just how much we have yet to learn about our own perception.


In the video, shot from Beliavsky’s perspective, a yellow dot shows where his eyes are fixed. Rather than looking constantly at his hands to ensure each note lands correctly, his eyes dart about, often leading his hands to their next location on the keyboard, but also frequently returning to the point between his hands where his peripheral vision can gather information about both at the same time.

This video highlights not just the potential usefulness of eye-tracking as an input for immersive technologies, but also the challenge of using the raw data to understand intent. While eye-tracking input works well for passive applications like foveated rendering, using it for interaction is a much more complex problem.
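
To make the “raw data” part concrete: what comes off glasses like these is essentially a stream of timestamped gaze points, which typically has to be split into fixations (where the eyes dwell) and saccades (the rapid darting seen in the video) before it’s useful for anything beyond a moving dot. The sketch below shows one common way to do that, a dispersion-threshold fixation detector; the data format, coordinates, and threshold values are illustrative assumptions, not Tobii’s actual API or the pipeline used to produce the video’s yellow dot.

```python
# A minimal sketch of dispersion-threshold (I-DT) fixation detection.
# The GazeSample format, normalized coordinates, and thresholds are
# assumptions for illustration; this is not Tobii's pipeline.

from dataclasses import dataclass
from typing import List

@dataclass
class GazeSample:
    t: float  # timestamp in seconds
    x: float  # horizontal gaze position, normalized [0, 1] in the scene camera
    y: float  # vertical gaze position, normalized [0, 1]

@dataclass
class Fixation:
    t_start: float
    t_end: float
    x: float  # centroid of the samples in the fixation
    y: float

def detect_fixations(samples: List[GazeSample],
                     max_dispersion: float = 0.03,
                     min_duration: float = 0.10) -> List[Fixation]:
    """Group consecutive samples into fixations when gaze stays within a small
    spatial window for long enough; everything else is treated as saccades
    (the rapid darting movements visible in the video)."""
    fixations: List[Fixation] = []
    i = 0
    while i < len(samples):
        j = i
        # Grow the window while the samples remain tightly clustered.
        while j + 1 < len(samples):
            window = samples[i:j + 2]
            xs = [s.x for s in window]
            ys = [s.y for s in window]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            j += 1
        if samples[j].t - samples[i].t >= min_duration:
            window = samples[i:j + 1]
            fixations.append(Fixation(
                t_start=samples[i].t,
                t_end=samples[j].t,
                x=sum(s.x for s in window) / len(window),
                y=sum(s.y for s in window) / len(window),
            ))
            i = j + 1  # continue after the fixation
        else:
            i += 1  # too short to count as a fixation; slide the window forward
    return fixations
```

Dispersion-based grouping is used here purely because it’s easy to follow; velocity-based detectors are an equally common choice.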

Using the input from Beliavsky’s eyes alone, it may be possible to predict his next likely moves, but because of the complexity of how the brain, hands, and eyes interact in this case, doing so may prove extremely difficult.


If a computer had to guess what was most important to Beliavsky while his gaze was fixed between his hands, it might guess the keys between his hands. In fact, Beliavsky was relying on a combination of muscle memory and peripheral visual cues to make his performance work, and at times the point at which his eyes were directly fixated wasn’t important at all. The raw data in this case obscures the user’s intent, and understanding intent remains a major challenge for making biometrics useful beyond simple passive input and data collection.
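
To see how easily that raw data can mislead, consider the naive heuristic “the user’s target is whatever their gaze is nearest to.” The sketch below applies it to the between-the-hands fixation; the key positions and gaze value are invented for illustration, not measurements from the video.

```python
# A toy illustration of the naive "target = nearest thing to the gaze point"
# heuristic. Key positions and the fixation value are made up for illustration.

def nearest_key(fixation_x: float, key_centers: dict) -> str:
    """Assume the user's target is the key closest to the gaze point."""
    return min(key_centers, key=lambda k: abs(key_centers[k] - fixation_x))

# Hypothetical normalized horizontal positions of a few keys under each hand.
key_centers = {"C3": 0.30, "E3": 0.35, "G4": 0.62, "B4": 0.68}

# The pianist parks his gaze roughly midway between his hands...
fixation_between_hands = 0.48

# ...and the heuristic confidently names a key he has no intention of pressing,
# because the fixation was about peripheral monitoring, not the gaze point itself.
print(nearest_key(fixation_between_hands, key_centers))  # -> "E3"
```

A real system would need far more context, such as hand positions, timing, and the music itself, before gaze alone could be read as intent.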

The more we learn about human perception, the more deeply our virtual and augmented worlds will be able to immerse us. One day it may be feasible to fully emulate the inputs and outputs of human perception, but today we’re just at the beginning of that journey.

  • Very interesting, enjoyed that. Would be great to see this working with all sorts of other sports and arts too. Snooker would be a good one: being able to visualise the muscle memory process, like looking at the ball, locking your trajectory, looking at where you want to aim, and the crucial point when this all maps out in your head before you actually cue the ball.

    • benz145

      Yes I’d love to see that!

  • Casey

    Is that a VR or AR headset in the form factor of a pair of glasses?

    • benz145

      It’s neither; there’s no display, just the hardware necessary to calculate the gaze direction of each eye.

      However, reaching that form factor is expected to become possible in the future:

      http://www.roadtovr.com/lumus-maximus-55-degree-field-of-view-thin-optics-augmented-reality/

    • Anders Öhlund

      I believe they are only glasses that record where you look and a camera that records what you see. So no VR or AR at all. They are made for research into human vision.

  • Anders Öhlund

    This is really interesting. I got a quick overview of Tobii when I was there for a job interview last year, and they talked about the research their eye-tracking has been used for. Common among most tasks is that when you are a beginner you look at what you are doing: in tennis you look at the ball coming at you, and in piano you look at your hands. But the more experience you have, the more you look at what you are going to do next. Which makes sense.

  • Really interesting experiment…