
LooxidVR is a virtual reality headset that aims at analyzing your emotions

When I wrote my post about CES 2018, I briefly talked about a headset called LooxidVR that aims to “read the emotions” of the user by employing a combination of EEG sensors and eye tracking. LooxidVR has also won an Innovation Award at CES.

The device has won the Innovation Award at CES (Image by Different Impulse)

Today we are approaching CES 2019, and I have had the enormous pleasure of talking with Brian Chae, the CEO of Looxid Labs, about his headset and the importance of brain-computer interfaces. Enjoy his interesting words!


Hello, introduce yourself to my readers!

Hi guys (and girls), my name is Brian Yongwook Chae, CEO of Looxid Labs. I founded Looxid Labs, a biometric-data-based AI start-up, in 2015.

What is Looxid Labs? Describe your product in detail

Looxid Labs is an AI start-up aiming to integrate an emotion recognition algorithm into HMD systems through eye-tracking and brainwave signals. Our goal is to become the leading authority in HMD-user emotion recognition and analysis by developing emotion-aware AI connected to the human eye and brain in VR.

In order to provide an interface for both the eyes and the brain, we developed our product LooxidVR, the world’s first mobile-powered VR headset combined with eye-tracking cameras and brainwave sensors. Our VR system enables researchers to directly build VR environments through our Unity SDK and to track and detect physiological signals.

Our product LooxidVR has three key features:

  • Hardware usability: What’s unique about LooxidVR is that it helps researchers integrate VR into their research in the simplest way. No longer do researchers need to invest time and energy into setting up different types of equipment (VR headsets, EEG sensors, and eye-tracking cameras) for each purpose. LooxidVR enables them to collect the user’s eye and brain data simultaneously during a VR experience. LooxidVR offers a solution with its comfortable design – headset foam pads, dry EEG electrodes, and built-in eye-tracking cameras. By simply wearing our headset, subjects can be fully immersed in the experiment setting, making the output of the research much more accurate and reliable.
  • Automatic Time Synchronization: LooxidVR facilitates time-synchronized acquisition of eye and brain data, as well as VR content and interaction data (Unity event logs). Once researchers in cognitive neuroscience, psychology, and BCI fields create VR content in the Unity platform, LooxidVR enables them to easily get their subjects’ time-synchronized eye and brain data through our Unity SDK. In particular, by simply setting the start/end times for experiments, the user’s raw eye and brain data are synchronized with the time when the actual stimuli are presented (a minimal sketch of this kind of alignment follows this feature list). During the experiments, the eye and brain data transmitted from LooxidVR are displayed on the screen of the LooxidVR application, which helps researchers easily check and manage the user’s status as well as the progress of their experiments.
The interface from which the researcher can access all brain and eye data (Image by Looxid Labs)
  • Easy-to-use API: LooxidVR helps researchers utilize their own VR contents/applications in their human behavior research by offering an easy-to-use API, which lets them easily connect their apps with LooxidVR according to their needs. Researchers can implement reliable data acquisition software into their experiments through our API. In addition, our API allows researchers to store the subjects’ eye and brain data on their desktops over a USB connection in real time. Through this API, they can directly access the subjects’ raw eye-tracking and EEG data recordings together with the event logs. Furthermore, researchers who buy LooxidVR are notified of any further API updates.
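To make the time synchronization described above more concrete, here is a minimal sketch, in Python, of how raw eye and brain samples could be grouped by the stimulus that was on screen using shared timestamps. It is purely illustrative: the sorted (timestamp, value) sample format and the (onset, stimulus, duration) event format are my own assumptions, not the actual Looxid SDK.

```python
# Hypothetical illustration of timestamp-based alignment; not the actual Looxid SDK.
# Assumes samples are (timestamp_ms, value) tuples sorted by timestamp, and Unity
# event logs are (onset_ms, stimulus_name, duration_ms) tuples.
from bisect import bisect_left, bisect_right


def slice_by_time(samples, t_start, t_end):
    """Return the samples whose timestamps fall inside [t_start, t_end]."""
    times = [t for t, _ in samples]
    return samples[bisect_left(times, t_start):bisect_right(times, t_end)]


def align_to_events(eeg_samples, eye_samples, unity_events):
    """Group raw EEG and eye data by the stimulus that was being presented."""
    epochs = []
    for onset, stimulus, duration in unity_events:
        epochs.append({
            "stimulus": stimulus,
            "eeg": slice_by_time(eeg_samples, onset, onset + duration),
            "eye": slice_by_time(eye_samples, onset, onset + duration),
        })
    return epochs
```

With data organized this way, each epoch of physiological data is tied to the exact VR stimulus that produced it, which is the property the researchers need for their analyses.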
Full specifications of the headset (Image by Looxid Labs, click to enlarge)
How does the setup process of the headset work?

You can find the setup process of the headset on our how-to page, summarized below:

This is the high-level scheme of how a LooxidVR setup works. Keep in mind that the headset can’t work with all smartphones, but only with a few high-end ones (more or less the same ones that are compatible with ARCore) (Image by Looxid Labs)
It works using a smartphone… why haven’t you produced a standalone headset?

Instead of developing our own standalone headset, we’re developing a compatible VR EEG mask and eye-tracking sensors as an accessory kit for the Vive Pro and for standalone VR headsets such as the Oculus Go and Vive Focus. It will be coming next year, and we will keep you updated on the release of our accessory kit.

How does the emotion detection process work?
  1. We seamlessly measure VR users’ brainwaves and capture infrared camera images of their eyes as soon as they put on our VR headset LooxidVR, to which the brainwave sensors and eye-tracking cameras are attached.
  2. The raw physiological data measured by LooxidVR is preprocessed using brainwave noise-reduction and pupil-detection algorithms. The preprocessed information is sent to the connected desktop.
  3. A machine learning algorithm is applied to the preprocessed information to detect the users’ emotional states (a sketch of this kind of pipeline follows the image below).
  4. Through the analyzed data in the data dashboard, our clients can see where the users have looked and how they felt while experiencing VR content.
Headset connected to the PC for analysis of eye and brain data (Image by Looxid Labs)
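The steps above outline a fairly standard physiological-computing pipeline. As a rough idea of what steps 2 and 3 could look like, here is a hedged Python sketch that turns an EEG epoch and a pupil-diameter trace into features (band power plus pupil statistics) and feeds them to a classifier. Looxid Labs has not published its actual algorithm, so the sampling rate, frequency bands, feature set, and classifier used here are all assumptions made for illustration.

```python
# Illustrative sketch of an emotion-detection pipeline: EEG band-power features plus
# pupil features fed to a classifier. Not Looxid's actual algorithm; the sampling
# rate, bands, and model choice are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

FS = 250  # assumed EEG sampling rate in Hz


def band_power(channel, fs=FS, band=(8.0, 13.0)):
    """Mean spectral power of one EEG channel in a frequency band (default: alpha)."""
    freqs = np.fft.rfftfreq(len(channel), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(channel)) ** 2
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[mask].mean()


def extract_features(eeg_epoch, pupil_trace):
    """eeg_epoch: (channels, samples) array; pupil_trace: 1-D pupil-diameter array."""
    eeg_feats = [band_power(ch, band=b)
                 for ch in eeg_epoch
                 for b in [(4, 8), (8, 13), (13, 30)]]  # theta, alpha, beta
    pupil_feats = [pupil_trace.mean(), pupil_trace.std()]
    return np.array(eeg_feats + pupil_feats)


# Usage idea: train on labelled epochs, then predict the state of new ones.
# X = np.stack([extract_features(e, p) for e, p in training_epochs])
# clf = RandomForestClassifier().fit(X, labels)
# state = clf.predict([extract_features(new_eeg, new_pupil)])
```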
Are there APIs for interested devs?

We provide a developer’s guide that covers how to develop VR applications in Unity, easily store and manage the physiological data, and implement timestamp-based data synchronization.

You can find the detailed information at https://looxidlabs.com/looxidvr/docs/unity-sdk/quickstart-for-looxidvr-sdk-for-unity-with-android/
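As a rough illustration of the “store and manage the physiological data” part, here is a minimal Python sketch that writes incoming time-stamped records and event logs to CSV files on the desktop. The single `records` stream and its (kind, payload) format are hypothetical conveniences for this example; the real LooxidVR API and data format are documented in the guide linked above.

```python
# Minimal, hypothetical data-recorder sketch; not the actual LooxidVR API.
import csv
import time


def record_session(records, out_prefix="session"):
    """Write incoming records to CSV, stamping each one with a shared clock.

    `records` yields ("eeg" | "eye" | "event", payload) tuples coming from the
    headset link; payload is a value for data records or a string for events.
    """
    with open(f"{out_prefix}_data.csv", "w", newline="") as data_f, \
         open(f"{out_prefix}_events.csv", "w", newline="") as event_f:
        data_writer = csv.writer(data_f)
        event_writer = csv.writer(event_f)
        data_writer.writerow(["timestamp_ms", "kind", "payload"])
        event_writer.writerow(["timestamp_ms", "event"])
        for kind, payload in records:
            now = int(time.time() * 1000)  # one clock shared by both files
            if kind == "event":
                event_writer.writerow([now, payload])
            else:
                data_writer.writerow([now, kind, payload])
```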

Tell me some key use cases for which your device has been used         

Our device has been used to develop emotion AI for biometric-information-based marketing (to measure consumers’ preferences) and for mental care solutions (to assess employees’ job stress at work).

What kinds of things is it possible to detect using EEG sensors and eye tracking? And what kinds of things are not possible to detect?

Let me explain that with a few use cases:

  • Healthcare: For psychological or neurological disorders such as depression, ADHD, and Alzheimer’s disease, our device helps researchers broaden their research by tracking each patient’s mental state during VR therapy tasks, speeding up his/her treatment and recovery;
  • Education/Training: Our device contributes to measuring students’ / employees’ moment-by-moment engagement with VR educational/training content and their level of understanding, as well as offering valuable insights into an effective adjustment of their learning process;
  • Marketing and Consumer Research: Our device can be used to reveal genuine, spontaneous emotional responses of consumers to a new product or advertisement in a virtual setting, and thereby acquire more valuable insights into consumer behaviors and needs.
  • BCI research: Using our device and its eye-tracking and brain-activity analysis, researchers can evaluate and optimize the user experience of new interface prototypes in VR, a perfect test-bed where safety is maintained and expenses are reduced.
What is the future that you envision for BCI in general? And how do you answer the critics concerned about the possible misuse of the technology (privacy concerns, manipulation, etc.)?

BCI will not only transform human interactions with computers but also, more importantly, facilitate or even restore damaged hearing, sight, and movement. VR could act as a perfect test-bed for BCI before the interface is ever implemented physically. Also, skills or knowledge acquired in VR can influence real-world behavior and performance. When BCI is coupled with VR, BCI does not merely remain a medium that bridges assistive devices and the human brain. It can indeed facilitate therapy for all sorts of human suffering, from instilling a positive self-identity, to curing serious psychological disorders, to promoting physical rehabilitation, or even reverse neurodegenerative diseases such as dementia in the future. In other words, VR can not only allow the ergonomic evaluation of a BCI prototype and the transfer of such a BCI to the real world, but also relieve patients of both physical and psychological pain.

At its inception, BCI came into the spotlight for its befitting interface for patients who have almost no mobility. However, BCI now promises a wide array of applicability in education, marketing, security, and even games and entertainment, and Looxid Labs is poised to bring that future to life.

EEG sensors inside the LooxidVR headset (Image by Looxid Labs)

Looxid Labs developed the all-in-one mobile VR headset, LooxidVR, embedded with EEG sensors and eye-tracking cameras. LooxidVR helps researchers acquire the user’s robust EEG signals and trace his/her pupil dilation and saccades in time series. Above all, LooxidVR provides time synchronization of eye-tracking data, EEG signals, and the VR content, sparing the effort of adjusting the time for each modality.

Most BCI research aims to provide solutions for vulnerable populations, such as individuals with brain injuries or disorders, including stroke patients, sufferers of cerebral palsy, and locked-in patients, contributing to the public good. That said, in terms of the usage of BCI, there are some criticisms about security and privacy issues related to the misuse of, and discrimination based on, BCI data. It’s not far from the double-edged sword of social media. Just as online privacy and security is a shared responsibility among government, industry, and users, individuals, companies, the tech industry, and the government should all do their part to increase privacy and security for BCI.

What is the price of this marvel? Are there any kind of discounts for interested devs?

The original price of this product is $2,999.99. For those who are interested in our product LooxidVR, we hosted the LooxidVR Happiness Challenge to encourage and support research and business ideas that bring happiness to the world by providing developers or researchers with a LooxidVR. The closing deadline is December 17, 2018 (EST). Please visit our website https://looxidlabs.com/looxidvr/happiness-challenge/ for further details!

Happiness for everyone! (Image by Looxid Labs)
What would you advise people wanting to enter the fascinating brain-computer interface field?

VR will be a game changer for brain-computer interfaces (BCI). VR can provide a mediated yet immersive environment to alter the behavior and cognition of an individual, and BCI will enable the individual to control devices and enter text in VR, as in the typing-by-brain project Facebook announced. By combining BCI with VR, developers can implement seamless control of the virtual reality system.

What are your future plans?

Our goal is to become the leading authority on HMD-user emotion recognition and analysis based on human physiological data, by developing emotion-aware AI connected to the human eye and brain on VR/AR/MR platforms.

In pursuit of our goal, we will enhance the accuracy of our machine learning algorithm by collecting a vast amount of emotion data in the VR environment and use that information in various industries such as BCI, healthcare, marketing research, and gaming.

LooxidVR headset photo and features (Image by Looxid Labs)
Anything else you want to add to this interview?

We have been working to develop reliable B2B solutions for the marketing and healthcare industries. The core value of the Looxid Labs solution is in discovering VR users’ unspoken emotions through biological responses measured during a VR experience. Based on the brain and eye data collected, an emotion AI learns about the user’s emotional status. Then, this model can be applied to VR content that a client submits for analysis. Please stay tuned for our updates regarding our emotion AI solution that provides quantitative insights into VR users’ emotions!


Mind blowing, isn’t it? A device that can read your emotions in VR and that can have huge applications in psychology, in rehabilitation and in the well-being of workers. You know that I am a big fan of BCI, so I’m really intrigued by LooxidVR. As I discussed with the fantastic Donald Greenberg, I am also afraid of the misuse of these technologies, and I hope that next generations will be educated in using them properly.

LooxidVR looks very interesting and of course, given its price, it is a product for enterprises. So, if your company can find benefit in using it, I advise you to contact Looxid Labs directly or to participate in their contest to get a free headset!

And I have some mind reading powers too, and I can feel that you want to subscribe to my newsletter by filling the form below… 😉



Disclaimer: this blog contains advertisement and affiliate links to sustain itself. If you click on an affiliate link, I'll be very happy because I'll earn a small commission on your purchase. You can find my boring full disclosure here.

2 thoughts on “LooxidVR is a virtual reality headset that aims at analyzing your emotions”

  1. Sounds like an interesting device for doing research on psychology, phobias and anxiety treatment, etc. I’m wondering how well it does with the eye-tracking thing for a random user. We already know that having reliable eye-tracking that works for everyone is tough as hell. Do you know if the eye-tracking module is custom made or if it’s just one from the world-leading companies in that area (Tobii, 7Invensun, etc.)?

    Also, I’ve always wondered what kind of EEG data these kinds of devices provide in the end. I mean, what do they refer to with “mental/emotional states” data? Is it just the raw data coming from the EEG sensors, so that it can be analyzed by a neuroscientist, or is it actually data that even we developers could understand (I imagine data in the form of anxiety and stress levels, and so on)? I couldn’t see anything other than the raw data in the dashboard.

    (I need one of those Chinese capsule sofas btw)

    1. We want those Chinese sofas!

      Anyway, as far as I understood, currently the headset just gives raw data. But the company is working hard to create an AI that mixes the brain and eye data to come up with higher-level information regarding the emotions of the user.

