
NextMind Review: select objects using your brain powers

You all know that I love Brain-Computer Interfaces, so I was very happy when NextMind offered to give me a sample of its brain-reading sensor to review here on my blog. NextMind was one of the most popular gadgets at CES 2020, and it is for sure one of the most interesting startups in the BCI landscape. But does its device live up to the hype? Well, let's find out together!

[PS Before starting, would you mind joining my Patreon to support my hard work in informing the XR communities with detailed reviews like this one?]

NextMind Video Review

As usual, I have also prepared a video review of NextMind, with a thorough analysis of its features and its SDK, plus a showcase of all the technical demos that the company offers together with the sensor. If you like detailed video reviews, this one is for you!

If you are all in for text instead, well, let's go on with the textual review.

What is NextMind, and what is it not?

Let's start by clarifying what NextMind is and what it is not. Since the startup started its marketing push at the beginning of this year, I've seen too many YouTube videos with titles like "THIS IS THE FUTURE", "BRAIN CONTROL IS HERE", or other bombastic things. Well, let me tell you that this is not exactly the truth. BCIs are still in their early stages, and even if they are evolving at a good pace, the technology for "brain control" is still not here. OpenBCI confirmed on this blog what Facebook's Bosworth had already said in another interview: we'll probably have very early brain-control technologies in our VR headsets around 3 years from now. So, at the moment, forget about mind control, and even about "mind reading", which will probably require many more years (decades?) to become a reality (and I know, Zuck can't wait for this).

NextMind is a device that lets you select items in a virtual scene with your brain. That's it: it tries to do one single thing and do it very well. It is a device conceived only for this purpose, and in fact it doesn't even offer what other current BCI devices do, that is, access to raw brainwave readings or the ability to detect the emotional state of the user (concentrated, relaxed, etc…). It just understands which of the objects in the scene in front of you is the one you are looking at, so that you can interact with it. Think of it as an eye-tracking device that, instead of looking into your eyes, reads inside your skull. It is a sensor for object interactions that can be used inside innovative UI/UX, in VR and outside of it. And by doing just this, it is already very cool.

Now that we've clarified what it is about, let's dig into all the details of NextMind!

Unboxing

I know that you're big fans of my unboxing videos, and of course, I have one for NextMind too. Shooting it was a bit of a pain: the first unboxing video got lost, so I re-shot it, and the second video had some parts mysteriously cut out (in fact, you can hear me just say "I'm Tony" and not "I'm Tony the Skarredghost", as usual). Thinking that it was a doomed video, I decided to leave it as it was. Probably it is something like "The Ring", and if you watch it, in 7 days you receive a sensor at home, I don't know 😀

The doomed NextMind Unboxing Video

The unboxing was very simple, because the box just contains the sensor, a head strap, and a USB cable. But I have to say that I liked it: the green and black colors that dominate the box suit the device, and the box is very elegant. Inside, everything is presented in a tidy, well-ordered manner, and it is also very easy to put everything back in the box in its original configuration for easy storage. The first experience is already very pleasant for the user.

Design

My mother says that NextMind looks a bit like a spider, and now I can't unsee it :). The sensor is all black, with a big circular body that contains the circuitry of the device, and then an arc that holds nine EEG sensors, which communicate their data directly to the "body".

Overview of the “spider”
Bottom view. All these tiny feet are the EEG sensors of the device

On the circular part, there is a USB-C port used to charge the device on one edge, and a button plus a status LED on another edge. The button is used to turn the NextMind on and off, and also to start the Bluetooth pairing.

Status LED and main button of the sensor
USB-C port. It is used just to charge the device, because all the communication is via Bluetooth

Specifications

These are the official specs of NextMind:

  • Dimensions: 135 x 66 x 55 mm
  • Weight: 60 grams
  • Headband size: 54 cm to 62 cm
  • Clip-on system: can be clipped directly on a headband, a cap, or a VR/AR headset
  • EEG: 9 high-quality electrodes
  • Battery: Lithium-Polymer battery (3.7 V 240 mAh)
  • Charging: USB-C, connected to a computer or any USB charger (5V DC)

The sensor is compatible with PC and Mac. Here are the specifications for a compatible PC:

  • Graphics: DX9 shader model 2.5, Intel HD 2500 equivalent
  • CPU: Intel i5-4590, AMD FX 8350 equivalent
  • RAM: 8GB
  • Bluetooth LE support (4.0)

As you can see, it doesn’t require a stellar computer to work. Of course, if you want to use it together with a VR headset, you have to own a VR-ready PC.

Notice that it is only compatible with PC and Mac, not with Android. This means that at the moment you cannot use NextMind with the Oculus Quest standalone (you can use it with Quest + Link, of course). I guess that some support may be added in the future (the Quest supports Bluetooth connections), but the runtime would have to be ported to Android and optimized for a mobile system.

Setup

Do you remember when, at the beginning, I told you that NextMind does one thing but does it pretty well? Well, the setup is exactly in line with this approach. Installing NextMind is very easy, in VR or not.

On the hardware side, the only thing you have to do (apart from charging the battery) is attach the sensor to the headband and put the headband around your head, making sure that the sensor sits on your nape. After you have done that, you have to gently move the device up and down so that its EEG sensors touch the skin of your head directly, and not your hair. This is necessary to guarantee a more reliable data read from your brain.

If you want to use it with a VR headset, you don't use the headband: you attach the sensor directly to the rear part of your headset, and then you fit it to your head, making sure that the NextMind device sits on your nape.

That's it for the hardware side: you basically just have to make sure that the device sits on your nape and the sensors touch your skin.

For the software side, you can download from the NextMind website an installer that configures everything. It also installs a hub with some pre-built demos, plus a utility that guides you step by step through your first setup. The software installation more or less just requires you to connect the NextMind sensor to your computer via Bluetooth, which is the only communication channel of this device.

Connecting the device via Bluetooth and verifying that all sensors are reading reliable data (Image by NextMind)

When you're set to go, you have to do a final step: the calibration. You have to launch an application that, first of all, verifies that the sensor is reading reliable data from your brain (if not, you're asked to move it slightly, because maybe it is touching your hair instead of your skin) and then makes you watch a pulsating circle for 40 seconds so that the runtime can calibrate its detection algorithm on your particular brain responses (more on this in the "How does it work?" chapter). When the calibration is over, you are given a score from 1 to 5: the higher the score, the better the data sensing of the runtime. If it is above 3, the results are acceptable and you can start using your brain superpowers. Notice that if you move the sensor too much, you have to re-calibrate it, so it's better to keep it still on your head.

Once you've done all of this (a matter of a few minutes), you are ready to go!

Comfort

Wearing the NextMind device (Image by NextMind)

Comfort is what I liked least about NextMind. The official headband has some velcro that lets you adjust its tightness, but the problem is that, to get a better brain reading and to avoid the sensor moving (which would invalidate the calibration), it is better to keep it tight rather than loose. But the EEG sensors have pretty "sharp" plastic edges, and they must touch the skin of your nape (so your hair can't act as a cushion). If you tighten the straps too much, you can clearly feel them on your head, and sometimes they can even cause a bit of pain.

When you use it with a headset, there is the same issue, plus the fact that it changes how you fit the headset to your head. To make the sensor sit on your nape, you have to put the rear part of the strap a bit lower than usual, so the front part (the headset box) tilts a bit upwards, leaving a bigger nose gap. Not to mention that it doesn't attach in a stable way to the rear part of the headset, so while wearing it you must be careful not to let the sensor slip away. This is what happened in my tests with the Quest 2.

So even if the device is very lightweight, I didn't have a very comfortable experience wearing it. Nothing dramatic, but I think this device should improve on the comfort side.

This is how the NextMind sensor sits on the back of a VR headset (Image by NextMind)

How does it work?

NextMind understands what you’re looking at by analyzing your brain response to flashing patterns.

Here the company explains how the sensor works

When you focus your attention on an object (so that it is on your fovea), your eyes send the information about what they're seeing to your brain. The brain processes this information in various areas, one of them being the "visual cortex". It has been observed that the visual cortex has identifiable reactions to different kinds of flashing visual patterns, and this can be used to detect what the user is looking at. That's why the NextMind sensor is on the back of your head: to be closer to the visual cortex and able to read the brainwaves it emits. Its nine EEG sensors have been designed to read the brainwaves emitted by your brain with the highest possible quality; the company has put a lot of effort into this.

So, let me give you an over-simplified explanation with completely made-up numbers to convey the concept of how this works (neurologists, please don't cry while reading this). Let's suppose I show your brain a pulsating pattern of lights that we call "A", for instance one where the light is on for one second and off for one second in a loop, and, reading with an EEG the brainwaves that come out of your visual cortex, I see that you emit a wave of 1 Hz. Then I show you a pattern "B", with the light on and off for half a second each, and I detect a brainwave of 2 Hz. Then I go on with "C" and I get 3 Hz, and so on. After all this study, a random light pattern appears, and I can't see which one it is. But from the brainwave sensor, I can read that your brain is reacting with a response wave of 2 Hz. Which light pattern was on? Of course, it was "B", because it was the one giving me 2 Hz during the tests.
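
To make the idea even more concrete, here is a minimal sketch in C# of this matching logic, using the same made-up frequencies as above. It is just an illustration of the concept: NextMind's actual algorithm is surely far more sophisticated than a nearest-frequency lookup.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Toy illustration of the matching idea, with made-up numbers:
// each flashing pattern gets associated during "calibration" with the
// brain response frequency it evokes; at "runtime" we pick the pattern
// whose calibrated frequency is closest to the measured one.
class PatternMatchingDemo
{
    static void Main()
    {
        // "Calibration": pattern name -> measured brain response in Hz
        var calibration = new Dictionary<string, double>
        {
            ["A"] = 1.0, // light on for 1s, off for 1s
            ["B"] = 2.0, // on/off every half second
            ["C"] = 3.0,
        };

        // "Runtime": the EEG reads a response of roughly 2 Hz...
        double measuredHz = 2.05;

        // ...so the user must be looking at the closest calibrated pattern.
        string lookedAt = calibration
            .OrderBy(entry => Math.Abs(entry.Value - measuredHz))
            .First().Key;

        Console.WriteLine($"The user is looking at pattern \"{lookedAt}\""); // -> "B"
    }
}
```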

Calibration sequence of NextMind. Notice how it makes you look at some circles that flash in a weird way. Those are the flashing patterns used for the detection of objects (Image by NextMind)

This is exactly how NextMind works. During the 40-second calibration stage, it shows you 12 flashing patterns you have to focus on; while you watch them, the system records the brainwaves that you emit. When the calibration is over and you start an application using the NextMind SDK, the system constantly reads your brainwaves, and when it detects a signal similar to one of those identified during the calibration, it understands that you're looking at the pattern corresponding to that signal, and so it knows what you're looking at. You can then select that object just with the power of your mind.

The calibration is necessary to learn your particular response to the flashing patterns: this response is a very subjective thing, which is why the calibration must be performed for every user. And not only that: it must be performed for every user and for the particular position the sensor has on his/her head. That's why if you remove the device, you have to calibrate it again: in the new position, the brainwaves read by NextMind may be different from the ones read in the previous position.

As you may have understood, this means that the system can't really understand what you're looking at in general. It doesn't know if you are looking at a chair or at the face of Johnny Knoxville (ah, the great times of Jackass). It just knows that you're looking at certain flashing patterns. If there are no flashing patterns in your apps, or better, no flashing patterns upon which the runtime has been calibrated, the NextMind sensor can't understand anything. This of course limits the generality of its use, but greatly increases its reliability. Again: "only one thing, but done well".

Performance

The EEG sensors installed on the device. NextMind claims to have created a very reliable type of electrode, and this is one of the biggest innovations of the company

After my hands-on sessions, I can confirm that NextMind detects the item I'm looking at with great accuracy. Most of the time, it was able to detect the item I was looking at even when it was surrounded by other flashing items. This means that it can reliably detect the object you're focusing on, even in a cluttered scene. This is a notable result and shows the high quality of this product.

But… there is also a drawback: latency. I think the system wants to be sure that you are really looking at an element before triggering it, and to do this, it applies heavy filtering to the data. This means that from when you start looking at an object to when it gets activated, you have to wait 1 second in the best case, if not 2-3. Interactions with NextMind thus become very slow, which means it can't be used in contexts where speed is fundamental (e.g. action games). There's no free lunch: if you want accuracy, you have to give up speed.

Battery

The battery duration is for sure satisfying (Image by NextMind)

The NextMind sensor is charged via its USB-C connector. A full charge takes 2 hours and then lasts for around 8 hours of usage, according to the specifications. The device manages the battery in a very smart way: it goes on standby after a few seconds of inactivity. This also means that it is better to check whether it is still on before putting it on your head.

(Here's a funny story: after I had experimented with the 2D apps, I removed the sensor, did some other things, and then installed it on my VR headset. I wore it, but I couldn't make the VR demos work, so I thought the sensor wasn't touching my skin enough, and I kept tightening the head strap of the Quest 2, still with no luck. In the end, the headband was so tight that the sensors were digging into my skull and hurting me, and still nothing worked. At that point, the pain made me realize that maybe the sensor was simply turned off. Damn battery management…)

NextMind vs Eye Tracking

At this point in the article, you may be asking yourself: if NextMind just lets you select objects with your eyes, why don't we simply use an eye-tracking device? Well, the Big Bang Theory guys would tell you "because we can".

As a techie, I recognize myself a lot in this clip…

Jokes aside, yes: for the selection task, especially in VR, an eye-tracking device would be cheaper (the Droolon F1 by 7Invensun is only $150), more reliable (because it tracks the eye directly), and also more versatile (it can track all the objects in the scene, not only the flashing ones).

But you should still consider NextMind as an interesting gadget because:

  • It can also work without VR (this is true for some eye-trackers too, though);
  • It is installed in a region of your head where it poses a lower risk of coronavirus contagion (so it is safer for exhibitions);
  • It could be employed in some healthcare studies, where you want to analyze the cognitive capabilities of the user;
  • It is a gadget that lets you start experimenting with brain-computer interfaces. BCIs are one of the big trends of the future, and understanding what they can and can't do is important for many IT professionals;
  • I guess that in the future its runtime could be updated to do more. The sensors read all the brainwave data, and at the moment this data is abstracted to just give you a selection index. A future runtime could offer more functionality, a bit like what happened with Leap Motion, which in 2012 was a rough accessory and is now an amazing hand-tracking device.

Hands-on with the demos

NextMind comes with many 2D demos: a breakout game, a platformer, a pin pad, a music creator, etc… all with an interface powered by your brain.

Playing with them, I was able to notice the pros and cons described above: the software is very reliable, but sometimes a bit slow. This was especially noticeable in the games: the breakout game (who remembers the mighty Arkanoid?) was a pain to play because I had to guess 3 seconds in advance where the ball would go, and stare at that point hoping that the paddle would move in time to save the ball (and usually this either didn't happen or happened at the last instant). With the platformer it was the same: killing an enemy with your gaze is a true pain. Things were better with UI interfaces (like the one to create music), where I had no time constraints and was just relaxing.

The breakout game also showed how stable the detection is, though: the blocks you can choose from are all very close to each other, but I very rarely had a misdetection.

Playing Breakout with NextMind is ultra-frustrating

Another issue I noticed in my tests was the so-called Midas Touch problem. For instance, while I was trying a demo, I mentally took a pause and started wandering with my eyes, but the device kept tracking me and kept considering my gaze as an input, activating random things. This should be avoided when designing a gaze-powered UX: it is normal for us to use our eyes just to wander around, and in my opinion they shouldn't trigger important actions on their own; for those, a confirmation through a hard trigger (e.g. a mouse click) should be mandatory.
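
Just to sketch what I mean, here is a minimal hypothetical Unity example (names like GazeConfirmSelector, OnGazeEnter, and OnGazeExit are mine, not from any SDK), where the gaze only highlights an object and a mouse click is required to actually activate it:

```csharp
using UnityEngine;

// Hypothetical sketch of a Midas-Touch-safe interaction: the gaze signal
// (coming from any source: an eye tracker, a NeuroTag event, etc.) only
// highlights the object; a hard trigger (mouse click) confirms the action.
public class GazeConfirmSelector : MonoBehaviour
{
    private bool isGazedAt;
    private Renderer objectRenderer;

    void Awake()
    {
        objectRenderer = GetComponent<Renderer>();
    }

    // Hook these up to your gaze source of choice.
    public void OnGazeEnter() { isGazedAt = true; }
    public void OnGazeExit() { isGazedAt = false; }

    void Update()
    {
        // Gazing only gives visual feedback...
        objectRenderer.material.color = isGazedAt ? Color.yellow : Color.white;

        // ...while the actual activation requires an explicit click.
        if (isGazedAt && Input.GetMouseButtonDown(0))
            Activate();
    }

    private void Activate()
    {
        Debug.Log(name + " activated: gaze + click confirmation");
    }
}
```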

The best demo app, and the one in which I think a device like NextMind shines, is a page where you have the 3D model of the NextMind plus some action points: when you hover over them with your eyes, you see more information about the device. This is one of the ideal scenarios because you just use your gaze to get more contextual info, exactly as you do in everyday life. Plus, you are in no hurry.

Killing some aliens with the VR game Mindvasion

The final demo I tried was Mindvasion, a VR action game where you shoot aliens by looking at their pulsating brains. This game showed me that it is possible to have pulsating patterns different from the flashing gray ones used in the 2D apps, and that it is also possible to "hide" the calibration in the game by telling the user that it is a "training". While he/she practices shooting during the tutorial, you can calibrate the device for that user and those flashing patterns. Mindvasion is a very simple game, and it is also very forgiving: the enemies are very slow to shoot, and they stop in a position that lets you look better at their heads and kill them within the long times required by the sensor. It is a nice experiment, but it confirmed once more that this is not an ideal sensor for gaming.

Unity SDK

The Unity SDK for NextMind is just fantastic. It comes with many ready-out-of-the-box prefabs and sample scenes, both for VR and 2D. It offers many facilities, and the one I loved the most is the simulator: when debugging your application, you don't need a real sensor; you can simulate your gaze with the mouse (the longer you keep the left button down, the more you are virtually looking at that object).

Implementing mind control in a Unity app just requires you to add a NeuroManager prefab to the scene, and then a NeuroTag script to every object that you want to track. The NeuroTag takes care of making the object flash so that it can be recognized when you look at it. By registering to the events of the NeuroTag (e.g. object triggered), you can then implement all the logic you want. It is incredibly easy to build your application: I needed just 2 minutes to develop my usual Unity cube, which becomes bigger when you look at it and returns to its original size when you look away.
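
To give you an idea of how little code is needed, here is a minimal sketch of that cube, assuming the NeuroTag exposes UnityEvents for when it is triggered and released (the namespace and event names below are my assumptions based on the description above; check the official SDK documentation for the real ones):

```csharp
using UnityEngine;
using NextMind.NeuroTags; // assumed namespace of the SDK

// Sketch of the "growing cube": attach this (plus a NeuroTag) to a cube in
// a scene that also contains the NeuroManager prefab. Event names are assumed.
[RequireComponent(typeof(NeuroTag))]
public class GrowingCube : MonoBehaviour
{
    private Vector3 originalScale;

    void Start()
    {
        originalScale = transform.localScale;

        var neuroTag = GetComponent<NeuroTag>();
        neuroTag.onTriggered.AddListener(OnLookedAt);  // assumed event name
        neuroTag.onReleased.AddListener(OnLookedAway); // assumed event name
    }

    private void OnLookedAt()
    {
        // The user is focusing on the cube: make it bigger.
        transform.localScale = originalScale * 1.5f;
    }

    private void OnLookedAway()
    {
        // The user looked away: restore the original size.
        transform.localScale = originalScale;
    }
}
```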

You can have a maximum of 10 active tags in the scene at the same time. And no, you don’t have access to low-level brainwave data.

Price and availability

NextMind is available in many regions worldwide through the shop on its website. The price is $400 + shipping.

Final verdict

NextMind is a cool gadget (Image by NextMind)

Brain-Computer technologies are still in their early stages, and you can't expect a device today to work like the NerveGear of Sword Art Online. For where we are now, NextMind is a very good device. The company decided to provide only one feature, selecting objects with your eyes, and to do it very well. And it has fulfilled this promise: the accuracy of the device is great, and so are its usability (the setup is very fast, the learning curve is very short) and the features offered to developers.

The only problem is that the detection is a bit slow, so I advise against using it in applications where time is a critical resource.

If you want to start experimenting with a high-quality, easy-to-use brain-computer device, this is the right one for you (provided that you have the money). Notice that I still see it as an experimental device: if you need a product to track the user's gaze in a professional setting, I would still advise buying an eye-tracking accessory. NextMind is more for those who want to experiment with the technologies of the future.


And that's it for my review of NextMind, which I hope has been useful in explaining whether this gadget suits your needs! If you have any further questions about this device, feel free to ask them in the comments below or contact me on my social media profiles!

And subscribe to my newsletter so you don't miss my next amazing articles…

(Header image by NextMind)


Disclaimer: this blog contains advertisements and affiliate links to sustain itself. If you click on an affiliate link, I'll be very happy because I'll earn a small commission on your purchase. You can find my boring full disclosure here.
