
7Invensun aGlass DK II unboxing, setup and review: a great dev kit for eye tracking in VR

In China I met 7Invensun, a company that is a worldwide leader in eye tracking. They were very kind and gave me an aGlass DK II eye-tracking add-on for the HTC Vive as a gift, and I used it to experiment with eye-tracking interfaces. After some days of usage, I think I can review this interesting device, so you can discover its pros and cons. So, keep reading to learn everything about the aGlass DK II!

Unboxing

The device comes in a black box with some Chinese writing on it and a big “aGlass” caption on top, with a G whose shape recalls that of the eye-tracking modules. Opening it, you’ll find everything you need to give your HTC Vive headset eye-tracking capabilities: two eye-tracking modules, some corrective lenses to use the device if you have vision problems (so you can avoid wearing glasses), a cloth to clean the lenses, the hub that connects the eye-tracking modules to the USB port on the Vive, and printed instructions. There is not much in the box, because you don’t need a lot of stuff to add eye tracking to your VR headset.

aGlass DK II box on top of HTC Vive one

I always comment on the beauty of the packaging, and this time is no exception: the items are packaged very well, and the foam keeps them safe, but I can’t say it is a delight for the eyes. This is still a dev kit, so that’s fine. For the consumer version, though, I hope they will invest a bit more in making the arrangement inside the box more appealing.

The device

As 7Invensun explained to me, each eye-tracking module is composed of a ring of IR emitters that illuminate the eye (in the darkness inside the headset) and an IR camera that sees the eye as lit by the emitters. The camera captures images at high frequency and sends them to the PC through the USB port on the front panel of the HTC Vive. On the PC, a service processes these images and returns the position of both eyes, so that you can use it in VR.

The eye-tracking module for the left eye installed inside my HTC Vive: all the tiny white points that you see are the IR emitters, while the black circle in the lower part of the image is the IR camera

If you want some specifications, here they are:

  • Accuracy: < 0.5°
  • FPS: 100/120 Hz
  • Delay: < 5ms
  • Communication Port: USB2.0/3.0
  • FOV: >110° (adapted to HTC Vive)
When photographed properly, you can see the IR emitters actually casting light
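To put that sub-0.5° accuracy figure into perspective, you can convert the angular error into a positional one with some simple trigonometry. A quick sketch in plain Python (not part of the aGlass SDK):

```python
import math

def gaze_error_radius(accuracy_deg: float, distance_m: float) -> float:
    """Worst-case positional error (in meters) of the gaze point
    on a surface at the given distance, for a given angular accuracy."""
    return distance_m * math.tan(math.radians(accuracy_deg))

# For a virtual panel 2 m away, a 0.5-degree angular error means
# the gaze cursor can land roughly 1.7 cm away from where you look
error = gaze_error_radius(0.5, 2.0)
```

This is also why small or distant UI elements are hard to select reliably with gaze alone: the farther the target, the larger the absolute offset of the cursor.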
Setup

Inside the box, you can find the instructions on how to assemble the device, both in English and in Chinese. The hardware setup is really easy. The only difficulty I had was inserting the two modules into the internal space of the Vive, which is very tight; once I understood that the trick is to move the two lenses as close together as possible using the IPD knob, everything went smoothly.

The setup basically consists of attaching the two eye-tracking modules onto the lenses and then connecting them to the USB port on the Vive. If you need an assembly tutorial video… well, here I am to help you!

After the hardware assembly, you just connect the Vive headset to your PC as usual. Then you go to the 7Invensun website and download the aGlass DK II runtime from the downloads page. If you plan to develop, download the SDK v2 archive (which contains the runtime, too); otherwise go for the runtime v3. I’ve provided the direct links to make your life easier, aren’t I the best VR blogger ever? 😀 😀 😀 Unzip the archive, launch the setup file, and follow the instructions to install the 7Invensun aGlass runtime.

At the end of the installation, the system will suggest starting the calibration from the aGlass menu. As I learned when I interviewed 7Invensun for the first time, calibration is currently a necessary stage for all eye-tracking devices. The system has to understand the peculiarities of your eyes, so before using it, you have to look at some predefined points so that the system learns how you look up, right, down, left, and so on. If you don’t calibrate, you can still use the system, but the accuracy won’t be optimal. Eye-tracking companies are working on tricks to reduce calibration time or eliminate it entirely, because it introduces friction, but for the moment you have to perform it.

My HTC Vive is now able to track my eyes! Wow!

Calibration is easy and it is divided into three steps:

  1. Adjust the IPD of your headset so that it matches your eyes. Since the eye-tracking system can track your eyes, it can also detect your IPD. Unluckily, Vive devices do not have an automated IPD adjustment system, so you have to change it by hand. Follow the instructions on the screen and rotate the IPD knob until the system says it is OK;
  2. In phase two, you have to move the headset to physically center your eyes within the lenses. The system shows you how the headset must be moved, so that you can adjust it with your hands. I have to say this is the trickiest phase, because I never really understood how to move the headset properly to fit my eyes, and when the system says it is OK, the headset is no longer resting on my face but sits a bit distant. The company is working on fixing this little issue… and anyway, it is already possible to skip this part by pressing the SPACE key on the keyboard;
  3. Look at the points that the system shows you on the screen. The system displays a short sequence of little points on the HMD screen, and you have to look at them until they vanish. These points are strategically positioned so that the system learns how you look in the various directions (up, down, left, right).
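Step 3 is where the actual per-user model is learned. I don’t know the exact model that 7Invensun fits, but the general idea of calibration can be sketched as a least-squares fit mapping the raw sensor reading to the known position of each calibration target. A toy one-axis example in plain Python (synthetic numbers, nothing from the real device):

```python
def fit_axis(raw, target):
    """Least-squares fit of target ≈ a * raw + b along one axis."""
    n = len(raw)
    mean_raw = sum(raw) / n
    mean_target = sum(target) / n
    slope = sum((x - mean_raw) * (y - mean_target)
                for x, y in zip(raw, target)) \
            / sum((x - mean_raw) ** 2 for x in raw)
    offset = mean_target - slope * mean_raw
    return slope, offset

# Synthetic session: the sensor reads the gaze slightly scaled;
# the targets are the known on-screen calibration points
raw_x    = [0.10, 0.30, 0.50, 0.70, 0.90]
target_x = [0.05, 0.28, 0.50, 0.72, 0.95]
a, b = fit_axis(raw_x, target_x)

# After calibration, raw readings are corrected before use
corrected = a * 0.50 + b  # a reading of 0.50 maps back to ~0.50
```

This also explains why skipping calibration still works, only less accurately: without the fit, the system falls back to a generic mapping instead of one tuned to your eyes.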

Apart from step 2, which has some little issues, the calibration is really easy and fast to perform: in 30-60 seconds you’re done. After that, you see a final screen that lets you verify whether the calibration has been successful: you are shown some points and, as you look at them, they become highlighted. If at this stage you see that the precision is bad, you can retrigger the calibration. Otherwise, press the Menu button on your Vive and exit. You have successfully set up your eye-tracking device!

After that, you will be able to see the aGlass tray icon in your taskbar. Right-clicking on it opens a menu with various options. One that I want to highlight is the possibility to re-trigger the calibration or to set the current user. The system lets you calibrate the device for different users on the same PC; then, just by using this options menu, you can load the calibration parameters of each user, so the device doesn’t have to be calibrated again even when different users wear it. Cool, isn’t it?

The menu that you can access through the taskbar: as you can see it offers various facilities regarding calibration
Comfort

I was afraid that the eye-tracking inserts could make the comfort of the Vive worse, but this is mostly not the case. You just feel a little discomfort in the area of the nose and between the nose and the eyes, because in that area there is now the hard plastic of the eye tracker instead of only the rubber of the headset. I have to say this is not a big issue at all, and I’ve gotten used to it. Of course, this should be improved in future iterations of the device.

I’m an eye tracking pirate!

The fact that the aGlass DK II features additional lenses for vision problems (presbyopia, myopia) that you can put on the modules, so that you can use eye-tracked virtual reality without wearing glasses, is a nice choice by the company that improves the comfort of using the device.

Runtime performances

If you want to test the device, you can get some demos from the aGlass downloads page. Otherwise, you can code something in Unity and run some tests yourself.

I’d define the tracking as pretty accurate but, as I noticed during my visit to the 7Invensun HQ in Beijing, it is still not perfect. Especially when you look far up, down, left, or right, the tracking accuracy degrades a bit. When you stay mostly in the central region of your vision, instead, the tracking is very accurate. This can sometimes be a bit frustrating: when a VR experience uses the eyes to select objects, it may happen that 5-10% of the time you look at an object and it doesn’t get selected, because the eye cursor is in a slightly different position than the one you would expect, so the selection mechanism doesn’t work. Regarding speed, instead, there is no problem at all: the system is very reactive to eye movements.
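A common way to mitigate these misses in gaze-based UIs is dwell-time selection with a generous hit radius: the object is selected only after the gaze has rested on it for a moment, which absorbs both jitter and small accuracy offsets. A minimal sketch in plain Python (illustrative names, not part of the aGlass SDK):

```python
class DwellSelector:
    """Select a target once the gaze stays within its (enlarged)
    radius for dwell_time seconds."""

    def __init__(self, dwell_time: float = 0.6, radius: float = 0.08):
        self.dwell_time = dwell_time  # seconds of steady gaze required
        self.radius = radius          # hit radius in normalized coords
        self._timer = 0.0
        self._current = None

    def update(self, gaze, targets, dt):
        """gaze: (x, y); targets: dict name -> (x, y); dt: frame time.
        Returns the selected target's name, or None."""
        hit = None
        for name, (tx, ty) in targets.items():
            if (gaze[0] - tx) ** 2 + (gaze[1] - ty) ** 2 <= self.radius ** 2:
                hit = name
                break
        if hit is not None and hit == self._current:
            self._timer += dt
            if self._timer >= self.dwell_time:
                self._timer = 0.0
                return hit  # dwell completed: fire the selection
        else:
            # Gaze moved to another target (or to nothing): restart the dwell
            self._current = hit
            self._timer = 0.0
        return None
```

Tuning the radius and the dwell time trades selection speed against robustness to that 5-10% cursor offset.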

I guess that you may be curious about a comparison with Tobii, but unluckily I have never tried Tobii’s tracking technology in VR, so I can’t tell.

I recall what Abrash said years ago: a mouse should work 100% of the time, not 90%, otherwise users get frustrated. I have to say that with eye tracking we are still in the latter situation… and that’s why eye tracking is still not installed on all headsets. But I think it can already be used to do amazing research on where the user is looking and to experiment with eye-tracked UX, as we are doing right now. So, it is dev-ready but not consumer-ready.

Unity SDK

The aGlass SDK contains plugins for Unity and Unreal. Using the Unity SDK is rather easy: you just have to check whether the runtime can track the eyes, ask for the eye position, and then use that position to do something. The SDK also contains a little sample to help you understand how to use it. I learned how to use it pretty fast. To get the eye ray, for instance, you just need these few lines of code:


        //check if eye tracking is valid at this frame
        IsEyeTrackingValid = aGlass.Instance.GetEyeValid();

        //if it is
        if (IsEyeTrackingValid)
        {
            //get current gazed 2D point
            Gaze2DPoint = aGlass.Instance.GetGazePoint().ToVector2();
         
            //transform it in a ray
            GazeRay = Camera.main.ScreenPointToRay(Camera.main.ViewportToScreenPoint(new Vector3(Gaze2DPoint.x, 1 - Gaze2DPoint.y, 0)));            

            //do something with the ray of the eyes
        }
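The SDK hands you a normalized 2D gaze point (apparently with the y axis growing downward, hence the `1 - Gaze2DPoint.y` flip), and Unity’s camera turns it into a ray. Outside Unity, that same conversion is just pinhole-camera math. Here is a language-agnostic sketch in Python; the FOV and aspect-ratio values are my assumptions, not SDK constants:

```python
import math

def viewport_to_ray(u: float, v: float,
                    fov_v_deg: float = 110.0, aspect: float = 1.0):
    """Camera-space direction of the gaze ray for a normalized
    viewport point (u, v) in [0, 1], pinhole-camera model.
    (0.5, 0.5) is the screen center; z points into the scene."""
    half_h = math.tan(math.radians(fov_v_deg) / 2.0)  # image-plane half-height
    half_w = half_h * aspect                          # image-plane half-width
    x = (2.0 * u - 1.0) * half_w
    y = (2.0 * v - 1.0) * half_h
    length = math.sqrt(x * x + y * y + 1.0)
    return (x / length, y / length, 1.0 / length)     # unit direction
```

Looking at the center of the screen yields a ray straight along the camera’s forward axis, which is exactly what `ScreenPointToRay` gives you in the Unity snippet above.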

In my opinion, the problem of the SDK is not what is there (which works, and works well), but what is not there. There are no facilities (e.g. to select objects using just your eyes), there are not many samples, and there are no additional services. At the moment it is just a barebones SDK that gives you the minimum necessary features. I think it could be improved to offer more features and make life easier for us developers. Again, maybe this will come with time, as the device approaches the consumer version.

A little analytics prototype that I made with the aGlass DK II inside Unity. It was fun and took me quite a little time to code it

While developing, I also noticed that sometimes the runtime crashes and has to be restarted, or that Windows shows some weird pop-ups about a disk it can’t access. Nothing terrible, but something you must be aware of. The company told me that some of these bugs have already been fixed in version 3 of the runtime.

Final judgment

I love the aGlass DK II. I’m doing a lot of experiments with it, and I’m now more convinced than ever that eye tracking will be disruptive for VR. Personally, I’m using it to experiment with eye-tracking UX together with my partner Max at New Technology Walkers, and we are doing a lot of interesting things with it, like the one you can see in the video. In a future post I’ll detail all these experiments, so stay tuned to learn more about eye-tracked VR!

But if you get an aGlass, you have to understand what it is: the two letters D and K in the name are there for a reason. This is a dev kit, not a finished product. Being a dev kit, it can show some little accuracy issues, it can have occasional crashes, and it can offer only basic features in the runtime and the SDK. As a developer, I have already experienced all of this with other products: the Oculus Rift DK2 headset itself made Unity crash constantly on my laptop and sometimes froze it completely.

I think that if you are a developer, or a customer wanting to use it for an experimental product, you will like it. It is worth investing in, because eye tracking will be disruptive for virtual reality (analytics, foveated rendering, new UX, accessibility for disabled people, etc.), and that’s why I’m hyped about it. If you instead want something that works flawlessly all the time, you had better wait a while before entering the eye-tracking realm.

Regarding the price, it is still undisclosed, and at this moment it may also depend on the use you want to make of it (whether you are a dev, an academic institution, etc.). So, if you want one, you have to contact 7Invensun directly. They are kind people, so don’t be afraid to send them an e-mail 🙂


I hope I have satisfied all your curiosity about this device; if you have further questions, please ask me here in the comments section or through my social media channels. And please also subscribe to my newsletter! 🙂


Disclaimer: this blog contains advertisement and affiliate links to sustain itself. If you click on an affiliate link, I'll be very happy because I'll earn a small commission on your purchase. You can find my boring full disclosure here.

11 thoughts on “7Invensun aGlass DK II unboxing, setup and review: a great dev kit for eye tracking in VR”

  1. Hey Tony, do you have the chance to ask them why they give me an extremely exaggerated price for just the dev kit? It was something like $2000+ (yeah, pretty crazy numbers haha) so obviously I told them that that price tag was far beyond my financial possibilities as a developer so I would prefer to wait a bit…

    1. I have answered that in another comment. Someone from the company told me that it was 2000 RMB (circa $300), but someone may have answered you specifying the wrong currency. Anyway, as I’ve written above, the price also depends on the license type and other factors, and that’s why I preferred not to specify it above. The best thing is to contact them and see what they say.

      1. Oh I may have missed it before. Anyway, I’ve just checked and the currency was USD. “The total pricing for the first three items is USD 2,156.00, and for the four items is USD 3,080.00”. Don’t really know what was the confusion here; I guess that maybe they added things beyond the basic kit, but I requested the very basic one. Who knows…

        1. I know they wrote USD, but my contact says that he doesn’t know what costs $2000… so the person who wrote the answer maybe specified the wrong currency, it’s not an error of yours. I don’t know… would you buy it for $300?

    1. You’re welcome, I’m glad to have satisfied this need of yours!

      It’s a very interesting device, indeed. Sooner or later it should come to the West, too. The company told me that they will expand and won’t only target the Chinese market anymore. So I hope that you will be able to get one!

  2. Thanks for the review. But I am wondering, is the tracking “smooth” for you? I have the aGlass DKII here, but the data tends to be kind of reluctant to follow me smoothly. It is stable (compared to the raw data from Tobii, which can be jittery), but it feels like the position is not updated often enough, or maybe the software is discarding small movements. Rather than following my eyes all the time, the gazepoint is sort of “jumping” around. It would be much better to have the actual data rather than trying to interpolate between a few points only. Now I am using the v3 runtime, so I’ll try the v2 to see if that helps. But I think I see the jumpiness in their demo video too.

    Btw, I wish they could give us a bit more than just the 2D gazepoint :P. I’m pretty sure they can measure the 3D point and pupil diameter at least

    1. Uhm, for the use that I’m currently making of it (selection of objects with gaze), I don’t need a super-fast refresh rate, so I haven’t noticed this sort of problem. But I noticed those jumps that you are talking about… I thought they happened because my eyes moved fast, but it seems that the reason is the one you mentioned. Maybe they are applying some kind of filter to make the data feel more stable.

      The SDK is very basic, I created a facility to reconstruct the 3D ray myself, because the 2D point on the screen is absolutely useless

      1. I understand. Thanks. Though you might want at some point smooth movement or the depth of the gaze even in UX. I might try contacting 7invensun as well somehow for more info.
        Btw I am trying the aGlass to see if we can use it at least internally for development. I used Fove and Tobii Vive as well, but the Fove had other issues (though that was a year ago), and the Tobii is pretty costly 😛 and sending it back and forth to the client is not very convenient… Right now I’m just adding some smoothing and interpolation/extrapolation to try to improve the movement of the aGlass gaze

        1. Ok. I wrote them as well about the issue you mentioned, along with other problems I found. Hopefully they will fix it with time. Thanks for having told me about it!

          Fove was an interesting project but it never took off, and Tobii is ultra expensive, so I agree with this analysis. 7Invensun has very cool expertise… let’s put our hopes in what they are doing 🙂

