
7Invensun will provide eye-tracking for many worldwide XR devices

I have many sweet memories of my trip to China. One of the best of them is when, in Beijing, I visited the company 7Invensun, a worldwide leader in eye tracking. It was a great moment both on a personal and on a professional level: as a person, I was delighted by their kindness: they really made me feel like a friend of theirs; as a VR professional, I was impressed by their eye tracking technology and the various devices they have in the pipeline, which for instance will bring eye tracking to the Vive Focus and HoloLens. Let me tell you the whole story.

Surely you remember the name 7Invensun: I already interviewed them when they announced that their eye tracking module aGlass DK II was able to provide eye tracking not only for the Vive, but also for the Vive Pro. So, when I decided to go to Beijing, I absolutely wanted to visit them and try it with my own eyes.

The company is located inside a skyscraper in Beijing and has a very nice office. When I arrived there, I was overwhelmed by their kindness. I had only spoken with them a few times on WeChat, but they received me and my Chinese assistant Miss S as if we were their best friends. I started shaking hands with people I had only spoken to virtually, like Lee and Kristina, and it was really nice seeing them in person.

View of Beijing from 7Invensun offices

After that, I started trying 7Invensun's devices. We started with the aSee Binocular Eye Tracker, an eye tracking device for desktop PC. PC was the first target market for this company, which started working on eye tracking in 2009 (so even before Tobii). The first goal was helping disabled people use electronic devices, but then they realized the many other amazing applications of eye tracking and so expanded their offering. A 7Invensun employee (I don't remember her name, I'm so sorry) started showing me various photos on the PC screen, and the only thing I had to do was look at them. (Of course, knowing that my eyes were being tracked, I purposely avoided looking at compromising details of the various photos :D) After this little presentation, she showed me a heat map of my gaze, highlighting which images I had looked at the most and which details of each image I had been most interested in. Then she opened an Excel file and I could clearly see all the data regarding my eye movements while looking at the various photos.

Trying the PC eye tracking solution. It is that black device below the monitor (Image by 7Invensun)

I was impressed: the program recorded everything I looked at, and this has amazing applications, as she confirmed. For instance, it can be used in psychology (seeing what kind of images arouse your interest can help in discovering psychological issues), and it would be massive in UX design. Imagine if you were a website/app designer and you could see the journey of your users' eyes over your page. You could discover where they instinctively look for information first and which regions they look at the most, and re-organize the page according to this precious data so that it becomes more usable and effective.
She told me that lots of different professionals are already using eye tracking applications: for instance, it is also used by architects to analyze which parts of a building attract users' attention, or by trainers to see if trainees are actually paying attention to what they say. 7Invensun is working with all these professionals to exploit the power of eye tracking, which can disrupt various sectors.
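
If you are curious how gaze logs like the ones in that Excel file can become a heat map, here is a minimal sketch of the idea in Python, assuming the fixations are simple (x, y, duration) tuples in screen pixels; the function and parameter names are mine, not 7Invensun's.

```python
# Minimal sketch: turning logged gaze fixations into a screen heat map.
# Fixations are assumed to be (x_px, y_px, duration_s) tuples; the function and
# parameter names are illustrative, not part of 7Invensun's software.
import numpy as np

def gaze_heatmap(fixations, width, height, sigma=20.0):
    """Accumulate fixation durations on a pixel grid and blur them into a heat map."""
    heat = np.zeros((height, width), dtype=np.float64)
    for x, y, duration in fixations:
        if 0 <= int(x) < width and 0 <= int(y) < height:
            heat[int(y), int(x)] += duration
    # Separable Gaussian blur so each fixation covers a plausible foveal area.
    k = np.arange(-3 * sigma, 3 * sigma + 1)
    kernel = np.exp(-(k ** 2) / (2 * sigma ** 2))
    kernel /= kernel.sum()
    heat = np.apply_along_axis(lambda row: np.convolve(row, kernel, mode="same"), 1, heat)
    heat = np.apply_along_axis(lambda col: np.convolve(col, kernel, mode="same"), 0, heat)
    return heat / heat.max() if heat.max() > 0 else heat

# Example: three fixations on a 640x360 screen, weighted by how long they lasted.
fixations = [(320, 180, 1.2), (100, 70, 0.4), (500, 270, 0.8)]
heat = gaze_heatmap(fixations, 640, 360)
print(heat.shape, round(float(heat.max()), 2))
```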

Heatmap of the regions of the screen I looked at the most. I don't remember what was there, but I hope it was nothing compromising (Image by 7Invensun)

After this little PC demo, we immediately switched to VR. Lee handed me a Vive Pro with an aGlass DK II installed inside, and I was ready for the party. Before actually using it, I had to perform a little calibration stage, which is necessary to adjust the tracking parameters to my particular eye configuration. I basically had to adjust the IPD of the headset mechanically to fit my eyes (it would be cool if the Vive Pro could adjust the IPD automatically depending on what the aGlass device detects) and then look at some points popping up in different positions on the screen. It was very fast, it took less than a minute, and it was necessary only once per session. After that, I was able to try various demos.
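
For the curious: a calibration of this kind typically fits a mapping from raw eye features to screen positions using the dots you look at. Below is a purely illustrative sketch of such a fit with a simple polynomial regression; it is not 7Invensun's actual algorithm, and all names and numbers are invented.

```python
# Purely illustrative sketch of gaze calibration: fit a polynomial mapping from
# raw pupil coordinates to the known on-screen positions of the calibration dots.
# This is not 7Invensun's actual algorithm; all names and numbers are made up.
import numpy as np

def features(pupil_xy):
    """Second-order polynomial features of the pupil position."""
    x, y = pupil_xy[:, 0], pupil_xy[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2])

def calibrate(pupil_xy, target_xy):
    """Least-squares fit of gaze = features(pupil) @ W, one column per screen axis."""
    W, *_ = np.linalg.lstsq(features(pupil_xy), target_xy, rcond=None)
    return W

def predict(W, pupil_xy):
    return features(pupil_xy) @ W

# Fake calibration session: 9 dots the user looked at, plus noisy pupil samples.
targets = np.array([[x, y] for y in (0.1, 0.5, 0.9) for x in (0.1, 0.5, 0.9)])
pupils = targets * 0.6 + 0.2 + np.random.normal(0, 0.005, targets.shape)
W = calibrate(pupils, targets)
print(np.round(predict(W, pupils) - targets, 3))  # residuals should be close to zero
```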

The first one was about foveated rendering: they activated foveated rendering and showed me that it worked. Evaluating this demo is a bit strange for me, since there was no on/off switch for the foveation, so I could not directly compare the scene with and without foveated rendering, and I can't guarantee that there was absolutely no graphical difference from standard rendering. For sure, with foveation the visuals were great and I wasn't able to spot that the rendering was following my eyes and downgrading the regions I was not looking at. So, it worked well. Then they also showed me foveation in NVIDIA VR Funhouse. With that demo, I noticed that with very aggressive foveation settings (that is, the areas you are not looking at get downsampled a lot), the difference is noticeable, so I learned that foveated rendering parameters have to be calibrated well, otherwise the trick becomes visible. They also slowed down the eye tracking, so I was able to see what foveated rendering looks like when the software lags behind your eyes… and it is a trippy experience where you see a high-resolution window moving around inside your vision 🙂

I think that foveated rendering will be a fundamental evolution for virtual reality, because it will relieve the work of the graphics card: on one side, VR developers will be able to deliver experiences with better graphics, and on the other side, even people without powerful graphics cards will be able to use virtual reality headsets. I really can't wait for it to become widespread.
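
To make the idea more concrete, here is a minimal sketch of the logic behind eye-tracked foveated rendering: each region of the screen is shaded at a resolution that depends on its distance from the gaze point. The thresholds are invented for illustration, and they are exactly the kind of parameters that, as I noticed in the demo, must be tuned carefully so the peripheral downgrade stays unnoticeable.

```python
# Minimal sketch of eye-tracked foveated rendering: choose a shading rate per
# screen tile based on its distance from the current gaze point. The thresholds
# and rates are invented for illustration, not the settings used in the demos.
import math

def shading_rate(tile_center, gaze, fovea_px=200, mid_px=500):
    """Return the fraction of full resolution at which to shade this tile."""
    dist = math.dist(tile_center, gaze)
    if dist < fovea_px:
        return 1.0   # full resolution where the eye is actually looking
    if dist < mid_px:
        return 0.5   # half resolution in the near periphery
    return 0.25      # quarter resolution in the far periphery

gaze = (960, 540)  # gaze point reported by the eye tracker, in pixels
for tile in [(960, 540), (1200, 700), (100, 100)]:
    print(tile, shading_rate(tile, gaze))
```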

Another VR demo I tried let me interact with a fantasy world just by using my eyes. I could move inside VR using only my gaze: I could look at a particular position in the world and then teleport there. Then I found myself in front of a table with three objects on it, and just by looking at them, I was able to see further info about them. I felt a bit like the Terminator, who was able to see information about the objects and people he looked at. I had super vision. Another demo put me in a plane, shooting at enemy planes that came towards me just by looking at them. It was fun. I think that using the eyes to interact with stuff can not only reduce hand and neck fatigue while using VR apps, but can also help disabled people use virtual reality experiences.
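
Interactions like these are usually built on a gaze ray plus a dwell timer, so an action fires only after the gaze has rested on a target for a moment. Here is a hypothetical sketch of that pattern; the class, the threshold and the object names are mine, not taken from the demo.

```python
# Hypothetical sketch of dwell-based gaze selection, the kind of interaction
# behind "look at an object to see its info" or "look at a spot to teleport".
# The class, threshold and object names are invented, not from 7Invensun's demo.
import time

DWELL_SECONDS = 0.5  # how long the gaze must rest on a target before it triggers

class GazeSelector:
    def __init__(self):
        self.current = None   # object the gaze ray is currently resting on
        self.since = 0.0      # timestamp when the gaze landed on it

    def update(self, hovered_object, now):
        """Call every frame with whatever the gaze ray currently hits (or None)."""
        if hovered_object != self.current:
            self.current, self.since = hovered_object, now
            return None
        if self.current is not None and now - self.since >= DWELL_SECONDS:
            self.since = now  # re-arm so the action does not fire every frame
            return self.current
        return None

selector = GazeSelector()
for hovered in ["sword", "sword", "sword", None, "potion"]:
    selected = selector.update(hovered, time.monotonic())
    if selected:
        print("show info for", selected)
    time.sleep(0.3)
```

Real implementations usually also add visual feedback (like a circle that fills up) so the user knows a dwell action is about to trigger.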

The last demo was about analytics: I found myself in a virtual supermarket, where I was able to buy stuff in VR just by picking items naturally. In the end, I could go to the cash desk of the supermarket and pay for what I had bought. After I did my shopping, a 7Invensun employee pressed a key, and in VR I could see the 3D world around me becoming a heat map of what I had looked at. The world was white where I had never looked, and showed a color ranging from green to red for the points I had looked at, depending on how long I had stared at them.

This analytics data can have two important applications:

  1. See what kind of products people are mostly interested in;
  2. See where people mainly look for products, so that the supermarket layout can be adjusted accordingly.

This is precious data for retail and e-commerce firms. In fact, this demo was developed together with JD.com, one of the most important Chinese e-commerce websites.
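
For the developers among you, the coloring described above can be sketched as accumulating the gaze dwell time per object and mapping it to a color gradient; the snippet below is only an illustration with made-up values, not the code of the JD.com demo.

```python
# Sketch of the in-VR heat map described above: accumulate how long the gaze
# rested on each object and map that dwell time to a white -> green -> red color.
# Object names and dwell values are illustrative, not data from the JD.com demo.
def dwell_to_color(dwell_s, max_dwell_s):
    """White when never looked at, green for short dwells, shifting to red for long ones."""
    if dwell_s <= 0:
        return (255, 255, 255)            # white: never looked at
    t = min(dwell_s / max_dwell_s, 1.0)   # 0..1 along the green -> red gradient
    return (int(255 * t), int(255 * (1 - t)), 0)

gaze_seconds = {"cereal box": 4.2, "shampoo": 0.6, "top shelf": 0.0}
max_dwell = max(gaze_seconds.values())
for item, dwell in gaze_seconds.items():
    print(item, dwell_to_color(dwell, max_dwell))
```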

Me playing with eye tracking in VR (Image by 7Invensun)

I loved all of their demos. But as always, I also have some concerns that I want to share with you:

  • First of all, privacy. Eye tracking is awesome, but it gives companies like Facebook the power to discover everything we look at. This is frightening because at the moment companies can mostly track what we voluntarily do (like putting a like or sharing something), while with eye tracking they could also discover what we instinctively look at, what we are unconsciously interested in. I am sincerely afraid of this, and so I hope that there will be regulation regarding the use of eye-tracking data;
  • The second one is about tracking accuracy. The tracking worked very well while I looked in front of me, but its precision degraded as I moved my eyes too far towards the left, right, up or down;
  • The third is about using the interfaces. When I had to use my eyes to perform actions (like looking at a point to teleport there), I found it very comfortable and easy, but at the same time a bit strange, since I don't usually use my eyes to perform actions. In real life, I use my eyes to inspect things, not to operate on them. I found the experiences that let me use my eyes in the normal, passive way more natural. This taught me that at first we should focus on using the eyes in VR in a natural way, and then maybe slowly move to using them to interact with stuff. I think we will all have to learn to use our eyes to do more things than the ones we are used to doing now;
  • The fourth thing is about fatigue. I tried an extreme test: I kept my head still and just moved my eyes to shoot at the enemies in the action game. The result was that my eyes felt really tired from having to move continuously to shoot at stuff. So I learned that using eye tracking doesn't mean not moving the head: the most comfortable thing is to move both in a natural way.

Anyway, I was satisfied with the tests, and Miss S, who is not a techie, also liked it and had no issues learning how to use it. That's great and means that using the eyes is so natural that even general consumers do not need a tutorial. After the tests, I also realized that eye tracking in VR is not consumer ready yet, for the above reasons. On the hardware side, we need devices that work every time with great precision and almost no calibration; on the software side, we need programs with a proper UX for the eyes. That's why the aGlass is still a dev-kit. A very cool dev-kit, IMHO, but still a dev-kit.

After all these demos, we all went to have lunch together, and I ate a lot of delicious Chinese food. After that, I met the company CEO, Mr. Huang Tongbing, and we had a meeting to talk about my experience with the device and the future plans of the company. Miss S helped a lot during this stage in translating from Chinese to Italian, so I have to thank her a lot.

Me preparing for the meeting in the afternoon. I couldn't imagine it would be that interesting

The first question was about their future plans: such a cool device should be available all over the world and shouldn't be relegated to the Chinese market only. Mr. Huang told me that 7Invensun's aGlass devices will go worldwide. At first, they will launch very soon in countries near China, and then, with time, they will try to conquer the whole planet.

I asked him about the problems I had during the demos, and he said that they are working hard on improving the tracking and also on offering a proper UX that exploits eye tracking. They are also working on making the aGlass more comfortable for the user. Regarding my issue of losing tracking precision when looking far from the frontal direction, he explained that this is a technical challenge. When the eye looks forward, the system can clearly see the iris and the pupil as circular shapes, but when you look too far left or right, for instance, part of the iris becomes occluded by the eyelid, so it has a different shape in the images shot by the internal cameras. This makes the tracking more difficult. They have an enormous database of eye images and are training their AI to solve this issue. He also added that since headsets' FOV is currently limited, people tend not to move their eyes that much, so the most important thing is having better tracking in the central region of vision.
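
As a rough way to visualize one part of the problem: even ignoring the eyelid, a circular pupil seen at an angle by the camera appears as a thinner and thinner ellipse as the gaze angle grows, which makes the fit more ambiguous. The little sketch below is just this geometric idealization, not measured data.

```python
# Simple geometric idealization (not real data): a circular pupil viewed off-axis
# projects to an ellipse whose minor axis shrinks roughly with the cosine of the
# gaze angle, so the shape the tracker has to fit becomes thinner and more
# ambiguous, on top of the eyelid occlusion mentioned above.
import math

pupil_diameter_mm = 4.0
for theta_deg in (0, 15, 30, 45):
    minor_axis = pupil_diameter_mm * math.cos(math.radians(theta_deg))
    print(f"gaze {theta_deg:2d} deg -> apparent pupil {pupil_diameter_mm:.1f} x {minor_axis:.2f} mm")
```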

Me, Mr. Huang, Lee and Kristina talking about eye tracking in virtual reality

In his opinion, eye tracking in VR will still need 2-3 years to become widespread and to be used properly by all the VR experiences and devices. So the technology is on the right track, but it is not ready yet. Regarding the future of VR in general, he said that virtual reality needs lighter headsets with higher resolution and improved comfort to become successful. And it also needs a better UX, something that eye tracking can help with.

Regarding the future, I asked about standalone headsets. I love the Vive Focus, and I would love to see eye tracking on it. I thought he would answer that he couldn't tell me anything, while instead he showed me everything in preview. I've seen it: there is an aGlass device also for the Vive Focus. It is not ready yet, and the Focus it was installed on had clearly been hacked to fit it in. But it was there. I can't show you the pictures, but it was basically a Vive Focus with two eye tracking inserts around the lenses, with the USB-C cable of the eye tracking add-on going into the USB-C port of the Focus. Some screws around the power-on button showed that the device had been modified to work with eye tracking.

I wasn't able to try it, but they let me use the eye tracking module installed on a similar Qualcomm Snapdragon 835 VR dev-kit headset. It was there and it worked. Performance was a little worse than on PC, in the sense that the loss of tracking precision while looking at the outer region of my vision was more noticeable. But I was amazed nonetheless: eye tracking on standalone headsets can be really disruptive. On top of all the reasons above, think about the performance gain that foveated rendering can give on standalone devices: these limited-power headsets could run games with great graphical quality, improving the user experience a lot. My mind started thinking about Robo Recall on a mobile headset, then I remembered I was at a Vive X company and so avoided mentioning Oculus 😀

Miss S is looking at me in a puzzled way, but actually, inside VR, I was great. Mobile eye tracking, baby! (Image by 7Invensun)

After this amazing demo, I tried another one: a device called aGlass Holo that can add eye tracking to HoloLens. They made a custom frame that can be installed inside the HoloLens to have eye tracking even on the HoloLens v1! Native eye tracking will probably only arrive with the next-gen HoloLens (and is already present in the Magic Leap One), so having a solution that can upgrade the 50,000 existing HoloLens units is interesting. This AR solution is customizable, so it can potentially be adapted to every AR headset on the market. They let me try it in exclusive preview, saying that it is still a work in progress: I was super happy, of course. I have to say that performance was worse than that of the other devices I tried (it's just a prototype, that's normal), but when it worked, it was great. Wow, eye tracking in AR!

After all these awesome demos, I had to leave them. I was quite sad about leaving, but they gave me one last smile by gifting me an aGlass DK II, to show how grateful they were for my visit to Beijing and to let me start experimenting with eye tracking! I really want to thank Mr. Huang, Kristina, Lee and all the other 7Invensun employees for the wonderful time I had there.

These days, I'm actually experimenting with it, and I will keep you updated on this blog about what I'm doing with it. Of course, also expect an unboxing and a review of this cool device! After using it for several days, I have a better picture of its pros and cons, so stay tuned!

I wouldn't have expected to play with eye tracking inside HoloLens. It was special. (Image by 7Invensun)

I think that the devices offered by this company are really interesting. 7Invensun is looking for companies and developers interested in collaborating with them and using their products. If you are interested in a dev-kit, feel free to email them saying that you got to know them through my website. Or if you need to talk about partnerships, feel free to ask me for an introduction; I will be happy to help.

After my previous post, they even shipped some free aGlass dev-kits to people wanting to do cool experiments with their products, so if you have an HTC Vive or HTC Vive Pro and a great idea in mind that exploits eye tracking, shoot them an e-mail 🙂. (Just to clarify: they don't give free kits to everyone, but if you have an awesome project that gets approved by them, they can send you a free kit.) I'm sure that eye tracking is the future of VR, so being able to experiment with it starting now is very important for us VR developers.

And for me, it is very important that you share this post on your social media channels and that you subscribe to my newsletter… would you mind doing these two little things? Thanks 😉

(Header image by 7Invensun)


Disclaimer: this blog contains advertisement and affiliate links to sustain itself. If you click on an affiliate link, I'll be very happy because I'll earn a small commission on your purchase. You can find my boring full disclosure here.

7 thoughts on “7Invensun will provide eye-tracking for many worldwide XR devices”

  1. Very interesting article and your findings in China👍

    I’ve played with gaze-based eye tracking, it’s cool but also very strange using your eyes to interact.

    You are very “on the money” about foveated rendering driven by eye tracking, it’s truly the key to unlocking VR for mass consumption by dramatically reducing the compute required to run a headset… not everyone can afford a GTX 1080 Ti

  2. Definitely there’s tons of potential for eye-tracking in VR for the reasons we already know, but also in several other areas as you said: studying how different packaging alternatives affect consumers’ perception of a product in Marketing, studying learners’ levels of attention and general behaviors in Training and Education, getting patients’ eye parameters quickly in Medicine and maybe using tracking to carry out some study (I guess that an ophthalmologist would be quite happy with having easy access to all those parameters inferred by an eye-tracking device), analytics in Design for web, generic software, games, VR apps or whatever, and many others you have already mentioned. Really cool and useful tech for sure!

    Anyway, yeah, it’s still quite far from being consumer ready, and I proved it myself a few weeks ago when 7Invensun sent me the price for a basic eye-tracking kit for the Vive. My face was like: ಠ_ಠ It was something crazy like $2,000 (I could almost buy a whole eye-tracked Tragic Leap with that money! It even has whales now). Although now that I think about it twice, I’m not really sure if they really got that I wanted the very most basic kit… but they didn’t answer anymore.

    p.s.: we finally met Miss S 😛

    1. About the price I asked for further clarification and also gave them some suggestions, since they are still deciding how to market it in the West.

      I think that $2000 for an enterprise package with business licensing and additional services may be ok. $2000 for a devkit is a bit too much. Then it also depends on whether they give you just the basic accessory or bundle other stuff (like a Vive headset).

      I’ll keep you updated with their answers.

    2. In the end they answered me that it was 2,000 RMB (that is, circa $300), which is a reasonable price. The problem is that this is an indicative value for the developer edition; for enterprises it costs much more…

  3. Hey Tony, any idea if the JD.com supermarket is shown / available online anywhere? Would love to see it, we have made something similar

