7 firms with face tracking tech for better VR avatars

Today, most virtual reality avatars are very basic, cartoony figures, with a limited range of motions and expressions.

Vendors are working to change this and make avatars more realistic, with facial expressions that reflect those of the actual users. The technology that makes this possible involves facial, body and eye tracking.

The goal is to improve social experiences in virtual reality, and make the platforms more immersive and compelling.

One big challenge is that today’s headsets cover up most of the user’s face, so you can’t just point a camera at the user. Instead, developers and manufacturers have to get creative.

Wolfprint 3D

(Image courtesy Wolfprint 3D.)

Wolfprint 3D has egg-shaped photo booths that use 3D scanning technology to capture people’s faces in high detail and then reconstruct the rest of the body digitally, creating a more realistic 3D avatar. The avatars can then be transferred for use in gaming and other platforms.

Once you enter the booth, you enter your email address and other information and then sit down for a scan. Six cameras on a portable scanner take photos of you from three angles, and software stitches the images together. You then receive an email with a link to your lifelike 3D avatar.

Timmu Toke

“No measurements of the body are taken at this point,” Wolfprint 3D CEO Timmu Toke told Hypergrid Business.

So far, the company has installed four egg-shaped photo booths at its locations in Tallinn, Estonia, and Los Angeles, California, and has already scanned 5,500 people. It is running a crowdfunding campaign via SeedInvest to install 60 more pods in the United States in malls, airports, cinemas, museums and other public places.

There is no cost for the scans — the company hopes to earn income from third parties willing to use the 3D avatars in their games and studios, and from sales of licensed characters and 3D-printed figurines.

“We are working with several gaming and virtual reality companies that use our scans in their experiences to create realistic game characters,” said Toke. “The feedback has been very positive so far.”

The company said it cannot disclose details about the use cases, since the games are still under development and have not been released. In principle, though, a game company could put your face on a character, or a filmmaker could use your avatar as a character in a movie.

Future plans include investing in backend software that automatically connects the database to the gaming and entertainment industries. The company also hopes to push the technology to smartphones. Through the Wolfprint API, application developers could bring people’s realistic avatars into any application, the same way other applications ask us to use our Facebook profiles.
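To make the idea concrete, here is a minimal, hypothetical Python sketch of how an application might request a user’s scanned avatar through such an API. The endpoint, parameters and response fields are illustrative assumptions, not Wolfprint 3D’s actual interface.

```python
import requests

# Hypothetical endpoint and fields -- for illustration only,
# not Wolfprint 3D's actual API.
AVATAR_API = "https://api.example.com/v1/avatars"

def fetch_avatar(user_token: str, avatar_format: str = "fbx") -> bytes:
    """Download a user's scanned 3D avatar after they grant access,
    much like an app requesting access to a Facebook profile."""
    response = requests.get(
        AVATAR_API,
        headers={"Authorization": f"Bearer {user_token}"},
        params={"format": avatar_format},
        timeout=30,
    )
    response.raise_for_status()
    return response.content  # raw 3D model data, ready to load into a game engine

# Example usage: save the avatar locally so a game or VR app can load it.
if __name__ == "__main__":
    model = fetch_avatar(user_token="USER_GRANTED_TOKEN")
    with open("player_avatar.fbx", "wb") as f:
        f.write(model)
```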

“We plan to use all the existing and upcoming technologies to make our avatars look and feel as realistic as possible,” said Toke.

Watch the promotional video below:

https://vimeo.com/184270867

Emteq

(Image courtesy Emteq.)

Emteq, a U.K.-based firm located at the Sussex Innovation Centre at the University of Sussex, has raised $7.7 million for FaceTeq, a system that can track users’ facial expressions and emotions using sensors on the inside lining of a virtual reality headset.

The sensors read electrical muscle activity, heart rate, skin response, eye movement and head position 1,000 times per second, and that information is then translated into a corresponding facial expression in a 3D environment.

“Integrating our facial interface into a virtual reality headset is simple and unobtrusive, improving the practical form of the device by completely eliminating the need for burdensome face-tracking cameras,” Emteq CEO and serial tech entrepreneur Graeme Cox told VRFocus.

(Image courtesy Emteq.)

Users train the system by making facial expressions as directed, and machine learning is then used to process the expressions.
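As a rough illustration of how such a calibrate-then-classify step could work, here is a minimal Python sketch assuming scikit-learn and NumPy. The feature layout, expression labels and choice of classifier are assumptions for illustration, not Emteq’s actual pipeline.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Illustrative sketch only: the real FaceTeq signal processing is not public.
# Each sample is a feature vector derived from the headset's sensors
# (muscle activity, heart rate, skin response, eye movement, head position),
# captured roughly 1,000 times per second.

EXPRESSIONS = ["neutral", "smile", "wink", "frown"]

def train_expression_model(samples: np.ndarray, labels: np.ndarray) -> KNeighborsClassifier:
    """Fit a simple classifier on calibration data gathered while the user
    makes each expression as directed."""
    model = KNeighborsClassifier(n_neighbors=5)
    model.fit(samples, labels)
    return model

def classify_live_sample(model: KNeighborsClassifier, sample: np.ndarray) -> str:
    """Map one incoming sensor sample to an expression label that the avatar
    in the 3D environment can mirror."""
    return EXPRESSIONS[int(model.predict(sample.reshape(1, -1))[0])]

# Example: calibrate on synthetic data, then classify a new sample.
rng = np.random.default_rng(0)
calibration = rng.normal(size=(400, 8))          # 400 samples, 8 sensor features
calibration_labels = rng.integers(0, len(EXPRESSIONS), size=400)
model = train_expression_model(calibration, calibration_labels)
print(classify_live_sample(model, rng.normal(size=8)))
```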

The company has a demo up on its Facebook wall in which a user can assign the relevant emoji based on his or her expression, simply by looking at the Like button on a Facebook post in a virtual reality headset and then winking, smiling or making another expression. The company is currently working on Oculus Rift and Vive support, and is also considering the Daydream VR platform.

BinaryVR

California-based BinaryVR makes a 3D camera that attaches to a VR headset and looks at the user’s mouth, chin and cheeks.

So far, it can recognize about 20 different facial expressions.

The developer kit ships this month, and the company is accepting pre-orders.

Oben

Oben, an HTC Vive X portfolio company based in Pasadena, combines images and voice to create more realistic avatars on smartphones, and then transports those avatars to virtual and augmented reality environments.

The California-based company has already raised $7.7 million from a consortium of investors in a round led by CrestValue Capital, the investment arm of China’s DunAn Group.

(Image courtesy Oben.)

Production is scheduled to start in the first quarter of 2017.

High Fidelity

(Image courtesy Morph 3D.)

High Fidelity uses 3D scanners and depth-sensing cameras in conjunction with the high-end Oculus Rift and HTC Vive headsets to capture photorealistic renderings and to track eye gaze, facial expressions and body language.

Users can also create a customized avatar using Morph 3D’s Ready Room and use that avatar on the platform. The Ready Room is an avatar engine and character management system that lets users customize existing avatars using the HTC Vive.

https://www.youtube.com/watch?v=zHCI710w9QA

Veeso

(Image courtesy Veeso.)

Veeso is running a funding campaign on Indiegogo for a headset that contains two face-tracking cameras, one pointed at the eyes and the other at the mouth. It also has two infrared sensors, one of which sits between the lenses to track the wearer’s pupils, while the other is aimed toward the mouth and jaw. The headset is compatible with Android and iOS devices.

A prototype and SDK will cost around $100 and a consumer version is expected to cost $200, with shipping to start by the end of this year.

“It is very easy for any app or game developer to use the facial data and apply it to 3D characters in real time,” company co-founder and COO Elia D’Anna told Hypergrid Business. “For this reason, the game developer can choose from any virtual reality SDK — including Daydream — or game development software to start creating content for Veeso.”
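As a sketch of what applying facial data to a 3D character in real time can look like in practice, here is a small, hypothetical Python example. The data structures, field names and smoothing step are illustrative assumptions, not part of the Veeso SDK.

```python
from dataclasses import dataclass
from typing import Dict

# Hypothetical data structures -- Veeso's actual SDK types and callbacks
# are not public, so the names and fields here are illustrative assumptions.

@dataclass
class FaceFrame:
    """One frame of tracking output: blendshape weights in the range 0..1."""
    weights: Dict[str, float]   # e.g. {"jawOpen": 0.4, "eyeBlinkLeft": 1.0}

class AvatarFace:
    """Stand-in for a 3D character's face rig in a game engine."""
    def __init__(self) -> None:
        self.blendshapes: Dict[str, float] = {}

    def apply(self, frame: FaceFrame, smoothing: float = 0.3) -> None:
        """Blend the new tracking values toward the current pose so the
        avatar's face updates smoothly frame by frame."""
        for name, target in frame.weights.items():
            current = self.blendshapes.get(name, 0.0)
            self.blendshapes[name] = current + smoothing * (target - current)

# Example per-frame update driven by the tracker's output.
avatar = AvatarFace()
avatar.apply(FaceFrame(weights={"jawOpen": 0.4, "eyeBlinkLeft": 1.0}))
print(avatar.blendshapes)
```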

Faceshift

(Image courtesy Faceshift.)

Apple’s Faceshift tracks head orientation, eye gaze and basic expressions. You first scan a set of expressions to train a personalized avatar for tracking; you can then capture a performance in real time and choose to refine it in a post-processing stage. You can also animate and export to 3D platforms.

All of this is done with a depth camera and a standard Windows or macOS computer.
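To show the general shape of that capture, post-process and export workflow, here is a small, hypothetical Python sketch; the data layout, smoothing pass and JSON output are assumptions for illustration, not Faceshift’s actual format.

```python
import json
from typing import Dict, List

# Illustrative sketch of a capture -> post-process -> export workflow;
# the frame layout and smoothing are assumptions, not Faceshift's format.

def smooth(frames: List[Dict[str, float]], window: int = 3) -> List[Dict[str, float]]:
    """Post-processing pass: average each tracked value over a small window
    to clean up a performance that was captured in real time."""
    smoothed = []
    for i in range(len(frames)):
        chunk = frames[max(0, i - window + 1): i + 1]
        smoothed.append({k: sum(f.get(k, 0.0) for f in chunk) / len(chunk)
                         for k in frames[i]})
    return smoothed

def export_animation(frames: List[Dict[str, float]], path: str) -> None:
    """Write the animation as JSON keyframes that a 3D tool could import."""
    with open(path, "w") as f:
        json.dump({"fps": 30, "frames": frames}, f, indent=2)

# Example: a short captured performance, cleaned up and exported.
performance = [{"jawOpen": 0.1}, {"jawOpen": 0.6}, {"jawOpen": 0.5}]
export_animation(smooth(performance), "performance.json")
```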

Apple doesn’t have a virtual reality product on the market yet, but this could be a first step in that direction.