Researchers Showcase Impressive New Bar for Real-time Digital Human Rendering in VR


A broad team of graphics researchers, universities, and technology companies is showcasing the latest research into digital human representation in VR at SIGGRAPH 2017. Advanced capture, rigging, and rendering techniques have set an impressive new bar for the art of recreating the human likeness inside a computer in real time.

MEETMIKE is the name of the VR experience being shown this week at the SIGGRAPH 2017 conference, featuring a wholly digital version of VFX reporter Mike Seymour being ‘driven’ and rendered in real time by the real-life Seymour. Inside the experience, Seymour plays host, interviewing industry veterans and researchers inside of VR during the conference. Several additional participants wearing VR headsets can watch the interviews from inside the virtual studio.

The result is a rather stunning representation of Seymour, rendered at 90 FPS in VR using Epic’s Unreal Engine, which stands up to extreme scrutiny: shots show detailed eyebrows and eyelashes, intricate specular highlights on the pores of the skin, and a finely detailed facial model.

To achieve this, Seymour wears a Technoprops stereo camera rig that watches his face as it moves. The images of the face are tracked and solved with technology from Cubic Motion, and that data is relayed to a facial rig created by 3Lateral, based on a scan of Seymour made as part of the Wikihuman project at USC-ICT. Seymour’s fxguide further details the project:

  • MEETMIKE has about 440,000 triangles rendered in real time, which means rendering a stereo VR frame roughly every 9 milliseconds; about 75% of those triangles are used for the hair.
  • Mike’s face rig uses about 80 joints, mostly for the movement of the hair and facial hair.
  • For the face mesh itself, only about 10 joints are used; these are for the jaw, eyes, and tongue, in order to add more arc-like motion.
  • These work in combination with around 750 blendshapes in the final version of the head mesh (see the sketch after this list).
  • The system uses complex traditional software design and three deep learning AI engines.
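
For readers curious how those joints and blendshapes combine, below is a minimal illustrative sketch of how such a rig might be evaluated each frame. This is not the project’s code; every name, array shape, and parameter is hypothetical, and it assumes a simple NumPy-style pipeline in which blendshapes supply linear per-vertex offsets while skinned joints supply the rotational arc motion (jaw, eyes) that blendshapes alone can’t produce.

```python
import numpy as np

# Hypothetical sketch of a blendshape + joint rig evaluation, loosely modeled
# on the numbers quoted above (~750 blendshapes, ~10 face joints). Not the
# project's actual code; all names and shapes are assumptions.

def evaluate_face(base_verts,     # (N, 3) neutral mesh vertices
                  blend_deltas,   # (S, N, 3) per-blendshape vertex offsets
                  blend_weights,  # (S,) weights solved from the face tracker
                  joint_mats,     # (J, 4, 4) skinning matrices (jaw, eyes, tongue)
                  skin_joints,    # (N, K) indices of joints influencing each vertex
                  skin_weights):  # (N, K) influence weights; rows sum to 1
    # 1) Blendshapes: add the weighted per-vertex offsets to the neutral mesh.
    v = base_verts + np.tensordot(blend_weights, blend_deltas, axes=1)

    # 2) Linear blend skinning: each vertex is moved by a weighted combination
    #    of its influencing joints, producing arc-like rotational motion.
    homog = np.concatenate([v, np.ones((len(v), 1))], axis=1)  # (N, 4)
    out = np.zeros_like(v)
    for k in range(skin_joints.shape[1]):
        mats = joint_mats[skin_joints[:, k]]  # (N, 4, 4) matrices per vertex
        out += skin_weights[:, k:k+1] * np.einsum('nij,nj->ni', mats, homog)[:, :3]
    return out  # (N, 3) deformed vertices, ready to render
```

In a live system like MEETMIKE’s, the blendshape weights would be re-solved from the head-mounted camera footage every frame, so an evaluation like this has to fit well within the roughly 9-millisecond stereo frame budget quoted above.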

A paper published by Seymour and Epic Games researchers Chris Evans and Kim Libreri, titled ‘Meet Mike: Epic Avatars’, offers more background on the project.


From our reading of the project, it’s somewhat unclear, but it sounds like the rendering of the digital Seymour is being done on one PC with a GTX 1080 Ti GPU and 32GB of RAM, while other computers accompany the setup to allow the host’s guest and several audience members to view the scene in VR. We’ve reached out to confirm the exact hardware and rendering setup.



  • Raphael

    Needs more polygons.

    Only kidding. This will push back Star Citizen’s Squadron 42 release by another five years.

    • Mei Ling

      Right now they’re working very hard on the toilet bowl mechanics.

  • GigaSora

    We’ve finally done it. We’ve hit the uncanny valley!

  • cirby

    Now, if they could just stop him from blinking so much…

    • revel911

      That was my thought as well; the blinking felt not just excessive, but random.

      • Jonny

        The blinking mimics the actual “actor”. It’s not something generated at random.

    • Andrew Jakobs

      The blinking was following the actual person who was tracked, so he blinks a lot, as many people do when talking.

    • ricktowers

      If you look at the real Mike when he does his videos on FXPHD, he actually blinks that often in real life.

  • Tucci

    I think they may have crossed the UV; I’m not creeped out by this one.

    • Greet

      If it’s creepy they didn’t cross it, they fell right in

  • Muzufuzo

    According to the source, “This remarkable journey has been possible by a brilliant group of collaborators (and NINE PC’s with 32Gig RAM each and 1080ti Nvidia cards).”

    • benz145

      “From our reading of the project, it’s somewhat unclear, but it sounds like the rendering of the digital Seymour is being done on one PC with a GTX 1080 Ti GPU and 32GB of RAM, *while other computers accompany the setup to allow the host’s guest and several audience members to view the scene in VR.*”

  • Joe

    Skin looks pretty good, but has too much shininess, which seems to be a problem with a lot of virtual humans. The teeth shader is way off, needs better subsurface scattering.

    • Andrew Jakobs

      Too much shininess? Have you ever seen people without makeup to prevent the shininess? If I didn’t know beforehand this was CGI, I would certainly not have noticed it, and even when I did, I had a hard time seeing it. Even the teeth seem believable, as so many people have such different teeth.
      The best way would have been to show the same video with the actual actor side by side.

    • Rafael

      That is what I thought. The rest is OK.

  • Aaron

    Still not quite there but the progress is amazing!

  • Ted Joseph

    Let’s get a 180° FOV, wireless, high-resolution, light, low-cost VR headset first, shall we?

    • Andrew Jakobs

      That wouldn’t be able to handle the data at this point. Higher resolution isn’t really the problem; getting the GPUs to render it at the needed fidelity and framerate at a decent cost is the problem. A GTX 1080 Ti is at the high end now, and it already struggles to render at the fidelity people expect on current headsets. Once a GPU of that power has dropped to a ‘low’ budget price ($200 or less, and yes, $200 is considered ‘low cost’ these days), higher resolutions will start to become more interesting. Foveated rendering can do a lot for that, but it isn’t a miracle cure.

    • Edawg

      Let’s cure world hunger, shall we?
      Just trying to be as off-topic as you…

  • Casey

    Uncanny Valley-crossing or not, this is still infinitely better than what standard VR avatars have been up to this point. I think VR social apps can definitely become incredibly popular as the ultimate long-distance form of communication once they actually depict a person the way they look in real life, rather than as a Miibo or whatever.

    • Andrew Jakobs

      This looked even better than Tarkin (which I still think looked great).

  • Andrew Jakobs

    WTF…. That really looked so unbelievably good (on my 5″ phone). Even the eyes looked real.

  • Impressive. Live mocap for heads. Cut-scenes in games that use hard-coded motion capture could now achieve the same effect in minutes instead of the days manual rigging takes. VR is pushing technology in all sorts of interesting ways.

  • Alorwin

    This isn’t in the uncanny valley so much as it is right at the edge. It’s unreal enough that I’m not disturbed.

  • Carl Galilee

    Impressive that this level of detail runs in real time at 90 FPS.

    Still a long way to go for any digital face to get the eyes right. Even Digital Emily and Ira have dead eyes. It’s interesting to be reminded of the amount of information and cues we get from someone’s eyes that we totally take for granted.

    Just wonder if we’ll ever be able to completely trick our minds looking through digital windows without a soul.

  • Matias Nassi

    Pretty amazing, besides the Uncanny Valley still being present (btw, the link to the paper is broken; correct link here just in case: http://dl.acm.org/citation.cfm?id=3089276)

    • benz145

      Thanks for spotting this, looks like they don’t allow direct linking to the PDF file. I’ll add this link instead.

  • ZenInsight

    Something still wrong with the mouth rendering.

  • chuan_l

    — This “Mike” I like!

  • impurekind

    That is impressive.

  • Elecman

    No tongue. That’s what’s missing in all real-time renderings. It just looks weird. Other than that, great work.

  • zintax

    This is asymptote at its best. Of course this is exciting; however, no matter how close tech gets to that feeling of real, it seems almost impossible to make rendered humans indistinguishable from a real human under the right conditions. Textures and polygons are not the problem; those have been there for quite some time (how ’bout them pores!). Sure, for a quick face replacement in a scene or a poster… fine, but have him talk for 2 minutes and you’re made.

  • Nein

    >440,000 triangles with 75% being hair
    Doesn’t sound like responsible triangle distribution. If you make him bald you can probably make a game.

  • Ragbone

    Does this mean my work colleagues will be able to tell that I am on the toilet during conference calls?

  • It is still noticeable that this is CGI…but…wow!

  • Stepan Stulov

    Teeth still look out of place, alien.

  • Samuel Thomas

    Need more bewbs.