Animating VR Characters in Google Blocks

Last month, Google launched their Blocks VR creation tool, enabling anyone to 3D model low-polygon objects with no prior modeling experience. Blocks may seem like a simple app on the surface, but after diving in, its massive potential as a professional VR creation tool becomes immediately apparent.

Aside from being able to use the objects or characters you create in other applications, like bringing them into Unity for VR games, we’ve been wondering if we’ll one day also get the chance to use Blocks for 3D animation.

Now Google’s Daydream Labs is giving us a glimpse into just that. During a one-week hackathon, the Daydream Labs team took it upon themselves to prototype a way to make Blocks scenes feel dynamic and alive. And what better way to do that than by animating 3D models?

Daydream Labs is a way for the Google VR team to explore different applications and interactions for virtual reality, pairing engineers with designers to rapidly prototype concepts. They’ve been sharing their lessons with the VR community for a while, so it shouldn’t come as a surprise to see the team share their process for animating 3D models as well.

On the Google VR blog, Senior UX Engineer Logan Olson laid out the three-step process the team used to get Blocks characters dancing and jamming in VR. You can read the full process here.

Step One: Preparing the Model

Before animating a character in Blocks, the Daydream Labs team explored two methods for preparing the model: inverse kinematics and shape matching. Inverse kinematics is a technique widely used for animating characters in video games, and in this case it allowed the team to move just a hand or a foot while the rest of the character’s body adapted.
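Google hasn’t released the code behind the hackathon prototype, but to give a sense of what inverse kinematics does under the hood, here is a minimal two-bone solver in Python. The 2D setup, bone lengths, and angle conventions are our own simplification for illustration, not the team’s implementation:

```python
import math

def two_bone_ik(upper_len, lower_len, target_x, target_y):
    """Given a shoulder at the origin and a hand target, return the
    shoulder angle and elbow bend (radians) that place the hand on the
    target, with the target clamped to what the arm can reach."""
    # Distance from the shoulder to the target, clamped to stay in reach.
    dist = max(1e-6, min(math.hypot(target_x, target_y),
                         upper_len + lower_len - 1e-6))

    # Law of cosines: interior angle at the elbow of the triangle formed
    # by the two bones and the shoulder-to-target segment.
    cos_elbow = (upper_len**2 + lower_len**2 - dist**2) / (2 * upper_len * lower_len)
    elbow_bend = math.pi - math.acos(max(-1.0, min(1.0, cos_elbow)))

    # Aim the shoulder at the target, then rotate back by the angle the
    # upper bone makes with the shoulder-to-target line.
    cos_offset = (upper_len**2 + dist**2 - lower_len**2) / (2 * upper_len * dist)
    shoulder = math.atan2(target_y, target_x) - math.acos(max(-1.0, min(1.0, cos_offset)))

    return shoulder, elbow_bend
```

Move the target (say, a tracked controller) each frame, and the two joint angles follow, which is why grabbing a hand or foot is enough to pose the whole limb.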

Shape matching is a newer technique suited to characters with less well-defined physiques, like this demo’s blobby chair or boombox with legs, and it allows for some jiggly movement that adds character and playfulness.
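The jiggle comes from how shape matching works: instead of rotating a chain of bones, it repeatedly finds the best-fit rigid transform of the rest shape onto the deformed points and pulls everything back toward it. The sketch below is a simplified single step of that idea in Python with NumPy; the stiffness value and time step are illustrative guesses, not values from Google’s prototype:

```python
import numpy as np

def shape_match_step(rest, current, velocity, stiffness=0.5, dt=1/60):
    """One relaxation step of rigid shape matching: fit the rest shape
    onto the current points, then pull each point toward its matched
    'goal' position. Lower stiffness means more jiggle."""
    # Center both point clouds on their centroids.
    rest_c = rest - rest.mean(axis=0)
    curr_centroid = current.mean(axis=0)
    curr_c = current - curr_centroid

    # Best-fit rotation via SVD (polar decomposition of the covariance).
    A = curr_c.T @ rest_c
    U, _, Vt = np.linalg.svd(A)
    if np.linalg.det(U @ Vt) < 0:   # avoid mirror-image fits
        U[:, -1] *= -1
    R = U @ Vt

    # Goal positions: the rest shape rigidly transformed onto the cloud.
    goals = rest_c @ R.T + curr_centroid

    # Pull points (and their velocities) toward the goals.
    correction = stiffness * (goals - current)
    velocity += correction / dt
    current += correction
    return current, velocity
```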

Step Two: Controlling the Model

Now that the model is prepped, the team shows us how VR can help move it.

Direct control of the character is made possible by tracking the headset and controllers, with the character acting as a virtual mirror of your own movements.

Vive trackers on your feet can add even more control, letting your legs loose.

Then there’s the ability to grab specific points and manipulate them. Watch that chair’s gums go!

The team even tested multiple players animating collaboratively in a shared environment. All of which takes us back to our childhood, playing with toys, and maybe for some of you, posing dolls.

Step Three: Recording Motion

Then comes the last part of the process: recording and playing back movements.

Pose-to-pose animation, similar to traditional 3D animation techniques, was used for complex movements like jumping into a chair.
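Under the hood, pose-to-pose animation boils down to storing a handful of key poses at specific times and blending between them on playback. Here is a rough, hypothetical sketch of that playback step; the joint dictionary and simple linear blend are our own simplification:

```python
def sample_pose(keyframes, t):
    """keyframes is a time-sorted list of (time, pose) pairs, where a
    pose maps joint name -> value. Returns the pose at time t by
    linearly blending the two surrounding keyframes."""
    if t <= keyframes[0][0]:
        return dict(keyframes[0][1])
    if t >= keyframes[-1][0]:
        return dict(keyframes[-1][1])

    # Find the pair of keyframes that bracket t and blend between them.
    for (t0, pose0), (t1, pose1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            blend = (t - t0) / (t1 - t0)
            return {joint: (1 - blend) * pose0[joint] + blend * pose1[joint]
                    for joint in pose0}
```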

And for simpler animations, live looping lets you record an object’s movements in real time and then play them back as a repeating loop. It’s as simple as pressing record, animating, and pressing stop to watch your animation come to life as a loop.
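Conceptually, live looping is just a buffer of per-frame transforms: capture while recording, then replay the samples endlessly. A minimal, hypothetical sketch (the class and method names are ours, not from the prototype):

```python
class LiveLoopRecorder:
    """Record an object's transform each frame while recording is on,
    then replay the captured samples as an endless loop."""

    def __init__(self):
        self.samples = []
        self.recording = False
        self.playhead = 0

    def start(self):
        self.samples = []
        self.recording = True

    def stop(self):
        self.recording = False
        self.playhead = 0

    def tick(self, current_transform):
        """Call once per frame: store the live transform while recording,
        otherwise return the next looped sample."""
        if self.recording:
            self.samples.append(current_transform)
            return current_transform
        if not self.samples:
            return current_transform
        sample = self.samples[self.playhead]
        self.playhead = (self.playhead + 1) % len(self.samples)
        return sample
```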

As more and more VR creators integrate Blocks and Tilt Brush into their workflows, and as Google continues to evolve both products with new updates, it’s only a matter of time before the next big blockbuster animation film is created in VR, and maybe even by a college student in their dorm room.

Image Credit: Google VR

About the Scout

Jonathan Nafarrete

Jonathan Nafarrete is the co-founder of VRScout.