Second Life is Looking for Beta Testers for its New Puppetry Feature: Control Your SL Avatar’s Face and Upper Body Movements Using Your Webcam!

This afternoon, Linden Lab (the makers of virtual world Second Life) made an announcement:

Wouldn’t it be cool if you could animate your avatar in real time? What if you could wave your arm and your avatar could mimic your motions? Or imagine if your avatar could reach out and touch something in-world or perform animations? Linden Lab is exploring these possibilities with an experimental feature called “Puppetry.”

We have been working on this feature for some time and now we are ready to open it up to the Second Life community for further development and to find out what amazing things our creators will do with this new technology.

The code base is alpha level and does contain its share of rough edges that need refinement; however, the project is functionally complete, and the scripters and creators of Second Life can start trying it out.

The animated GIF I copied from the Linden Lab announcement didn’t work in my blog post, so I downloaded the video from their tweet below:

Now, Second Life is not the first flatscreen virtual world to announce such a feature (that would be Sinespace; I wrote about their Avatar Facial Driver back in 2018). At that time, Sinespace said that anything covering part of the face, such as glasses, might interfere with the tracking. However, four years have passed since then, and I have zero doubt that the technology has improved!

Linden Lab goes on to explain how the Puppetry technology works:

Puppetry accepts target transforms for avatar skeleton bones and uses inverse kinematics (IK) to place the connecting bones in order for the specified bones to reach their targets. For example, the position and orientation “goal” of the hand could be specified, and IK would be used to compute how the forearm, elbow, upper arm, and shoulder should be positioned to achieve it. The IK calculation can be tricky to get right and is a work in progress.
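For the programmers in the audience, here is a minimal sketch of the kind of two-bone IK calculation the announcement describes: given a target point for the hand, solve for the shoulder and elbow angles. This is just an illustration of the technique in 2D using the law of cosines (the function name and conventions are mine), not Linden Lab's actual solver, which works in 3D on the full avatar skeleton.

```python
import math

def two_bone_ik(target_x, target_y, upper_len, fore_len):
    """Solve shoulder and elbow angles so the hand reaches (target_x, target_y).

    Returns (shoulder_angle, elbow_bend) in radians, clamping the target
    to the arm's reach if it is too far away to touch.
    """
    dist = min(math.hypot(target_x, target_y), upper_len + fore_len)
    dist = max(dist, 1e-6)  # avoid division by zero for targets at the shoulder

    # Law of cosines on the shoulder-elbow-hand triangle gives the elbow bend:
    # zero when the arm is fully extended, larger as the hand pulls in closer.
    cos_elbow = (upper_len**2 + fore_len**2 - dist**2) / (2 * upper_len * fore_len)
    elbow_bend = math.pi - math.acos(max(-1.0, min(1.0, cos_elbow)))

    # The shoulder aims at the target, offset by the triangle's shoulder angle.
    cos_offset = (upper_len**2 + dist**2 - fore_len**2) / (2 * upper_len * dist)
    shoulder = math.atan2(target_y, target_x) - math.acos(max(-1.0, min(1.0, cos_offset)))
    return shoulder, elbow_bend

# Example: reach for a point up and to the right of the shoulder.
print(two_bone_ik(0.4, 0.3, upper_len=0.3, fore_len=0.25))
```

A production solver also has to handle joint limits and pick sensibly between the multiple valid poses that can reach the same target, which is presumably part of what Linden Lab means by “tricky to get right.”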

The target data is supplied by a plug-in that runs as a separate process and communicates with the viewer through the LLSD Event API Plug-in (LEAP) system. This is a lesser-known functionality of the Viewer that has been around for a while but has, until now, only been used for automated test and update purposes.
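To make that a little more concrete, here is a rough sketch of how a LEAP plug-in might frame a message for the Viewer. As I understand it, LEAP messages are serialized LLSD prefixed with a length and a colon; the pump name and the hand-rolled notation below are simplifying assumptions for illustration only, so consult the official documentation before relying on any of this.

```python
import sys

def leap_send(pump, data):
    """Send one LEAP-style message to the Viewer over stdout.

    The frame is the message length, a colon, then the serialized LLSD.
    A real plug-in would use a proper LLSD serializer; this hand-rolls
    just enough notation for a simple illustration.
    """
    body = "{{'pump':'{}','data':{}}}".format(pump, data)
    sys.stdout.write("{}:{}".format(len(body), body))
    sys.stdout.flush()

# Hypothetical usage: the 'puppetry' pump name is a guess on my part.
leap_send('puppetry', "{'command':'start'}")
```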

The Viewer transmits the Puppetry data to the region server, which broadcasts it to other Puppetry-capable Viewers nearby. The receiving Viewers use the same IK calculations to animate avatars in view.

For more details about the Puppetry technology, take a look at the Knowledge Base article Puppetry: How It Works.

To my knowledge, this marks a major change in how avatars move in Second Life. One of the things that the newer generation of metaverse platform users (much more used to social VR platforms like VRChat) have found odd is that SL avatars rely so heavily on the playback of pre-recorded animations. (Keep in mind that SL does not support users in VR headsets, as it cannot reach the frame rates necessary to avoid VR sickness! There have been valiant attempts over the years, however.)

If you are intrigued by this development and want to test it out for yourself, here are the details (it does sound as though you will need to be a bit of a computer geek to participate, at least in this open beta test period!):

The Puppetry feature requires a project viewer and can only be used on supporting Regions. Download the project Viewer at the Alternate Viewers page. Regions with Puppetry support exist on the Second Life Preview Grid and are named Bunraku, Marionette, and Castelet.

When using the Puppetry Viewer in one of those regions, if someone there is sending Puppetry data, you should see their avatar animated accordingly. Controlling your own avatar with Puppetry takes a bit more work to set up: you need a working Python3 installation, a plug-in script to run, and any Python modules it requires. If you are interested and adventurous, please give it a try. More detailed instructions can be found on the Puppetry Development page.

We look forward to seeing what our creators do with the new Puppetry technology. Compared to other features we have introduced, it’s quite experimental and rough around the edges, so please be patient!  We will keep refining it, but before we go further we wanted to get our residents’ thoughts.

We will be hosting an open discussion inworld on Thursday, September 8, at 1:00 p.m. SLT at the Bunraku, Marionette, and Castelet regions on the Preview Grid. We’re also happy to talk about this at the upcoming Server User Group or Content Creator meetings. Come by, let us know what you think, and hear about our future plans!
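If you are wondering what such a Python plug-in script might look like in outline, here is a hypothetical sketch that waves the avatar’s left wrist back and forth by streaming a moving target position to the Viewer. Everything here — the pump name, the bone name, and the data layout — is a guess for illustration purposes; the real modules and message formats are described on the Puppetry Development page.

```python
import math
import sys
import time

def send_to_viewer(pump, data):
    # LEAP-style framing as in the earlier sketch: length, colon, LLSD notation.
    body = "{{'pump':'{}','data':{}}}".format(pump, data)
    sys.stdout.write("{}:{}".format(len(body), body))
    sys.stdout.flush()

def main():
    # Stream a sinusoidal target for the left wrist for ten seconds;
    # a real plug-in would compute targets from webcam tracking instead.
    start = time.time()
    while time.time() - start < 10.0:
        t = time.time() - start
        x = 0.3 * math.sin(2.0 * t)
        # 'mWristLeft' and the 'position' layout are illustrative guesses.
        data = "{{'mWristLeft':{{'position':[{:.3f},0.2,0.4]}}}}".format(x)
        send_to_viewer('puppetry', data)
        time.sleep(1.0 / 15.0)  # roughly 15 target updates per second

if __name__ == '__main__':
    main()
```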

I, for one, am quite excited to test this new feature out!

Liked it? Then please consider supporting Ryan Schultz on Patreon! Even as little as US$1 a month unlocks exclusive patron benefits. Thank you!