
Snapchat Lens Creators Virtually Gather for Lens Fest 2020

Thousands of creators peeked into the future in store for Snapchat’s millions of users.

Snap held its annual Lens Fest remotely this year from December 8 to December 11. Lens Fest is an opportunity for Snapchat’s growing community of lens creators to interact, hear from other artists, and learn about new and upcoming updates to Snap’s software and hardware.

See Also:  Squad Partners With Snap to Create Augmented Reality Creative Outlets

The event focused on the depth sensing, machine learning, and 5G developments behind the most recent iterations of Lens Studio and Spectacles, the software and hardware of the world’s largest social AR platform. It also showcased some of the most popular and moving lenses created by Snapchat’s creator community.

“We are here together to celebrate you, our global community of lens creators,” said Snap’s Eitan Pilipski. “We are learning so much together, and we believe there is an exciting world ahead.”

Recent and Upcoming Updates to Lens Studio

Much of Lens Fest focused on the most recent updates to Lens Studio, which came out on the first day of the conference. These updates most directly affect Snapchat’s lens creators, but they will also reach Snapchat’s considerably larger non-creator userbase.

The updates include some comparatively simple additions, like resource and asset compression and log filters. They also bring visual scripting, which replaces lines of code with movable, connectable nodes.

“Visual scripting is a very cool feature which allows you to access all of the features of Lens Studio … but you don’t need to write a single line [of code],” said Snap’s Artem Yerofieiev. “It’s a powerful feature, especially if you are not very familiar with how scripting works.”
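
For a sense of what visual scripting replaces, here is a minimal sketch of a conventional text-based Lens Studio script (Lens Studio scripts are JavaScript). The “target” object is a hypothetical example; visual scripting would express the same logic as connected nodes rather than code:

    // Minimal text-based Lens Studio script (a sketch, not from the event):
    // toggles a scene object's visibility whenever the user taps the screen.
    // The "target" input is hypothetical and would be wired up in the Inspector.
    // @input SceneObject target
    script.createEvent("TapEvent").bind(function () {
        script.target.enabled = !script.target.enabled;
    });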

As there is no substitute for education, the updates also include new web resources such as building blocks, helper scripts, and new effects. There are also four new templates:

  • A Face Morph Template for using an external mesh to distort a user’s face;
  • A Configuration Template;
  • A Tween Template and menus (see the sketch after this list);
  • A Scavenger Hunt Template.
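
As one hedged example of how such a template is used in practice: the Tween Template ships with a TweenManager helper script, and, assuming that helper, a lens script might trigger a named tween on an object roughly like this. The “bounce” tween and the bouncyObject input here are hypothetical:

    // Hypothetical use of the Tween Template's TweenManager helper:
    // play a tween named "bounce" (set up in the template) when the lens starts.
    // @input SceneObject bouncyObject
    script.createEvent("TurnOnEvent").bind(function () {
        global.tweenManager.startTween(script.bouncyObject, "bounce", function () {
            print("bounce finished"); // optional completion callback
        });
    });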

Presenters also introduced features that are expected to be coming to Snapchat in the near future, including multi-user experiences, 3D full-body tracking, and Local Lenses – an expansion of the current Landmarkers feature.

“We really can’t wait to put the next generation of creator tools into your hands,” said Qi Pan, Snap’s Director of Computer Vision.

Machine Learning, Depth Sensing, and AR

Many of Snapchat’s recent and upcoming updates are the result of advances in two key fields: depth sensing and machine learning. The new Spectacles have two cameras, allowing for the same kind of depth sensing found in most gesture-tracking VR headsets. Even without Spectacles, newer phones also offer depth sensing.
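
To illustrate why two cameras enable depth sensing: under the textbook pinhole stereo model (an illustration, not Snap’s published pipeline), a point’s depth falls out of the disparity between the two views, as in this sketch:

    // Textbook pinhole stereo model: a point's depth is recovered from the
    // disparity (horizontal pixel offset) between two side-by-side cameras.
    // Illustrative sketch only, not Snap's actual depth pipeline.
    function stereoDepth(
        focalLengthPx: number, // focal length, in pixels
        baselineM: number,     // spacing between the two cameras, in meters
        disparityPx: number    // pixel offset of the point between the views
    ): number {
        if (disparityPx <= 0) return Infinity; // zero disparity = infinitely far
        return (focalLengthPx * baselineM) / disparityPx;
    }

    // A 700 px focal length, 6 cm baseline, and 30 px disparity place the
    // point (700 * 0.06) / 30 = 1.4 meters from the cameras.
    console.log(stereoDepth(700, 0.06, 30)); // 1.4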

“Nowadays, more and more devices have depth features and at Snap, we love to play around with the newest tech,” said Snap’s Vivian Su.

See Also:  Snap’s Lens Studio Gets Update for Full-Body Social AR Effects

Snap partnered with Apple on its most recent line of LiDAR-enabled devices, but it also offers developer tools for Android devices, which determine depth through a different method.

Simple applications only need to recognize depth, but understanding a scene on a deeper level allows for more immersive experiences.

“We’re using deep learning to automatically identify several common elements in images,” said Snap Computer Vision Engineer Ed Rosten. “Deep learning can also be used to transform an image by understanding what elements are present in an image and how they should behave.”
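
A rough sketch of the idea Rosten describes, with hypothetical types and names: once a model has labeled every pixel in a frame, an effect can transform just the region it cares about.

    // Illustrative sketch: given one class label per pixel (from a
    // segmentation model), tint only the pixels matching a target class.
    // All types and names here are hypothetical.
    type PixelClass = "sky" | "hair" | "ground" | "other";

    function tintRegion(
        rgba: Uint8ClampedArray,        // flat RGBA image buffer
        labels: PixelClass[],           // one class label per pixel
        target: PixelClass,
        tint: [number, number, number]
    ): void {
        for (let i = 0; i < labels.length; i++) {
            if (labels[i] !== target) continue;
            const p = i * 4;
            // blend 50/50 toward the tint color, leaving alpha alone
            rgba[p] = (rgba[p] + tint[0]) >> 1;
            rgba[p + 1] = (rgba[p + 1] + tint[1]) >> 1;
            rgba[p + 2] = (rgba[p + 2] + tint[2]) >> 1;
        }
    }

    // e.g. tintRegion(frame, mask, "sky", [255, 120, 60]) warms only the sky.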

Snapchat Lenses With Social Impact

Snapchat Lens creators aren’t (just) coders; they’re artists. And a recurring theme of the event was lenses crafted for social impact. A special segment, “Our Global Community: Lenses with Social Impact,” took a moment to discuss some of them.

“We have a great opportunity to use our lenses for good,” said Snap motion designer Marcio Lima. “Everybody has a culture, everybody has a uniqueness to them, and we need to shed light on that.”

Experiences showcased included lenses that help people understand and express themselves through masks, learn new languages with machine learning, and experience and participate in social justice movements.

“AR is a very useful tool to share a message while also making people a part of it; they aren’t just receiving it,” said lens creator Jimena Depresbiteris.

“Turn Your Camera Out Toward the World”

For all of the talk about machine learning, depth sensing, meshes, assets, and scripting, the human message sat at the heart of Lens Fest, with conversations always returning, in one form or another, to the humble camera.

“As we keep building our platform, we strive for the camera to be the starting point,” said Snap’s Bobby Murphy. “We want you to turn your camera out toward the world and envision how you will transform the world around you.”

Jon Jaehnig
Jon Jaehnig is a freelance journalist with special interest in emerging technologies. Jon has a degree in Scientific and Technical Communication from Michigan Technological University and lives in Michigan’s Upper Peninsula. If you have a story suggestion for Jon, you may contact him here.