Highlights From the 2021 Snap Partner Summit

ARPost

Lens creators also have access to new machine learning capabilities, including 3D Body Mesh and Cloth Simulation, as well as reactive audio. In addition to recognition of over 500 object categories, Snap gives lens creators the ability to import their own custom machine learning models. The summit also showcased LEGO Connected Lenses.

Top 5 Use Cases for Hand and Eye Tracking Technology

XR Today - Virtual Reality

In place of clunky plastic controllers and remotes, we’re seeing the rise of hand and eye tracking solutions, allowing users to move more freely through the digital world. The data collection capabilities of eye and hand tracking tools are particularly beneficial for the retail and marketing sectors.

ManoMotion Brings Hand Gesture Input to Apple’s ARKit

Road to VR

ManoMotion, a computer vision and machine learning company, today announced that it has integrated its smartphone-based gesture control with Apple’s augmented reality developer tool ARKit, making it possible to bring basic hand tracking into AR using only the smartphone’s onboard processor and camera.
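
A minimal sketch of how such an integration could plug into ARKit’s camera feed, assuming a hypothetical detectGesture(in:) helper in place of ManoMotion’s SDK call (the article does not describe the actual API):

import ARKit

// Minimal sketch (not ManoMotion's actual API): feeding ARKit camera frames
// to a gesture-recognition step that runs entirely on the device.
final class GestureARSession: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        session.delegate = self
        session.run(ARWorldTrackingConfiguration())
    }

    // ARKit delivers each camera frame here; capturedImage is the raw
    // pixel buffer a smartphone-only hand tracker would analyze.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        detectGesture(in: frame.capturedImage)
    }

    // Hypothetical stand-in for the SDK's gesture classifier.
    private func detectGesture(in pixelBuffer: CVPixelBuffer) {
        // e.g. recognize pinch / grab / point and drive AR content accordingly.
    }
}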

WebAR to Wearables: AR’s 2020 Outlook

AR Insider

Spatial computing made its mark as a tool for learning, training, and development: as we’ve become increasingly familiar with AR’s positive effects on attention and memory encoding, it was exciting to see its adoption expand outside of a marketing context.

AR and MR Headsets and Glasses 2019 and 2020 Overview

ARPost

Microsoft used machine learning to design the MR headset to be as usable as possible, even though it works entirely through gesture controls. Each has a unique configuration of software and hardware, including external tools and sensors for different jobs.

How Brain-Computer Interfaces Can Deliver On VR’s Promises

UploadVR Between Realities podcast

But this poses a tricky problem for MR headsets: how should users interact with a machine they’re wearing on their faces? Gribetz often speaks of a “zero learning curve” computer, a machine so intuitive that you’ve always known how to use it. Gesture controls preclude hands-free operation.

How Are XR Firms Leveraging Artificial Intelligence? 

XR Today - Mixed Reality

AI allows users to interact with hardware and software across the XR landscape more effectively, paving the way for everything from gesture control to haptic feedback. In a training scenario, for example, a customer avatar could respond differently to each action the agent takes, allowing for a more realistic learning experience.