Leap Motion's Interaction Engine Brings Natural Gestures into Virtual Worlds

Next Reality VR

Leap Motion brought gesture control to virtual reality (among many other applications) early on, but developers must build support for its tracking peripheral into their apps to use its full potential. To that end, the company has created an "Interaction Engine" for Unity, the primary platform for developing virtual and mixed reality experiences, aiming to take gesture interaction to the next level.

11 New #3DJam Demos: Droids and Discovery

Leap Motion

“I’ve worked on front-end web applications, middleware, server software, and databases, but the most fun I’ve had in recent years has been with the Unity game engine. I’ve been using Unity for over five years to develop 3D simulations, virtual world applications, and data visualization solutions, and more recently for game projects.”

Designed by a team of students at Game-U ( @gameu_nj ), Nerves is an intense visit to the operating room for a motion-control surgeon.


Open Source Augmented Reality?

VRGuy

Examples of such peripherals include head trackers, hand and finger sensors (like Leap Motion and SoftKinetic), gesture control devices (such as the Myo armband and the Nod ring), cameras, eye trackers, and many others. The goal is to provide optimized connectors to popular engines such as Unity and Unreal.

New 4K ‘Spatial Reality Display’ From Sony Has Glasses-Free 3D

Upload VR

A “powerful” Windows PC running either Unity or Unreal Engine is required to create the 3D content; Mac support is expected in the future.
