Leap Motion has announced early access to the beta of its Interaction Engine, a set of systems designed to help developers implement compelling, realistic and hopefully frustration-free input with just their hands.

If you’ve ever spent time with a physics-sandbox title, you’ll know that a large part of the appeal is the satisfaction and freedom of playing within a virtual world that behaves somewhat like reality – with none of the real-world restrictions applied. But this presents myriad problems, not least of which is that those real-world modelled physics break down when physicality is removed. Without physical boundaries in place, objects on the virtual plane will behave according to the digital physics model, right up to the point you accidentally put your digital self through said objects – at which point things kinda break down.


These issues are particularly acute when it comes to integrating naturalistic hand interaction with a digital space and its objects, for example in VR. Bridging the “gray area” between accuracy and what ‘feels good’ to a human being is part of that elusive magic when you encounter an input interface that just works. More specifically, in the case of VR, that bridging involves implementing an alternative set of rules when a player connects with and grasps a virtual object in 3D space, bending reality’s rules in favour of a visual experience that more closely matches our expectations of what should happen.
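In practice, that rule-bending often means letting what the player sees diverge from what the physics engine computes. A minimal sketch of the idea in Unity C# (entirely hypothetical, not Leap Motion’s actual implementation) might look like this:

```csharp
using UnityEngine;

// Hypothetical sketch of the "alternative set of rules" described above:
// while an object is grasped, the rendered mesh follows the hand exactly,
// while the underlying physics body keeps simulating. The player sees the
// grab they expect even when strict physics would disagree.
public class GraspVisualBlend : MonoBehaviour
{
    public Transform handAnchor;    // hand pose from the tracker (assumed input)
    public Transform renderedMesh;  // child transform carrying the visible mesh
    public Rigidbody body;          // the object's "honest" physics representation
    public bool isGrasped;          // driven by whatever grasp detector is in use

    void LateUpdate()
    {
        // Snap visuals to the hand while grasped; otherwise show the
        // physics result. Blending between the two hides the seam.
        Vector3 targetPos = isGrasped ? handAnchor.position : body.position;
        Quaternion targetRot = isGrasped ? handAnchor.rotation : body.rotation;
        renderedMesh.position = Vector3.Lerp(renderedMesh.position, targetPos, 0.5f);
        renderedMesh.rotation = Quaternion.Slerp(renderedMesh.rotation, targetRot, 0.5f);
    }
}
```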
These are all issues that Leap Motion, the company best known for its depth-sensing peripheral of the same name, have been grappling with for many months now, and their Interaction Engine aims to remove a lot of the pain for developers by providing a framework that “exists between the Unity game engine and real-world hand physics.”
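Concretely, in Unity that layering looks something like the sketch below. The component names follow Leap Motion’s published Unity assets, but treat the exact beta API surface here as an assumption rather than gospel:

```csharp
using UnityEngine;
// Namespace as shipped in Leap Motion's Unity assets (assumed for the beta).
using Leap.Unity.Interaction;

// Sketch of wiring an object into the Interaction Engine: the engine sits
// between Unity's physics and the tracked hands, so (in principle) all a
// developer adds is a rigidbody plus an interaction behaviour component.
// An InteractionManager is assumed to already exist in the scene.
public class InteractionSetup : MonoBehaviour
{
    void Start()
    {
        var rb = gameObject.AddComponent<Rigidbody>();
        rb.useGravity = true;

        // InteractionBehaviour registers the object with the scene's
        // manager, which handles the grasp detection and hand/physics
        // mediation described in the article.
        gameObject.AddComponent<InteractionBehaviour>();
    }
}
```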
The last time we encountered Leap Motion, they showed us the first glimpses of their work to boil down such an enormously complex set of problems into something that developers can interface with easily. At CES in January, the Leap Motion team let us get our hands on Orion with an early version of their Interaction Engine – a significant milestone for the company in terms of their overall tracking framework, with impressive reductions in tracking latency and improvements in the system’s ability to handle difficult hand tracking issues.


Leap Motion’s release of the Interaction Engine beta completes another piece of the peripheral-free VR input puzzle that the company has dedicated itself to over the last couple of years.

“The Interaction Engine is designed to handle object behaviors as well as detect whether an object is being grasped,” reads a recent blog post introducing the Interaction Engine. “This makes it possible to pick things up and hold them in a way that feels truly solid. It also uses a secondary real-time physics representation of the hands, opening up more subtle interactions.”
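From a developer’s perspective, that grasp handling could plausibly be consumed as a pair of lifecycle callbacks; the hook names below are assumptions for illustration, not the beta’s documented API:

```csharp
using UnityEngine;

// Hypothetical grasp lifecycle: a grasp detector (e.g. the Interaction
// Engine) tells us when the hand closes around the object; we then hand
// the object over to the grasp and back to the physics solver on release.
public class GraspResponder : MonoBehaviour
{
    private Rigidbody body;

    void Awake()
    {
        body = GetComponent<Rigidbody>();
    }

    // Called by the grasp-detection layer when a grasp begins (assumed hook).
    public void OnGraspBegin()
    {
        // Take the object out of the solver's control so it can't be
        // knocked loose by the hand's own colliders.
        body.isKinematic = true;
    }

    // Called when the grasp ends (assumed hook).
    public void OnGraspEnd(Vector3 releaseVelocity)
    {
        // Return control to physics, inheriting the hand's velocity so
        // throws feel natural.
        body.isKinematic = false;
        body.velocity = releaseVelocity;
    }
}
```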

Leap Motion have always had a knack for presenting the complex ideas involved in their work in a visual way that is immediately graspable by the viewer. These latest demos illustrate the user-friendly fuzzy logic Leap Motion believe strikes a neat balance between believable virtual reality and frustration-free human-digital interaction.

The Interaction Engine represents another milestone for Leap Motion on its quest to achieve hardware-free, truly human input. And if you’re a developer, it’s something you can get your hands on right now: the beta is available for download here, you can read all about it here, and chat with others doing the same.




  • Bryan Ischo

    The fidelity just isn’t high enough for sustained presence. It can still be fun though; the Leap Motion demos were really enjoyable on the DK2.

    Watching YouTube videos of Onward with people’s models doing all kinds of weird contortions is fun and nobody seems to mind while playing. Then again, if it starts to impact the fidelity of the game control vs. just being a funny model glitch, it becomes unacceptable.

    • Klasodeth

      My biggest problem with Leap Motion is that even if the hand tracking were somehow perfect, it all breaks down the moment the hands go out of view of the sensor. In various Vive games, I routinely perform actions that take my hands out of view. In Raw Data for instance, I reach over my shoulder to grab a sword, and reach down to a belt in order to grab or holster a pistol or magazine. Performing those actions and more with the constantly tracked Vive controllers is no problem when you’re not looking at your hands, but with Leap Motion, those same actions would be impossible.

      • Bryan Ischo

        Probably games could be designed to use normal hand tracking when your hands are outside of visual range since you’re not likely to be doing detailed finger work when you can’t see your hands. But it would definitely be a compromise and would require external sensors in addition to the Leap Motion.

All in all, while I do like the Leap Motion, I think that it won’t really make any significant inroads unless its fidelity improves 10x and the sensors can be placed externally, so that the issue you mentioned doesn’t have to be worked around.

      • dogbite

Yep.
        With what can be done in 360 roomscale with Touch and grip in VR gaming, players want to feel the things they see in their hands – guns, swords and such – and the haptic cues as well, so I doubt game adoption will be significant in general terms. There may be applications like simulators where it would be functional, though, such as how it marries with Fly Inside and Flightsim in VR. It would resolve issues like having to take your hands off, say, a yoke or stick and slip on another controller to push buttons and flip switches, etc. Since the sim in the gaming world with the most clickable cockpits (DCS World) hasn’t indicated any intention to go there, I suspect the applications may be in the commercial realm. Time will tell.

    • Leap Motion

      Have you had a chance to try the Orion beta software, especially some of the more recent versions?

      • Bryan Ischo

No, admittedly I haven’t tried it since I shelved my DK2 in favor of my Vive in early May. But now you’ve got me interested; I should fire it up again and see how much better things have gotten.

        I truly want the product to succeed, I think the promise is awesome, and it can be loads of fun to use as I have pointed out before.

      • OgreTactics

The Leap has been out for, what, 3 years? It’s been a great advancement, but isn’t there a new wider-angle and more accurate Leap to be released…?

        For me it’s always been a “perfect” and satisfactory solution to be integrated in VR, and it’s a huge mistake for market adoption not to have included it in the consumer headsets.

The reason being that nobody ever said that hand-motion had to be 1:1 real-life accurate from the beginning; there are tons of algorithms or tricks that could be used, like a semantic gesture library, or hand placement latency correction, etc…