Czech developer iNFINITE Production has released UVRF – a free, cross-platform template for hand presence in VR. The open-source demo offers a framework for use in any Unreal Engine project, as well as a ‘Playground’ scene containing an underground bunker and shooting range to showcase hand interactivity.

Detailed in a post on the Unreal Engine VR developer forum, UVRF’s framework aims to be a useful starting point for implementing hand presence in an Unreal-based VR experience, offering 17 grab animations to cover most objects, per-platform input mapping and logic, basic haptics, teleport locomotion using NavMesh (with rotation support on Rift), touch UI elements, and several other useful features. The framework is released under the CC0 license, meaning it can be used by anyone without restriction.
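For developers curious what that kind of plumbing involves, the minimal UE4 C++ sketch below shows one hand grabbing the nearest physics actor on a grip press and playing a short haptic pulse on success. It is purely illustrative and is not UVRF’s actual code; the class, input-action, and haptic-asset names are hypothetical, and the usual UCLASS()/GENERATED_BODY()/UPROPERTY() macros are omitted for brevity.

// Illustrative sketch only – not UVRF’s actual code. A minimal UE4 pawn that grabs
// the nearest physics actor with the right hand on a grip press and plays a short
// haptic pulse. Reflection macros omitted; names here are hypothetical.
#include "GameFramework/Pawn.h"
#include "GameFramework/PlayerController.h"
#include "Components/InputComponent.h"
#include "Components/PrimitiveComponent.h"
#include "MotionControllerComponent.h"
#include "Kismet/KismetSystemLibrary.h"
#include "Haptics/HapticFeedbackEffect_Base.h"
#include "Engine/EngineTypes.h"
#include "InputCoreTypes.h"

class AHandDemoPawn : public APawn
{
public:
    UMotionControllerComponent* RightController = nullptr; // MotionSource set to "Right"
    UHapticFeedbackEffect_Base* GrabHaptic = nullptr;      // haptic curve assigned in the editor
    AActor* HeldActor = nullptr;

    virtual void SetupPlayerInputComponent(UInputComponent* Input) override
    {
        Super::SetupPlayerInputComponent(Input);
        // "GrabRight" would be mapped per platform in the project's input settings
        // (Touch grip, Vive grip, etc.), which is the kind of mapping UVRF ships with.
        Input->BindAction("GrabRight", IE_Pressed,  this, &AHandDemoPawn::GrabRight);
        Input->BindAction("GrabRight", IE_Released, this, &AHandDemoPawn::ReleaseRight);
    }

    void GrabRight()
    {
        // Look for physics actors within ~10 cm of the right controller.
        TArray<TEnumAsByte<EObjectTypeQuery>> Types;
        Types.Add(UEngineTypes::ConvertToObjectType(ECC_PhysicsBody));
        TArray<AActor*> Overlaps;
        UKismetSystemLibrary::SphereOverlapActors(GetWorld(),
            RightController->GetComponentLocation(), 10.f, Types,
            nullptr, { this }, Overlaps);
        if (Overlaps.Num() == 0) return;

        HeldActor = Overlaps[0];
        if (UPrimitiveComponent* Root = Cast<UPrimitiveComponent>(HeldActor->GetRootComponent()))
            Root->SetSimulatePhysics(false); // stop simulating while the object is held
        HeldActor->AttachToComponent(RightController,
            FAttachmentTransformRules::KeepWorldTransform);

        // Basic haptic feedback on a successful grab.
        if (APlayerController* PC = Cast<APlayerController>(GetController()))
            PC->PlayHapticEffect(GrabHaptic, EControllerHand::Right);
    }

    void ReleaseRight()
    {
        if (!HeldActor) return;
        HeldActor->DetachFromActor(FDetachmentTransformRules::KeepWorldTransform);
        if (UPrimitiveComponent* Root = Cast<UPrimitiveComponent>(HeldActor->GetRootComponent()))
            Root->SetSimulatePhysics(true); // hand the object back to the physics engine
        HeldActor = nullptr;
    }
};

A framework like UVRF layers the pieces described above, such as the grab animations, per-platform bindings, and teleport logic, on top of this sort of low-level plumbing, which is precisely the boilerplate the template aims to spare newcomers from writing.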

In a message to Road to VR, Jan Horský of iNFINITE Production explained how the template could be particularly useful to new developers. “While Unreal does a very good job at making development accessible, building hands that properly animate, are properly positioned, with grabs and throws that feel natural and so on, is still not a trivial task,” he writes. “While it’s not a problem for experienced dev teams, it is a problem for newcomers. And they’re the ones that are likely to have ideas that will surprise us all. This little demo is an attempt to make VR development easier for them.”


The included ‘Playground’ demo shown in the video features a functional shooting range in an underground bunker littered with spare magazines to show off the multi-object interaction of reloading a gun, along with many other objects that highlight the hand animations.

The framework was originally developed as an internal prototyping tool at iNFINITE Production, but the team decided to share it with the world. “I expected such a project would come from companies that are more interested in VR growth like Oculus, Valve, or HTC,” says Horský. “It’s nearly a year since Touch was released and there is still no such thing publicly available, so we decided to take it into our own hands.”

You can download the template here.




  • Great that you have shared this! Will go take a look.

    I agree that the major VR suppliers should have sponsored something like this from the outset. It doesn’t even have to be engine-specific, so it could cover all engines and all platforms: mostly theory backed up with trials, a truly open project for the greater good of VR interfacing, so developers can start coding gameplay rather than reinventing the wheel when it comes to the technicalities of coding input.

    This would make a good website for somebody to put together: collate all this material, create a community, and offer (sponsored) challenges for users to come up with improved systems for specific areas.

    There are so many aspects of VR interaction, and they are always evolving too (wands to Knuckles, tracking, etc.).

    We have PSVR, Windows MR, mobile, and single entities like Vive/Rift/Pimax, and soon we will have pure AR.

    Then we have the different languages (C++, C#, Swift, etc.) along with engine SDKs.

    Different tracking systems: inside-out, outside-in, hand tracking, body tracking, eye tracking, head tracking, face tracking.

    Then all the different movements:
    Walking, running, climbing, swimming, crawling, jumping, sneaking, hiding, grappling, gliding, skiing, cycling, driving, interfacing, etc., and these can be done seated, standing, or with some other external gadget.

    We also have techniques to reduce motion sickness or rendering demand, like screen fading on faster motion, snap turning, horizon balancing, walling in, foveated rendering, time warps, and all the theory that goes with them. It would benefit everyone, this early in the VR era, if all of this were shared rather than closed.

    I realise I am rambling a bit here; maybe a forum/site like this already exists, and if not, then it should.

  • Justos

    What a great idea. I toyed with Unreal and the hands were pretty bad other than tracking position and orientation. This will help a lot of people!

  • Firestorm185

    Can’t wait until something like this comes to Unity. I have UE4 but have much more experience in Unity. >w<

    • Walextheone

      Same here, same here

      • Firestorm185

        Yeah, after all, VRTK is on Unity. XD

  • Lucidfeuer

    I guess they can thank Ready At Dawn for Lone Echo. And we can thank them for making it open-source.

    • Guygasm

      This doesn’t even come close to the hand IK that RAD has implemented.

      • ENiKS

        Yeah, the goal was to create something more like Toybox or First Contact than Lone Echo. With environment detection and reactions to it (haptics, IK) being the #1 requested feature, it’s literally what I’m working on right now :)
        Jan Horsky aka ENiKS
        PS. Thanks for sharing, Road to VR

        • Lucidfeuer

          By environment detection, do you mean real-world physical object interactions? And for haptics, is it tied into the Oculus Touch or Vive wand SDKs?

      • Lucidfeuer

        Not the same budget or scope; the fact that it is a free and open-source tool is more important, as it can be iterated on concurrently.

  • Bramagola

    because nothing says “reality” like when my hand snaps to my keyboard, or cellphone, or chair, or zipper….

    I don’t think you know what “realistic” means.

    • stunsound

      Seriously, what are you hoping to contribute with your comment? Firstly, the title says “more realistic,” which it definitely is compared to the default UE4 hands. Secondly, this is an open-source, free-to-use, out-of-the-goodness-of-their-own-hearts contribution to the indie VR dev community. They don’t owe you, me, or anyone else out there anything. If you don’t like it, don’t use it. But whatever dude, if s**ting on other (more talented) people’s work makes you feel better about yourself, then have at it…

  • Abigail Svardington

    It is high time