VR Interaction Frameworks vs Interaction Builder

--

The market for VR creation tools is constantly expanding with more and more valuable offers, and it can be difficult to compare the different products and find the one that best suits your project. To illustrate this point, we will compare two similar products you can use to create interactions for VR content: the VR Interaction Framework, a well-reviewed Unity plugin known for being user friendly, and the Interaction Builder from the Interhaptics Suite.

Why these two?

Because they are similar in many respects, such as framework complexity, coding complexity, and the value they aim to provide to developers. Both are built on three pillars: defining an interactive/grabbable object, defining the body part/grabber, and handling the events related to each interaction. This structure is fundamental for hand interactions in virtual reality, such as hand tracking, snapping, and grabbing. Whether you use the Interaction Builder or the VR Interaction Framework, you will not need deep coding knowledge to understand and create interactions. However, you always have API/framework access to go more in-depth during your development process.
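
To make those three pillars concrete, here is a minimal, hypothetical Unity C# sketch of the pattern. The class names, events, and methods below are purely illustrative and are not the actual API of the VR Interaction Framework or the Interaction Builder.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Pillar 1: something that can be grabbed (illustrative only).
public class GrabbableSketch : MonoBehaviour
{
    // Pillar 3: events fired around the interaction.
    public UnityEvent onGrabbed;
    public UnityEvent onReleased;

    public void NotifyGrabbed()  => onGrabbed?.Invoke();
    public void NotifyReleased() => onReleased?.Invoke();
}

// Pillar 2: the body part / grabber that drives the interaction.
public class GrabberSketch : MonoBehaviour
{
    GrabbableSketch held;

    // Called by whatever input layer you use (button press, pinch, grab strength...).
    public void TryGrab(GrabbableSketch target)
    {
        held = target;
        held.NotifyGrabbed();
    }

    public void Release()
    {
        if (held == null) return;
        held.NotifyReleased();
        held = null;
    }
}
```

Both products wrap this kind of structure for you; the difference lies in how the grab is detected and how the object behaves once held.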

Specifications

Of course, it would be misleading to say that they are identical. Some features look similar but are not built the same way or for the same purpose, and some features are completely different.

Starting with the VR Interaction Framework, you will find an easy-to-use grabber/grabbable system. It is not optimized for precise hand interactions but works well for any interaction based on grabbing. Also, since it is not focused on hands, there is a dissociation between “what is a hand” and “what can grab,” which gives some flexibility. The VR Interaction Framework mainly uses button inputs to trigger interactions and events during interactions. It is also compatible with Oculus Quest hand tracking (specifically, the pinch input simulates the grab). The counterpart is a loss of consistency between hand-tracking input and controller input. The interactions are half logic-based and half physics-based: interactions are triggered logically (with a specific input/value), but the behavior of the object is managed by the physics engine, which requires some extra preparation on the object itself with physics components such as joints.
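
As a rough illustration of this “triggered by logic, driven by physics” approach, the sketch below attaches a FixedJoint on a button press and removes it on release. The input axis, fields, and overlap detection are placeholders of our own, not code from the VR Interaction Framework.

```csharp
using UnityEngine;

// Illustrative sketch: a button press starts the grab, but the held object
// keeps behaving through the physics engine via a joint.
public class PhysicsGrabSketch : MonoBehaviour
{
    public Rigidbody handRigidbody;   // kinematic rigidbody following the controller
    FixedJoint joint;
    Rigidbody hovered;                // set by your own trigger/overlap detection

    void Update()
    {
        // Placeholder input: swap in your controller's grab button or pinch value.
        if (Input.GetButtonDown("Grab") && hovered != null && joint == null)
        {
            // The object needs physics preparation up front (Rigidbody, colliders).
            joint = hovered.gameObject.AddComponent<FixedJoint>();
            joint.connectedBody = handRigidbody;
        }

        if (Input.GetButtonUp("Grab") && joint != null)
        {
            Destroy(joint);           // releasing hands control back to the physics engine
            joint = null;
        }
    }
}
```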

On the other hand, you will find the Interaction Builder. It focuses on hand interactions at a detailed level, which means a better definition of the palm and fingers for precise interactions based on body parts. Because it works around the hands themselves, the Interaction Builder does not use button inputs to trigger interactions or events, but a grab strength computed from the skeleton of the hand. This gives a realistic representation of a “grab” and makes these interactions work consistently with both hand tracking and controllers. Everything is computed in a logic engine: no complex physics components are required to create an interactive object (usually you would use physics joints and configure them).
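
The idea of a skeleton-driven grab strength can be sketched roughly like this. The distance-based computation and the threshold values below are invented for illustration; they are not the Interaction Builder’s actual algorithm.

```csharp
using UnityEngine;

// Rough illustration of a "grab strength" derived from the hand skeleton
// instead of a button. Values and computation are illustrative only.
public class GrabStrengthSketch : MonoBehaviour
{
    [Tooltip("Finger tip transforms, from hand tracking or a controller-driven hand rig.")]
    public Transform[] fingerTips;
    public Transform palm;

    [Range(0f, 1f)] public float grabThreshold = 0.7f;

    public float GrabStrength { get; private set; }
    public bool IsGrabbing => GrabStrength >= grabThreshold;

    void Update()
    {
        if (fingerTips == null || fingerTips.Length == 0 || palm == null) return;

        // Map the average tip-to-palm distance to a 0..1 strength:
        // a closed fist brings the tips near the palm, so strength approaches 1.
        float sum = 0f;
        foreach (var tip in fingerTips)
            sum += Vector3.Distance(tip.position, palm.position);
        float avg = sum / fingerTips.Length;

        // 0.04 m (closed) .. 0.12 m (open) are illustrative values.
        GrabStrength = Mathf.InverseLerp(0.12f, 0.04f, avg);
    }
}
```

Because the input is just a set of transforms, the same computation works whether the skeleton is fed by hand tracking or by a controller-driven hand rig, which is what keeps the behavior consistent across both.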

Each interaction is triggered logically, and the behavior of the object in 3D space is also managed logically, so you just need to apply the script and the object will behave as it should.
A small detail that actually matters for user experience is the differentiation between big and small objects. It is a necessity, since we do not grab objects the same way depending on their size (find more details about the different ways to grab here).
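
As a hypothetical example of how object size could drive the grab style, the sketch below picks a fingertip pinch for small objects and a palm grab for larger ones. The cutoff value and the helper names are made up for illustration and are not taken from either product.

```csharp
using UnityEngine;

// Hypothetical size-based grab style selection (illustrative only).
public enum GrabStyle { Pinch, PalmGrab }

public static class GrabStyleSelector
{
    const float SmallObjectMaxSize = 0.06f; // metres, arbitrary example cutoff

    public static GrabStyle Select(Collider target)
    {
        // Use the largest dimension of the collider's world bounds as a size proxy.
        Vector3 size = target.bounds.size;
        float largest = Mathf.Max(size.x, size.y, size.z);
        return largest <= SmallObjectMaxSize ? GrabStyle.Pinch : GrabStyle.PalmGrab;
    }
}
```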

Comparison table

If your project does not require fully realistic interactions and has more of a gaming approach, the VR Interaction Framework will easily get you there, especially if you want to work fully in a physics-based environment. However, because it is designed around the hand and relies on a logic engine for stability, the Interaction Builder is the default choice for any serious game, business, or professional training (i.e. any VR content requiring realistic hand interactions or built around hand tracking). Also, if you do not want to learn and spend time on the physics side of interactions, the Interaction Builder will handle everything for you in its logic engine.

Test the Interaction Builder right now by downloading the Interhaptics Suite. Check our latest blog post here for more news.

--

Interhaptics is a development suite designed to build realistic, human-like interactions as well as haptic feedback for 3D applications in XR.