How to Analyze Virtual Reality (VR) Heatmaps and Analytics

Many organizations now use virtual reality as an integral part of their business. They’re making VR experiences for employee training, sales/marketing presentations, recruiting, communications and more. As VR applications take on a more prominent role in organizations, so too does the data generated from the apps — specifically heatmaps and analytics.

InstaVR provides a wealth of data to our Pro and Enterprise subscribers. Heatmaps are not only visually displayed in the Console, but also exportable and able to be parsed by date range. Analytics covers just about every conceivable data point — views, device types, scenes viewed, hotspots/navigation links interacted with, and the time of each of these interactions. And analytics, just like heatmaps, is fully exportable for sharing with others in your organization.

Below, we'll do a deep dive into VR heatmaps and analytics to help you maximize your use of these powerful tools. We're also proud to announce our new InstaVR Success Report — a weekly email all InstaVR Pro and InstaVR Enterprise users receive with a summary of the activity related to their VR applications. This automated report gives you an executive-level view of your VR initiatives.

If you have any questions on VR heatmaps or analytics, don’t hesitate to reach out to our Sales or Customer Success teams. Enjoy!

How to Analyze VR Heatmaps

How Heatmaps Work

We released Version 2 of our heatmaps feature in late 2017. The largest change was that we broke the heatmap panorama visually into cells — squares arranged in a 7 x 12 grid — to more easily display user focus. This change also allows for exporting to a grid CSV, something you can do in addition to exporting as an image file.

Heatmaps can be accessed through your main InstaVR Console by clicking the “Heatmap” button in the upper right. Heatmaps can be created for both 360-degree images and videos. The numbers and colors correlate to aggregate focus, on a 1-10 scale. If one user looks at a part of a scene for 1 second, for example, and a second user looks at the same area for 10 seconds, that adds up to 11 seconds of raw focus time. The 1-10 scale then normalizes those raw totals to a common scale.
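
To make that math concrete, here is a minimal Python sketch of this kind of per-cell aggregation and normalization. The 7 x 12 grid shape and the 1-10 scale come from above; the aggregation logic itself is our illustrative assumption, not InstaVR's internal implementation.

# Minimal sketch: sum gaze seconds per grid cell across users, then rescale
# the totals onto a common 1-10 scale. Illustrative only.

ROWS, COLS = 7, 12

def aggregate_gaze(sessions):
    """Sum gaze seconds per grid cell across all user sessions.
    `sessions` is a list of dicts mapping (row, col) -> seconds looked."""
    totals = {(r, c): 0.0 for r in range(ROWS) for c in range(COLS)}
    for session in sessions:
        for cell, seconds in session.items():
            totals[cell] += seconds
    return totals

def normalize_to_scale(totals, low=1, high=10):
    """Rescale raw per-cell totals onto the 1-10 scale."""
    max_total = max(totals.values()) or 1.0
    return {cell: low + (high - low) * (t / max_total) for cell, t in totals.items()}

# Example: one user looks at a cell for 1 second, a second user for 10 seconds.
sessions = [{(3, 5): 1.0}, {(3, 5): 10.0, (0, 0): 2.0}]
scores = normalize_to_scale(aggregate_gaze(sessions))
print(round(scores[(3, 5)], 1))  # the most-viewed cell maps to 10.0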

InstaVR Pro and Enterprise users can also use custom date ranges for their heatmaps. They can select to visualize gaze focus over a day, a week, a month, or a custom date range they define.

Note: each heatmap is project-specific and only accumulates data when the user's device or headset is online and able to pass data back to us.

How to Analyze the Heatmap Data

Quick story: when we rolled out VR180 compatibility last year, we looked at heatmaps to determine how effective VR180 as a format would be, particularly for employee training. The answer is “very”! We know this because the heatmaps showed much more intense focus on the front 180-degree field of view of a scene — based on the initial focal point — and considerably less in the back 180-degree FOV. This matches up with what we intuitively thought, and confirms our belief that in a 360-degree scene, creators should spend more time perfecting the action that takes place in the immediate front viewing area.

Back to heatmap analysis…

There are many things you can do with a heatmap, including:

– Use it to improve your VR. If you're doing employee training and employees consistently are not looking at an important area of a scene, you need to call attention to it, either through a hotspot or voiceover narration, or by starting the initial POV on the area that is not getting enough attention (a minimal sketch of this kind of check appears after this list).

– Use it to test employees and viewers. We see many academic and corporate researchers who use InstaVR accessing the heatmap feature. Why? Because it provides a lot of information on where viewer attention is. If you're training new doctors to avoid distractions through VR, like they're doing at Stanford University School of Medicine, you can see which of the distractions elicit the most focus vs. the least. Then, in future trainings, you can really focus on getting the doctor to ignore the most attention-grabbing distractions. Another similar use case we've seen is consumer products: you can test different packaging and store layouts to see if your products get more attention in a simulated VR shopping experience.

Ultimately, heatmaps are better than self-reporting by users. They give you actual data on where people are looking. Oftentimes, people don't even notice what is getting their attention while in a VR experience, but the heatmaps will show it.
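
Because the heatmap grid can be exported to CSV, you can also flag under-viewed areas programmatically. Here is a minimal Python sketch of that idea; the file layout it assumes (7 rows by 12 columns of 1-10 focus scores, no header) is hypothetical, so adjust it to match your actual export.

# Minimal sketch: flag under-viewed grid cells in an exported heatmap CSV so
# you know where a hotspot or voiceover cue might be needed. Assumes a
# headerless 7 x 12 grid of 1-10 focus scores, which is an illustrative format.
import csv

def low_attention_cells(path, threshold=3):
    """Return (row, col, score) for every grid cell below the focus threshold."""
    flagged = []
    with open(path, newline="") as f:
        for row_idx, row in enumerate(csv.reader(f)):
            for col_idx, value in enumerate(row):
                score = float(value)
                if score < threshold:
                    flagged.append((row_idx, col_idx, score))
    return flagged

# Example usage (file name is hypothetical):
# for r, c, score in low_attention_cells("heatmap_export.csv"):
#     print(f"cell ({r},{c}) only scored {score} -- consider a hotspot or narration cue")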

How to Analyze VR Analytics

How Analytics Work

Any time a user accesses your VR project created with InstaVR — as long as the device connects to the Internet — the device pings back a wealth of data about their interaction. If you've ever used web analytics, this works in a very similar way.

You can access your analytics on a per project basis by selecting “Report” from the drop-down in the upper right of your Console.

The analytics section encompasses: Number of Devices, Type of Device or Operating System, Number of Scenes Viewed, Total Viewing Hours, and then every single action taken by users (Scenes, Navigation Links, Hotspots, Calls-to-Action). All of the interaction data is timestamped, so if you know who was using your app at a certain time, you can pinpoint their actions exactly (more on that later).

Like with heatmaps, Pro and Enterprise users can use customized date ranges for the analytics — daily, weekly, monthly, or custom. The full report can also be exported to CSV for offline analysis.
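
Once the report is exported to CSV, a short script can summarize it offline. The Python sketch below uses assumed column names ("device_type", "action", "target"); map them to whatever headers your actual export uses.

# Minimal sketch: offline summary of an exported analytics CSV. Column names
# are assumptions for illustration, not a documented InstaVR export schema.
import csv
from collections import Counter

def summarize(path):
    """Count plays by device type and views by scene from the exported report."""
    devices, scenes = Counter(), Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            devices[row["device_type"]] += 1
            if row["action"] == "scene_view":
                scenes[row["target"]] += 1
    return devices, scenes

# Example usage (file name is hypothetical):
# devices, scenes = summarize("project_report.csv")
# print("Plays by device:", devices.most_common())
# print("Most-viewed scenes:", scenes.most_common(5))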

How to Analyze Your VR Analytics

We’re presenting so much data to you in Reports. But how can you take action on it? The following are just a few of the most popular things to analyze within this section of the Console.

– What devices are being used to access your VR? Looking at the device types helps you to understand how your audiences access your app. For example, if you’ve posted your completed app to iTunes, Google Play, Web, and the Oculus Store — you can see exactly which of these channels gets the most plays. Based on that data, you can then come up with a more focused marketing strategy.

You can also make decisions based on device type. For instance, as we’ve discussed previously, WebVR is a much better fit for 360-degree images than 360-degree videos, currently. So if you see most of your users on Web (vs. mobile app), you might focus more of your VR development on 360-degree images. Or the reverse could be true, and if app downloads are popular for you, you can concentrate more on developing 360-degree videos.

– How effective is the layout of your VR? What scenes do people want to view? The very granular interaction data gives two important indicators: how well you've laid out your VR experience and which scenes are popular. For instance, if you have an intended path you'd like users to take (Scene 1 -> Scene 2 -> Scene 3 -> Scene 4), you can see where in the funnel they're dropping out (see the sketch after this item). That indicates to you either that navigation is too hard or that you're losing the interest of your users partway through.

You can also see which specific scenes people are interested in. So, for instance, if you’ve built a travel VR app where the user can initially choose between visiting Hawaii, Alaska, Maine, or California, you can understand which choice is the most popular. (side note: If you haven’t read our Tui Group case study, it’s worth your time!)
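
As a rough illustration of that funnel check, here is a minimal Python sketch. The per-session scene lists are assumed to come from your exported interaction data; the scene names are illustrative.

# Minimal sketch: count how many sessions reached each step of an intended
# scene path. Session data and scene names are illustrative assumptions.

FUNNEL = ["Scene 1", "Scene 2", "Scene 3", "Scene 4"]

def funnel_dropoff(sessions):
    """Count how many sessions reached each step of the intended path."""
    reached = {scene: 0 for scene in FUNNEL}
    for scenes_viewed in sessions:
        for scene in FUNNEL:
            if scene in scenes_viewed:
                reached[scene] += 1
            else:
                break  # user dropped out before this step
    return reached

sessions = [
    ["Scene 1", "Scene 2", "Scene 3", "Scene 4"],
    ["Scene 1", "Scene 2"],
    ["Scene 1"],
]
for scene, count in funnel_dropoff(sessions).items():
    print(f"{scene}: {count}/{len(sessions)} sessions")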

– Testing employees. With VR more and more being used for employee training, we often get asked how you can incorporate questions into scenes and analyze the answers. Through analytics! If you create questions using either Hotspot images or text — and you properly label your Hotspot file names — you can see exactly what answers were given to each question. We'll have a full step-by-step guide for setting up questions, and collecting answers, on the site next week.
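
Until that guide is up, here is a minimal Python sketch of how answer tallying could work offline, assuming you label your answer hotspots with a convention like "q1_answer_a.png". That naming convention and the record format are our assumptions for illustration, not a documented InstaVR scheme.

# Minimal sketch: tally quiz answers from clicked hotspot file names.
# Assumes hotspot names follow a hypothetical "q1_answer_a.png" convention.
from collections import defaultdict, Counter

def tally_answers(hotspot_clicks):
    """Group clicked hotspot file names like 'q1_answer_b.png' by question."""
    results = defaultdict(Counter)
    for name in hotspot_clicks:
        stem = name.rsplit(".", 1)[0]            # drop the file extension
        question, _, answer = stem.partition("_answer_")
        if answer:                               # ignore non-question hotspots
            results[question][answer] += 1
    return results

clicks = ["q1_answer_a.png", "q1_answer_b.png", "q1_answer_a.png", "q2_answer_c.png"]
for question, answers in tally_answers(clicks).items():
    print(question, dict(answers))
# q1 {'a': 2, 'b': 1}
# q2 {'c': 1}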

*header image courtesy of Dr. Dean Whitcombe, University of South Wales
