Limitless, a company developing content and tools for creating cinematic VR experiences, has joined Lytro to build out tools for combining light-fields and real-time rendering directly in game engines.

As we noted recently, Lytro is positioning its light-field tech as VR’s master capture format. With pipelines like Immerge and Volume Tracer in place for capturing live-action light-fields and generating entirely synthetic ones, the natural next step is to let creators easily combine the two, along with real-time computer graphics.

In taking that next step, Lytro has welcomed the team from Limitless into the company. Limitless was behind narrative VR content like Reaping Rewards, which was built on a toolset designed to make it possible to animate on the fly, directly inside of VR. Now part of Lytro, the Limitless team is helping to build out the company’s game engine toolset, which Lytro says will allow users to seamlessly blend light-fields with real-time rendered content.

SEE ALSO
Exclusive: Lytro Reveals Immerge 2.0 Light-field Camera with Improved Quality, Faster Captures

With integrations in the works for both Unity and Unreal Engine, Lytro’s goal is to make it easy for its customers to leverage the advantages of light-fields without giving up what real-time rendered content does best: interactivity.

Light-fields can capture high-quality volumetric video of real-world scenes, or store pre-rendered CGI visuals that go far beyond what can be rendered in real-time. The downside is that, because light-fields are pre-captured or pre-rendered, they can’t change in response to input, which means they can’t support the level of interactivity that real-time rendered content can: throwing a ball in an arbitrary direction and having it bounce off the floor, for instance, or rendering a character that reacts to the user’s actions. That is to say, light-fields can work great for some things, but not everything.
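
To make that tradeoff concrete, here’s a minimal, engine-agnostic sketch of the basic idea behind blending the two; all types and names here are hypothetical illustrations, not Lytro’s actual pipeline. If each pixel of both layers carries a depth value, a fixed light-field layer and a changing real-time layer can be merged by keeping whichever sample is closer to the camera:

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Hypothetical per-pixel sample: packed color plus distance from the camera.
struct Sample {
    std::uint32_t rgba;   // packed 8-bit-per-channel color
    float         depth;  // distance along the view ray; smaller = closer
};

// Merge a pre-rendered light-field layer with a real-time layer.
// The light-field layer is fixed at capture/render time; the real-time
// layer changes every frame, which is where the interactivity comes from.
std::vector<std::uint32_t> composite(const std::vector<Sample>& lightField,
                                     const std::vector<Sample>& realTime) {
    std::vector<std::uint32_t> out(lightField.size());
    for (std::size_t i = 0; i < lightField.size(); ++i) {
        // Depth test: the nearer sample occludes the farther one, so a
        // real-time object can pass in front of or behind captured scenery.
        out[i] = (realTime[i].depth < lightField[i].depth)
                     ? realTime[i].rgba
                     : lightField[i].rgba;
    }
    return out;
}
```

Because the real-time layer is regenerated every frame, anything rendered into it (a thrown ball, a reactive character) correctly occludes or is occluded by the captured scene.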

Lytro wants to eliminate the need to choose between the quality of light-fields and the interactivity of real-time rendering by letting developers use the two interchangeably in a single project. In a recent meeting with Lytro, I got to see this in action: the company pulled its Hallelujah light-field into Unity as a point cloud and modified the look of the scene using controls directly inside of Unity.
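
Lytro hasn’t published the API behind these controls, but as a rough sketch of what a ‘look’ adjustment over an imported point cloud might boil down to (hypothetical types; in practice this would run in a shader driven by editor sliders), consider a simple exposure-and-tint pass:

```cpp
#include <vector>

// Hypothetical point-cloud sample produced by a light-field import.
struct Point {
    float x, y, z;  // position
    float r, g, b;  // linear color
};

// A simple in-engine "look" adjustment: exposure plus a color tint.
// In a real integration this math would live in a shader, but the
// operation is the same.
void grade(std::vector<Point>& cloud, float exposure,
           float tintR, float tintG, float tintB) {
    for (Point& p : cloud) {
        p.r *= exposure * tintR;
        p.g *= exposure * tintG;
        p.b *= exposure * tintB;
    }
}
```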

Beyond just playing with the color and lighting of the light-field scene, they showed how real-time elements could interact directly with the scene by throwing a bunch of beach balls around and adding real-time fog. They also showed simple but practical uses of working with a light-field in a game engine, like being able to easily composite text directly into the environment, mask out portions of the light-field scene, edit the scene’s playback, and combine multiple light-field scenes together.
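
The beach ball demo illustrates the division of labor: the light-field supplies the backdrop while the game engine owns the physics. Here’s a toy sketch of that idea, entirely hypothetical and with the captured geometry reduced to a flat floor for brevity:

```cpp
#include <cstdio>

// Hypothetical: a real-time object bouncing against geometry recovered
// from the light-field capture, here reduced to a flat floor at floorY.
struct Ball {
    float y;   // height above the floor (meters)
    float vy;  // vertical velocity (meters/second)
};

void step(Ball& b, float floorY, float dt) {
    const float gravity = -9.8f;
    b.vy += gravity * dt;
    b.y  += b.vy * dt;
    if (b.y < floorY) {       // hit the captured floor
        b.y  = floorY;
        b.vy = -b.vy * 0.8f;  // bounce, losing some energy
    }
}

int main() {
    Ball ball{2.0f, 0.0f};
    // Simulate one second at 90 Hz, a typical VR frame rate.
    for (int i = 0; i < 90; ++i) step(ball, 0.0f, 1.0f / 90.0f);
    std::printf("height after 1s: %.2f m\n", ball.y);
}
```

In a real integration the collision geometry would presumably be derived from the light-field’s depth data rather than a hardcoded plane.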

While this is certainly a boon for VR storytellers already used to building experiences in game engines, these new game engine integrations seem sure to pique the curiosity of VR game developers too, who could find novel ways to combine light-fields and real-time rendering to create experiences with both ultra-high fidelity and immersive interactivity.

Lytro is still only making its tools available to those collaborating with the company directly, but it encourages creators with interesting project ideas to get in touch.


Ben is the world's most senior professional analyst solely dedicated to the XR industry, having founded Road to VR in 2011—a year before the Oculus Kickstarter sparked a resurgence that led to the modern XR landscape. He has authored more than 3,000 articles chronicling the evolution of the XR industry over more than a decade. With that unique perspective, Ben has been consistently recognized as one of the most influential voices in XR, giving keynotes and joining panel and podcast discussions at key industry events. He is a self-described "journalist and analyst, not evangelist."
  • VRgameDevGirl

    This. Looks. Amazing.

  • psuedonymous

    Composite real-time 3DCG characters into a pre-rendered lightfield ‘background’. FFVII/VIII/IX for VR!

    • Jistuce

      Also Resident Evil! Resident Evil 2 too!

  • Lucidfeuer

So, as for otoy ORBX Lightfield, where are the actual demos and tools? What is their plan, and what are the actual (hardware or software) limitations that prevent these from being usable tools? That’s pretty much the only valid excuse for something to be advertised as ready when it isn’t yet.