Lytro, a leading light-field company, is positioning its light-field capture and playback technology as the ideal format for immersive content. The company is building a toolset for capturing, rendering, and intermingling both synthetic and live-action light-field experiences, which can then be delivered at the highest playback quality each individual platform supports.

Speaking with Lytro CEO Jason Rosenthal at the company’s Silicon Valley office, I got the rundown of how the company aims to deploy its tech toolset to create what he calls the Lytro Reality Experience: immersive experiences stored as light-fields and then delivered to match the quality and performance capabilities of each consumption end-point—all the way from the highest-end VR headset at a VR arcade down to mobile 360 footage viewed on a smartphone.

Light-fields are pre-computed scenes that can recreate the view from any point within the captured or rendered volume. In short, that means a light-field can play back scenes which exceed the graphical capabilities of real-time rendering while still retaining immersive 6DOF positional tracking and (to an extent) interactivity. Though not without their own challenges, light-fields aim to combine the best of real-time immersion with the visual quality of pre-rendered VR experiences.
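To make that idea concrete, here’s a minimal sketch of the classic two-plane light-field representation from the research literature. This is illustrative only, not Lytro’s actual format: the point is that playback reduces to looking up pre-stored rays rather than re-rendering the scene.

```python
# Illustrative only: the classic two-plane light-field parameterization,
# where each ray is indexed by its intersections with two parallel planes,
# (u, v) and (s, t). This is NOT Lytro's format.
import numpy as np

U, V, S, T = 16, 16, 64, 64               # camera-plane and focal-plane samples
radiance = np.random.rand(U, V, S, T, 3)  # stand-in RGB data for every stored ray

def sample_ray(u, v, s, t):
    """Nearest-neighbor lookup of one ray (real systems interpolate in 4D)."""
    ui, vi = int(round(u * (U - 1))), int(round(v * (V - 1)))
    si, ti = int(round(s * (S - 1))), int(round(t * (T - 1)))
    return radiance[ui, vi, si, ti]

# A view from a new eye position inside the captured volume is assembled by
# gathering the stored rays that (approximately) pass through the eye.
pixel = sample_ray(0.5, 0.5, 0.25, 0.75)
print(pixel)
```

Because every view is a lookup over a dense set of pre-computed rays, image quality is decoupled from rendering horsepower, but the dataset itself gets very large, which is the file-size challenge discussed further below.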

Rosenthal made the point that revolutions in media require new content formats with new capabilities. He pointed to the PDF, OpenGL, HTTP, and MPEG as examples of media formats which have drastically altered the way we make and consume information. Immersive media, Rosenthal says, requires a volumetric format.

To that end, Lytro has been building a complete pipeline for light-fields, including capture/rendering of light-field content, mastering, delivery, and playback. He says that the benefit of this approach is that creators can capture/render and master their content once, and then distribute to headsets and platforms of varying capabilities without having to recapture, recreate, or remaster the content for each platform, as presently needs to be done for most real-time content spanning desktop and mobile VR headsets.


There are three main pieces of Lytro’s toolset that make it all possible. First is the company’s light-field camera, Immerge, which enables high-quality live-action light-field capture; we recently detailed its latest advancements here. Then there’s the company’s Volume Tracer software, which renders synthetic light-fields from CG content. And finally there’s the company’s playback software, which aims to enable the highest-fidelity playback on each device.

Image courtesy Lytro

For example, a creator could build a high-fidelity CGI scene like One Morning—a Lytro Reality Experience which the company recently revealed—with their favorite industry-standard rendering and animation tools, and then output it as a Lytro Reality Experience which can be deployed across high-end headsets, low-end headsets, and even 360 video, without needing to modify the source content for the specific capabilities of each device, and without giving up the graphical quality of raytraced, pre-rendered content.

Lytro is keeping its tools close to its chest for now; the company is working one-on-one with select customers to release more Lytro Reality Experiences, and encourages anyone interested to get in touch.

An example of an incredibly detailed Lytro Reality Experience. The company says that high fidelity light-field scenes like this will be able to seamlessly merge with real-time interactive content. | Image courtesy Lytro

I’ve seen a number of the company’s latest Lytro Reality Experiences (like Hallelujah) played back across the spectrum of devices, from high-end desktops suitable only for out-of-home VR arcades, all the way down to 360 playback on an iPad. The idea is to maximize the fidelity and experience to the greatest degree that each device can support. On the high-end desktop, as seen through a VR headset, that means maximum-quality imagery generated on the fly from the light-field dataset with 6DOF tracking. For less capable computers or mobile headsets, the same scene would be represented as baked-down 3D geometry, while mobile devices would get a high-quality 360 video rendered at up to 10K resolution—all using the same pre-rendered source assets.
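That per-device decision could be sketched roughly like this; the device names, capability flags, and tier labels below are hypothetical stand-ins, not Lytro’s playback code.

```python
# A hypothetical sketch of per-device tiering for a single light-field
# master, following the tiers described above. Not Lytro's actual logic.
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    positional_tracking: bool  # 6DOF capable?
    gpu_class: str             # "high", "medium", or "low"

def playback_mode(d: Device) -> str:
    """Pick the richest representation a device can handle,
    all derived from the same pre-rendered source assets."""
    if d.positional_tracking and d.gpu_class == "high":
        return "on-the-fly imagery from the light-field dataset (6DOF)"
    if d.positional_tracking:
        return "baked-down 3D geometry (6DOF, reduced fidelity)"
    return "360 video rendered at up to 10K"

for device in (Device("VR arcade desktop", True, "high"),
               Device("mobile headset", True, "low"),
               Device("iPad", False, "low")):
    print(f"{device.name}: {playback_mode(device)}")
```

The key point is that all three branches consume the same mastered asset; only the delivery representation changes.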

– – — – –

The appeal of the light-field approach is certainly clear, especially for creators seeking to make narrative experiences that go above and beyond what can be rendered in real-time, even with top-of-the-line hardware.

Since light-fields are pre-rendered, however, they can’t be interactive in the same way that traditional real-time rendering can be. Rosenthal acknowledges the limitation and says that Lytro will soon debut integrations with leading game engines which will make it easy to mix and match light-field and real-time content in a single experience—a capability which opens up some very interesting possibilities for the future of VR content.


For all of the interesting potential of light-fields, one persistent hurdle has hampered their adoption: file sizes. Even small light-field scenes can constitute huge amounts of data, so much so that it becomes challenging to deliver experiences to customers without resorting to massive static file downloads. Lytro is well aware of the challenge and has been aggressively working on solutions to reduce data sizes. While Rosenthal says the company is still working on reining in light-field file sizes, the company provided Road to VR with a fresh look at their current data envelopes for each consumption end-point:

  • 0.5TB/minute for 6DoF in-venue
  • 2.7GB/minute for in-home desktop
  • 2.5GB/minute for tablet/mobile devices
  • 9.8MB/minute for 360 omnistereo

The above is all based on the company’s Hallelujah experience, as optimized for each consumption end-point. Think these numbers are scary? They were much higher not that long ago. Lytro has also teased “interesting work” still in development which it claims will reduce the above figures by some 75%.
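For a sense of scale, here’s the quick arithmetic on those figures, assuming decimal units and applying the teased ~75% cut:

```python
# Per-minute data envelopes quoted above, in GB (decimal units assumed),
# and what a ~75% reduction would leave if Lytro's teased work pans out.
rates_gb_per_min = {
    "6DoF in-venue":   500.0,     # 0.5 TB/min
    "in-home desktop":   2.7,
    "tablet/mobile":     2.5,
    "360 omnistereo":    0.0098,  # 9.8 MB/min
}

for tier, gb in rates_gb_per_min.items():
    reduced = gb * 0.25  # 75% smaller
    print(f"{tier:>16}: {gb:9.4f} GB/min today -> ~{reduced:9.4f} GB/min")
```

Even at a quarter of today’s sizes, a five-minute in-venue experience would still weigh roughly 625 GB, which is why the continued compression work matters so much.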

Despite Lytro’s vision and growing toolset, and an ongoing effort to battle file sizes into submission, there’s still no publicly available demo of the technology that can be seen through your own headset. Fortunately, the company expects the first public Lytro Reality Experience from one of its partners to launch in Q1 2018.



  • Surykaty

    That’s a Lytro lightfield? Why does it look completely rendered (I thought they were mostly into live capture)? Anyway, neither Lytro nor Otoy really wants to release their light-field tools to devs or users right now.

    I hoped Otoy would at least release a utility that would create a light-field from an image sequence captured by the tripod rig they showed, the one with a spinning arm and 2 cameras on it, when they explained the technicalities behind the rig. I’ve got everything needed to build such a rig and would love to create my own real-life captured light-fields. I guess the porn industry would love that too.

    • benz145

      Lytro makes a camera to capture live action light-fields (called Immerge) and also makes a piece of software to render light-fields from pre-rendered CG content (called Volume Tracer).

      As for Otoy’s capture of live-action light-fields with the spinning cameras; they seem to be moving toward optimizing their live-action light-field tech for 60 cameras—they announced some work involving the Facebook Surround60 cameras earlier this year:

      https://www.roadtovr.com/facebook-unveils-two-new-volumetric-video-surround360-cameras-coming-later-year/

      • Surykaty

        Lytro: Actually they seem to make 2 types of cameras, Cinema and Immerge. It’s funny, I remember the Immerge was supposed to be an interesting segmented sphere a few inches larger than a basketball.. now it’s a monstrous-looking hexagon…. Lytro never seems able to deliver (the blunder with their artifact-ridden cameras).. failure seems to be connected to them. Poor investors…

        Otoy stated some time ago that they would try to commercialize the spinning camera rig ($10,000 or so). Looks like there was a hitch in the plan somewhere.

        • chuan_l

          Those earlier “Immerge” renders —
          Do not correspond to an actual physical camera; none exist in that form. Rather, they were put out there to generate interest from filmmakers and investors. They have always used a planar array for capture, nothing else.

        • Ian Shook

          I think Lytro simply changed their design – probably to get higher resolution and a larger 6DOF ‘view sphere’ or whatever we’re going to call it. Personally I can’t wait for their products and software to be released.

          • Nate Vander Plas

            And that’s like saying Apple just “changed the design” of the iPhone. It now looks like this XD
            https://uploads.disquscdn.com/images/f65c7f7b203fa797c5f360775a5437076f042d9c5317010e979f7149c6942f95.jpg

          • Ian Shook

            That phone was dope and you know it.

          • Surykaty

            Oh just simply changed the design….

            Wow you’re so easy to please. I bet you’d be okay if your wife went to the theater with her male colleague for the 20th time and said “oh c’mon Ian.. he’s just a friend”.

            Lytro overpromises and absurdly underdelivers. What I find absolutely crazy is that the Immerge is not 360-degree capture anymore.. and for positional tracking they obviously have to rely on creating a depth map from the 2D data.. seeing how inaccurate those depth maps are, you can’t even simulate proper DOF blurring – so many artifacts get in the way (Project Tango got dumped, and this was one of the reasons.. the inaccuracy of 3D depth sensing).

            The fact that they’re showing a rendered light-field is a clear sign that Lytro failed and they’re very desperate.. the Immerge and Cinema are not going anywhere.. it was flawed technology right from the beginning.

      • Nate Vander Plas

        Actually there is no 60-camera Facebook rig. It is the Surround360, not the “Surround60,” and there are two options: a 24-camera (x24) and a 6-camera (x6) rig. I believe only the x24 has 6DOF capability. These are probably pretty inferior to light-fields, since the footage is basically projected onto 3D geometry via a depth map generated by analyzing pixels between overlapping cameras. Would still be a big improvement over regular 360 video.

    • chuan_l

      Yeah, not sure what’s going on there —
      Those incidental rays on the armchair, television, and other shiny surfaces seem to remain static and “baked in” even as the viewer moves position? Needs a better example to show the benefits of light fields, methinks.

  • mike

    awesome job looks amazing, keep up the excellent work

  • Ian Shook

    I desperately want to try their Volume Tracer – I just don’t understand how it’s going to work with various existing software and render engines. I’d imagine you place your Lytro box object in the scene, but I’m not sure how it would talk to existing cameras, or if it has its own. And if it has its own camera – how does it work with your render engine of choice?


  • I want to try one of their experiences

  • August

    Here’s a deeper discussion of the Lytro rig and its capabilities with Lytro’s Exec. Director of Content Partnerships, Steven Swanson…

    https://www.youtube.com/watch?v=k9F79sA9Rww&t=1s