A lightfield scene rendered with OTOY’s tools; it retains all 3D data and can be moved about in real time, even at pre-rendered CGI quality

Jules Urbach, CEO of OTOY, believes lightfields are the endgame of virtual reality. Essentially a way to digitally record all light reflected from a real or virtual object, lightfields seemingly enable feats of technological magic, from viewing near-photoreal scenes from any angle within the capture volume to dynamically relighting actors in real time. OTOY believes that lightfields solve many of VR’s rendering challenges and has been investing in them heavily. Road to VR guest reporter Nate Kozak spoke with Urbach about the company’s new lightfield streaming.

SEE ALSO
Exclusive: OTOY Shows Us Live-captured Light Fields in the HTC Vive

Of course, lightfields bring with them a host of challenges. Chief among them is simply file sizes; gathering and storing detailed lighting information about even one captured actor or object involves enough data that more complex scenes quickly become unrealistic to deliver across most residential connections. A few gigabytes for a movie is manageable; a few terabytes would not be. Had holographic data storage ever filtered down to the consumer level, we might be buying lightfields from catalogs, complete with Ziploc packaging, but today data consumption is increasingly across wireless links, a step in the wrong direction for a technology that loves to suck up storage space.
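Some back-of-the-envelope arithmetic makes the delivery problem concrete. The figures below are illustrative assumptions (a 5 GB movie, a 2 TB raw lightfield scene, a 50 Mb/s residential link), not numbers from OTOY:

```python
# Illustrative arithmetic (assumed round numbers): why raw lightfields
# don't fit residential connections the way movies do.
def download_hours(size_gb: float, link_mbps: float) -> float:
    """Hours to transfer size_gb gigabytes over a link of link_mbps megabits/s."""
    bits = size_gb * 8e9              # gigabytes -> bits
    seconds = bits / (link_mbps * 1e6)
    return seconds / 3600

movie = download_hours(5, 50)          # a compressed feature film
lightfield = download_hours(2000, 50)  # a hypothetical ~2 TB raw lightfield

print(f"5 GB movie at 50 Mb/s:      {movie:.1f} h")   # minutes, not hours
print(f"2 TB lightfield at 50 Mb/s: {lightfield:.1f} h")  # several days
```

At these assumed sizes the movie arrives in about 13 minutes while the raw lightfield would take the better part of four days, which is why aggressive compression is a precondition for the format.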

Luckily, OTOY has been working hard on this problem, and says they’ve finally cracked it. Sophisticated compression algorithms manage to tame lightfields, and clever tricks “sort of like foveated rendering” allow OTOY’s lightfield player to only request sections it predicts the viewer might look at next, the company says. The proof of concept culminates, stunningly, in a transmission rate of 1.5Mb/s for a single actor lightfield, far less than watching a Netflix stream, and easily streamable to a smartphone.
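The “sort of like foveated rendering” trick described above can be sketched in miniature. This is a hypothetical illustration, not OTOY’s actual player: the tile size, gaze model, and function names are all assumptions. The idea is simply that the client fetches only the angular tile the viewer is looking at plus the tile their head motion predicts they will look at next, instead of the whole capture volume:

```python
# Hypothetical sketch (not OTOY's player): a lightfield streamer that,
# like foveated rendering, prefetches only the angular tiles the
# viewer's gaze is predicted to reach next.
from dataclasses import dataclass

@dataclass
class Gaze:
    yaw: float        # degrees
    pitch: float      # degrees
    yaw_vel: float    # degrees/second
    pitch_vel: float  # degrees/second

TILE_DEG = 15  # assumed angular size of one streamed lightfield tile

def tile_of(yaw: float, pitch: float) -> tuple:
    """Map a view direction to a tile index on the capture sphere."""
    return (int(yaw % 360) // TILE_DEG, int((pitch + 90) % 180) // TILE_DEG)

def tiles_to_request(g: Gaze, lookahead_s: float = 0.25) -> set:
    """Current tile plus the tile that gaze velocity predicts we'll reach."""
    now = tile_of(g.yaw, g.pitch)
    predicted = tile_of(g.yaw + g.yaw_vel * lookahead_s,
                        g.pitch + g.pitch_vel * lookahead_s)
    return {now, predicted}

# Viewer looking ahead and turning right: only two tiles are requested,
# a tiny fraction of the full sphere of lightfield data.
print(tiles_to_request(Gaze(yaw=10, pitch=0, yaw_vel=80, pitch_vel=0)))
```

Because only a couple of tiles are in flight at any moment, the required bitrate scales with where the viewer looks rather than with the full size of the captured scene, which is how a single-actor lightfield can plausibly fit in 1.5Mb/s.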

[gfycat data_id="GrossHarshFossa" data_autoplay=true data_controls=false]

The demo itself was not as flashy as the worlds of opportunity it could represent. Running as AR on a smartphone, with a fiducial marker for tracking, it showed a human actor dressed up as the Green Goblin (from Spider-Man). At any instant, the image of his psychotically grinning face was as clear as a high-quality real-life photograph; glistening teeth and hints of translucence made it clear this was no mere 3D asset.

Even at distances threatening to clip into the capture volume, every detail was retained, the definition of photorealistic. Photos, unfortunately, don’t allow their viewers to reframe the image and look at the back of a head in a portrait, but lightfields do. Passers-by surely assumed this was just someone waving an expertly crafted 3D model around, because many have not grasped what makes lightfields different, much less what they could mean for the future of immersive media.

SEE ALSO
First Look: 'Batman' Gear VR Experience Featuring OTOY's Incredible 18k Cubemap Video Tech

Everyone may start to get an inkling very soon, though: just prior to the start of Connect, OTOY tweeted that lightfield streaming will come to Gear VR in the next ORBX Media Player app update.




  • psuedonymous

    “Photos, unfortunately, don’t allow their viewers to reframe the image and look at the back of a head in a portrait, but lightfields do. ”

    “Enhance 224 to 176. Enhance, stop. Move in, stop. Pull out, track right, stop. Center in, pull back. Stop. Track 45 right. Stop. Center and stop. Enhance 34 to 36. Pan right and pull back. Stop. Enhance 34 to 46. Pull back. Wait a minute, go right, stop. Enhance 57 to 19. Track 45 left. Stop. Enhance 15 to 23. Give me a hard copy right there.”

    • Steve P

      Ha ha Blade Runner has arrived!!

  • OgreTactics

    Oh yeah, that vaporware from those Otoy losers who patent-troll and buy companies only for nobody to be able to use it.

    • bobbysaysboo

      care to elaborate on that?

      • OgreTactics

        First of all, their licensing system sucks and is obsolete, and they only maintain their position because they are the leaders in rendering for professionals now, but none of the huge upcoming mass of 3D amateurs, students, converted agency or freelance professionals can buy a licence at that ludicrous price.

        Their student subscription system is insulting, like money hungry leeches they only give access to the v1.2 when Octane is at version 3 and of course gives access to none of the plugins, and unlike about every other engine/renderer they don’t even have a fucking subscription plan in 2017.

        Hence, their “lightfield”, ORBX, Virtual bullshit technology is used by NO-ONE because nobody actually has access to it, and while Brigade has been around for years and was about to be released until Otoy bought it, nobody has seen the color of it.

        • Jules Urbach

          Subs are already available for $20/month: 2x GPUs + the plug-in of your choice. We haven’t put this out on the shop page yet, just soft launched this week. Email help@otoy.com if you want to sign up now. The subs will roll over to V4 when it comes out. Brigade is going into Octane 4.

          • OgreTactics

            At last, great news, thanks for responding. I can’t stand companies with great techs NOT making them usable to actual users. I’m glad to hear there’s a first subscription plan coming up (hopefully a subscribe-to-own scheme). But what about Light-field and ORBX? Who uses them (and by that I don’t mean 3 partners but the wealth of users), and how to access and use these? I’ve been trying to get my hands on it because obviously it’s the only way to have pre-rendered VR scenes with head-tracking beside volumetric videos.

          • Jules Urbach

            Any stereo cube map rendered in Octane VR (which we made free last year for a few months) or regular Octane for the Oculus 360 photos content, can be converted into 1 m cube volume LF on ORC. The URL gives you a SCM png/jpg/exr poster frame, then when the LF loads on the server, it can be streamed back to ORBX Media Player on Gear VR (and soon other devices) with pos tracking. This is similar to how we live stream the viewports in the Oculus Social rooms that just launched (Disney, ABC, Marvel, ESPN etc.)

          • OgreTactics

            I understand. In fact since Render the Metaverse I’m almost ONLY interested in stereocube maps, which some people dub as “screenshots” of the future. I didn’t know they could be converted to LF then streamed back to the ORBX player, which last time I checked couldn’t access any data or even the base SCM that were available in the first version. I don’t understand how you can do pos/headtracking on the GearVR either, but that should work on Oculus Rift. Anyway do you have any tentative date for the release of the Octane 4 subscription plan? And if I may ask a bonus question: while viewports open the possibility to superimpose real-time 3D or animated video components on a SCM, wouldn’t having a viewfinder stereo video buffer (that only buffers and plays the part of the video map the user is looking at rather than the whole thing like most 360° video players do today, thus limiting resolution and size) be a better short-term solution than lightfields in ORBX?