Google’s Project Starline is a Light-field Display System for Immersive Video Calls

This week Google revealed Project Starline, a booth-sized experimental system for immersive video chatting, purportedly using a bevy of sensors, a light-field display, spatial audio, and novel compression to make the whole experience possible over the web.

This week during Google I/O, the company revealed an experimental immersive video chatting system it calls Project Starline. Functionally, it’s a large booth with a big screen which displays another person on the other end of the line at life-sized scale and volumetrically.

Image courtesy Google

The idea is to make the tech seamless enough that it really just looks like you’re seeing someone else sitting a few feet away from you. Though you might imagine the project was inspired by the pandemic, the company says the project has been “years in the making.”

Google isn’t talking much about the tech that makes it all work (the phrase “custom built hardware” has been thrown around), but we can infer what a system like this would require:

  • An immersive display, speakers, and microphone
  • Depth & RGB sensors capable of capturing roughly 180° of the subject
  • Algorithms to fuse the data from multiple sensors into a real-time 3D model of the subject
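To illustrate the kind of fusion step the third bullet implies, here is a minimal sketch in Python. It assumes calibrated pinhole cameras with known intrinsics and camera-to-world extrinsics; Google hasn't disclosed its actual pipeline, so none of this reflects how Starline really works.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth map (meters) into camera-space 3D points
    using a pinhole model: x = (u - cx) * z / fx, y = (v - cy) * z / fy."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    pts = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]          # drop pixels with no depth reading

def fuse_sensors(depth_maps, intrinsics, extrinsics):
    """Merge per-sensor point clouds into one world-space cloud.
    extrinsics: 4x4 camera-to-world transforms, one per sensor."""
    clouds = []
    for depth, (fx, fy, cx, cy), T in zip(depth_maps, intrinsics, extrinsics):
        pts = depth_to_points(depth, fx, fy, cx, cy)
        pts_h = np.hstack([pts, np.ones((len(pts), 1))])   # homogeneous coords
        clouds.append((pts_h @ T.T)[:, :3])                # transform to world
    return np.vstack(clouds)
```

A production system would additionally reject overlapping points, mesh or splat the cloud, and attach RGB texture; this shows only the geometric merge.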

Google also says that novel data compression and streaming algorithms are an essential part of the system. The company claims that the raw data is “gigabits per second,” and that the compression cuts that down by a factor of 100. According to a preview of Project Starline by Wired, the networking is built atop WebRTC, a popular open-source project for adding real-time communication components to web applications.
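As a rough sanity check of those figures, a back-of-envelope calculation shows why a 100× reduction matters. The 4 Gbps raw figure below is a hypothetical placeholder, not a published spec:

```python
# Rough sanity check of the numbers Google quoted: raw capture "gigabits per
# second", compressed by a factor of 100. The 4 Gbps raw figure is a
# hypothetical placeholder, not a published spec.
raw_gbps = 4.0
compression_factor = 100
compressed_mbps = raw_gbps * 1000 / compression_factor
print(f"{compressed_mbps} Mbps")   # 40.0 Mbps, within reach of office broadband
```

In other words, a stream that would saturate even a fast local network becomes something an ordinary enterprise connection can carry, which is presumably why WebRTC-style transport is viable at all.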

As for the display, Google claims it has built a “breakthrough light-field display” for Project Starline. Indeed, from the footage provided, it’s a remarkably high resolution recreation; it isn’t perfect (you can see artifacts here and there), but it’s definitely impressive, especially for real-time.

Granted, it isn’t yet clear exactly how the display works, or whether it fits the genuine definition of a light-field display (which can support both vergence and accommodation), or if Google means something else, like a 3D display showing volumetric content based on eye-tracking input. Hopefully we’ll get more info eventually.

One hint about how the display works comes from the Wired preview of Project Starline, in which reporter Lauren Goode notes that, “[…] some of the surreality faded each time I shifted in my seat. Move to the side just a few inches and the illusion of volume disappears. Suddenly you’re looking at a 2D version of your video chat partner again […].” This suggests the display has a relatively small eye-box (meaning the view is only correct if your eyes are inside a specific area), which is likely a result of the particular display tech being employed. One guess is that the tech is similar to Looking Glass displays, but with Google trading eye-box size for resolution.
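The eye-box idea can be sketched as a simple geometric test: is the viewer's tracked eye position inside the region where the 3D illusion holds? The box dimensions below are illustrative guesses consistent with the "few inches" described in the Wired preview, not measured values.

```python
import numpy as np

def in_eye_box(eye_pos, box_center, box_half_extents):
    """True if the tracked eye midpoint lies inside the axis-aligned
    eye-box, i.e. the region where the 3D illusion holds up."""
    delta = np.abs(np.asarray(eye_pos, dtype=float) - box_center)
    return bool(np.all(delta <= box_half_extents))

# Illustrative numbers only: a box ~10 cm wide laterally, centered 0.9 m
# from the display.
center = np.array([0.0, 0.0, 0.9])
half_extents = np.array([0.05, 0.05, 0.20])

print(in_eye_box([0.00, 0.0, 0.9], center, half_extents))  # True: seated centrally
print(in_eye_box([0.08, 0.0, 0.9], center, half_extents))  # False: ~3 in to the side
```

A real system would track both eyes continuously and either re-render or fall back to 2D as the viewer leaves the box, which matches the behavior Goode describes.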

Image courtesy Google

From the info released so far, Google indicates Project Starline is still early and far from productization. But the company plans to continue experimenting with the system and says it will pilot the tech with select large enterprises later this year.

This article may contain affiliate links. If you click an affiliate link and buy a product we may receive a small commission which helps support the publication. See here for more information.


Ben is the world's most senior professional analyst solely dedicated to the XR industry, having founded Road to VR in 2011—a year before the Oculus Kickstarter sparked a resurgence that led to the modern XR landscape. He has authored more than 3,000 articles chronicling the evolution of the XR industry over more than a decade. With that unique perspective, Ben has been consistently recognized as one of the most influential voices in XR, giving keynotes and joining panel and podcast discussions at key industry events. He is a self-described "journalist and analyst, not evangelist."
  • Google has been working on Starline for 5 years; X Development (Google skunkworks) always has some new immersive technology brewing.

    Initially it’s being rolled out to corporate clients in selected offices, “We’re planning trial deployments with enterprise partners later this year.”

  • I bet this will never take hold and Google will abandon it

    • Lucidfeuer

      That’s an oxymoron, you could have just said “Google”. But more seriously this is just tech money laundering, this is not a real project, or research, let alone product.

  • Nicholas

    How can this company still have money?

    • Lucidfeuer

      precisely with these kinds of fake projects, to maintain artificial stock valuation and evade taxes.

  • brandon9271

    i think it’s some sort of head/eye tracking on the viewing end and a camera array doing a type of real time photogrammetry or a glorified Xbox Kinect to capture depth.. I’m honestly not impressed.. yet.

    • Michael Balzer

      Pretty sure it is a RGBD capturing device at 60 fps. They may use three units to provide 270 degrees and a fusion process to integrate the three streams with an off the shelf capture process. In fact Jasper Brekel has been doing this for two to three years, and I have had the opportunity to beta test it with three Kinect v. 2 for motion capture. Of course the newer Azure Kinect would be an even better choice.

      The real trick is to create the illusion of depth without glasses, which leads me to believe this is an offshoot of another project that uses head/eye tracking cameras to adjust the image scene based on the observer’s head vector. I also believe a lenticular lens is used to provide the stereo parallax effect needed to give the display depth. As mentioned, all of this has been shown (CES) or is even available for purchase (Brekel). In fact there was a Chinese company that had created a tuned lenticular screen for the iPhone 7/8 & 10 screens as well as two Samsung screens. Once calibrated, the effect was quite convincing for watching stereo-paired images and video, even 180s & 360s, without glasses, albeit on a tiny screen. This company also makes larger lenses for monitors, but nothing this size. This is where Google’s deep pockets could have created a very large lens, or they just paired with one of the major manufacturers who was showing it off at CES in the heyday of “3D” TV. That tells me this system is at least $20K, if not closer to $50K.

      • Michael Balzer

        This may also account for the “sweet spot” since these systems have limited range of lateral head movement. It may also have limited forward head movement as well.

  • Rupert Jung

    If they were only capable of correcting the gaze offset between the camera and the eyes of the conversation partner with AI – that would already be fantastic with normal webcams.
