Qualcomm today announced new Wi-Fi 6E wireless chips designed for mobile devices. The company says the chips support “VR-class latency” for streaming VR over Wi-Fi.

Qualcomm’s new FastConnect 6900 and 6700 chips are designed to bring the latest Wi-Fi 6E standard to Snapdragon-based mobile devices like smartphones and standalone VR headsets. Wi-Fi 6E is an extension of Wi-Fi 6 (802.11ax) that can also tap into the 6GHz band, opening up more channels and bandwidth.
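For a rough sense of what that extra spectrum buys (assuming the full 1,200MHz of new 6GHz spectrum that US regulators opened for unlicensed use, and ignoring guard-band and channelization details), the channel counts work out roughly like this:

```python
# Rough channel math for the new 6GHz band (5.925–7.125GHz in the US),
# which adds about 1,200MHz of spectrum on top of the 2.4GHz and 5GHz bands.
# Naive division; real 802.11ax channelization yields slightly fewer channels.
NEW_SPECTRUM_MHZ = 1200

for width_mhz in (20, 40, 80, 160):
    channels = NEW_SPECTRUM_MHZ // width_mhz
    print(f"{width_mhz}MHz-wide channels: ~{channels}")
```

The wide 80MHz and 160MHz channels are what make multi-gigabit links like the FastConnect 6900’s quoted 3.6 Gbps plausible.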

Image courtesy Qualcomm

With speeds up to 3.6 Gbps and “VR-class low latency” of “less than 3ms,” Qualcomm says its Wi-Fi 6E chips “provide a strong foundation for this rapidly growing industry segment.”

Of course, that 3ms latency only accounts for the latency that Wi-Fi transmission adds to the equation, so that’s 3ms on top of whatever else your VR rendering pipeline looks like. But, all things being equal, an extra 3ms should be manageable.
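To put that in context, here is a hypothetical end-to-end latency budget for wirelessly streamed VR; every figure below is an illustrative assumption for the sake of the arithmetic, not a number from Qualcomm:

```python
# Hypothetical end-to-end latency budget for wirelessly streamed VR.
# All stage timings are illustrative assumptions, not measured numbers.
budget_ms = {
    "game render (~90 FPS frame time)": 11.0,
    "video encode":                     10.0,
    "Wi-Fi transmission":                3.0,  # Qualcomm's "VR-class" figure
    "video decode":                      8.0,
    "display scanout":                   5.0,
}

total_ms = sum(budget_ms.values())
for stage, ms in budget_ms.items():
    print(f"{stage}: {ms:.1f} ms")
print(f"total: {total_ms:.1f} ms")  # transmission is only one slice of the pie
```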


Qualcomm’s new Wi-Fi 6E chips could open two possibilities for VR streaming.

The most practical in the near term is local streaming from your PC to your headset across your own local network. The other possibility is remote streaming, where your VR content is rendered in the cloud and delivered over the internet to your router, and then to your headset. In this case there would be more latency on top due to the cloud-to-home transmission.


While it will be some time before the company’s FastConnect 6900 and 6700 chips make it into devices, it’s possible that they could end up in future standalone VR headsets—or smartphones powering VR viewers—which could allow users to stream VR to the headset locally or remotely.


In the near term, the Snapdragon-ready Wi-Fi 6E chips seem like they would be most applicable to future standalone headsets, like a potential Quest 2. Quest already has official (wired) streaming from PC to Quest via Oculus Link, and the company has said that it would eventually like to make Oculus Link wireless.

Further down the road, if remote VR streaming becomes viable, standalone headsets could evolve into VR headset ‘thin clients’ which would be entirely reliant on cloud-rendered VR content. Such headsets could save on cost, weight, size, and battery life by using only the hardware necessary for receiving and decoding a VR stream that’s been rendered remotely.




Ben is the world's most senior professional analyst solely dedicated to the XR industry, having founded Road to VR in 2011—a year before the Oculus Kickstarter sparked a resurgence that led to the modern XR landscape. He has authored more than 3,000 articles chronicling the evolution of the XR industry over more than a decade. With that unique perspective, Ben has been consistently recognized as one of the most influential voices in XR, giving keynotes and joining panel and podcast discussions at key industry events. He is a self-described "journalist and analyst, not evangelist."
  • Adrian Meredith

    When you consider Oculus Link only runs at 150Mb/s, there’s a ton of room for improvement. However, network transport latency is probably the fastest part of the whole pipeline; it’s the encoding/decoding that really needs to improve. If Quest 2 could somehow get fast AV1 or H.265 decode then it might not matter. Even better would be some new proprietary codec that could run on the GPU, as current codecs aren’t optimised for either VR or latency.

    • kontis

      Exactly.

    • mfx

      Definitely not AV1 or H.265; they have huge latency, which is good enough for Skype but that’s it.

      The future will probably be a proprietary codec with a stable constant bitrate and low compression/latency (as long as the bitrate is far under the Wi-Fi 6E specs, why use a complex codec like AV1?). A dedicated codec means they could split the image signal into packages and tie this to eye tracking to boost the data only in the eye’s focus area. So sending huge-definition images wouldn’t require huge bandwidth like it would with a classic video codec.
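The gaze-weighted bitrate idea described in the comment above can be sketched roughly as follows; the tile grid, weighting curve, and bitrate budget are all hypothetical choices for illustration:

```python
# Hypothetical sketch of gaze-weighted bitrate allocation: split the frame
# into a grid of tiles and spend more of a fixed bitrate budget on tiles
# near the gaze point reported by eye tracking.

def allocate_bitrate(grid=4, gaze=(0.5, 0.5), budget_mbps=100.0):
    weights = []
    for row in range(grid):
        for col in range(grid):
            # Tile center in normalized [0, 1] frame coordinates.
            cx, cy = (col + 0.5) / grid, (row + 0.5) / grid
            dist = ((cx - gaze[0]) ** 2 + (cy - gaze[1]) ** 2) ** 0.5
            # Closer to the gaze point -> larger weight (curve is arbitrary).
            weights.append(1.0 / (0.1 + dist))
    total = sum(weights)
    # Scale weights so the per-tile bitrates sum to the overall budget.
    return [budget_mbps * w / total for w in weights]

rates = allocate_bitrate()
print(f"max tile: {max(rates):.1f} Mbit/s, min tile: {min(rates):.1f} Mbit/s")
```

The point of the sketch: total bandwidth stays fixed while quality is redistributed toward where the user is actually looking.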

      • Navhkrin

        You shouldn’t treat AV1 as a singular algorithm with a set bitrate/quality. AV1 comes in many forms; it has a streaming-optimized version that is very fast to decode but needs higher bandwidth. (In fact, all codecs have such options that allow you to tweak time-to-encode/quality/bitrate settings.)

        Better codecs will almost always translate to improvements as long as dedicated hardware is present.

        A dedicated codec means that they could split the image signal with packages and tie this with eye tracking to boost the data only on the eye focus area.
        You don’t need a dedicated codec for it at all. You can easily split the image signal and encode it in chunks.

        https://youtu.be/o5sJX6VA34o

  • ShaneMcGrath

    When did 6E come out? WTF?
    My Wi-Fi 6 router isn’t even a year old; somehow I missed the news on 6E.

    • dk

      You know how routers have 2.4GHz, or 2.4GHz plus 5GHz dual band? 6E is still Wi-Fi 6, but combined with 6GHz. Earlier this year, for the first time in something like 30 years, new frequencies were allowed to be used.

      • ShaneMcGrath

        Interesting, I never read anything about it on the tech news sites I go to. Seemed to have missed the news on it on that particular day.

    • Andrew Jakobs

      There are already Wifi 6 routers? I missed out on that, LOL..

      • dk

        Yeah, well, they’re not that common for now.

  • Ad

    I really don’t care about standalone and how that never delivers on its promises, but could this be used for adapters in the vein of the Vive Wireless? How much can we minimize the need for decoding transferred data? I also think by the time VR streaming is viable, both budget PCs and mobile chips will be capable enough that it won’t be that necessary.

    • asdf

      the quest delivers….

    • kontis

      Thanks to the laws of thermodynamics, the performance delta between a stationary, bulky machine and a sleek, tiny chip on your head or in your pocket will ALWAYS be significant.

      So no matter how many breakthroughs in mobile space happen, a PC on your desk or a server rack will be able to do 20x more impressive stuff and people are always hungry for more.

      This means wireless transfer of data will be crucial even if some giant improvements happen in computation.

      • Ad

        “The fact that some of the most popular PC VR titles don’t need PC anymore and the experience is almost identical”

        What planet is this from? The Quest only gets basic ports of old games, or PC gets quest ports that really show how hollow quest gameplay can be.

        I was being generous with the mobile chip comment, basically that most people will want basic software in 5 years. I don’t think mobile competes at all or will for anything serious.

        • Jeff Axline

          What he meant was developers may target the low end and not put in the effort to make the PC version drastically better visually. Re-read his whole post.

          • Ad

            That would be pretty bad considering the quest compromises everything, not just visuals. It’s really annoying that people think that quest games are just PC games with lighting and shadows removed. AI, physics, everything has to go.

          • My PCVR rig, overclocked 8086K on ROG Z390 and 2080Ti, using with Valve Index.

            Index has demonstrated a market exists for high-end PCVR, no different to enthusiast markets for audio hi-fi separates, home cinema, DSLR, etc.

            As long as headsets and software are available…

            https://uploads.disquscdn.com/images/1161a7d7b4dff3cf9140cfe2109e1741833288be2cef9ef7ca0c633718c0fe77.jpg

    • Sven Viking

      A larger, heavier system with high cooling capacity connected to mains power is always going to be able to be faster than a wearable designed to be as light as possible and draw tiny amounts of battery power, so I think for the foreseeable future there’ll always be a high-end market wanting faster processing from a non-wearable system at times (whether or not that processing is cloud-based).

    • Andrew Jakobs

      This is exactly what makes it possible for wireless headsets to also be able to use PC hardware. As I understand it, this even works pretty well with the Quest at the moment through the 3rd-party Virtual Desktop app. I just hope Oculus implements the Wi-Fi streaming themselves in the OS so it might perform even better than the 3rd-party option.

      • valkolton

        I was shocked I was able to play Half Life Alyx flawlessly with Virtual Desktop!!!

        Almost as good as my Vive Pro + Wireless. I can only imagine if this was baked into the OS with Wi-Fi 6E, and I wonder when 5G can do this cloud rendering.

      • Ad

        I mean it doesn’t. It measurably does not. Latency is much higher than any PC headset and the data pipeline is really constrained. That’s not acceptable, and if it somehow is, then why the heck did people keep demanding better headsets for PC if they were just going to use something so cut down and limited instead? Also, this chip affects data transfer, which isn’t the problem with the Quest; read the featured comment.

  • kontis

    Of course, that 3ms latency only accounts for the amount of latency that Wi-Fi transmission adds to the equation—so that’s 3ms on top of whatever else your VR rendering pipeline looks like. But, all things equal, 3ms extra should be manageable.

    It’s not the VR rendering pipeline that is the problem here (because that is true also for wired), it’s the video compression pipeline that can take dozens of milliseconds, no matter how fast your wifi is. As long as the compression method is unchanged it also doesn’t matter much that you go from 8ms 150 mbit to 1ms 5 gbit – the latency will be almost identical (40ms+), what you will improve is bitrate, so the image will be sharper and with better color range.
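The arithmetic behind this point is easy to check. Assuming (hypothetically) a 72 FPS stream encoded at 150 Mbit/s, raising the link speed shrinks only the per-frame “on air” time, which is already a small part of a ~40ms compression pipeline:

```python
# Back-of-the-envelope transmission time per frame at different link speeds.
# A 72 FPS stream at 150 Mbit/s carries roughly this many bits per frame:
FRAME_BITS = 150e6 / 72  # ~2.08 Mbit per frame (illustrative assumption)

tx_ms = {}
for label, mbps in (("150 Mbit/s", 150), ("3.6 Gbit/s", 3600)):
    tx_ms[label] = FRAME_BITS / (mbps * 1e6) * 1000
    print(f"{label} link: ~{tx_ms[label]:.2f} ms per frame on air")

# The faster link saves ~13 ms of transmission time, but a ~30 ms
# encode+decode pipeline still dominates total motion-to-photon latency.
```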

    On the other hand large bandwidth at least creates opportunity for development of different compression methods, but without built-in encoder/decoder ASIC computational cost may also be a problem…

    60GHz, like TPCAST or WiGig or Wi-Fi 802.11ay, solves this problem with HDMI-like bandwidth and almost raw video.

    What Qualcomm actually offers is practically making currently known low-cost streaming more reliable and with less visual degradation. It won’t achieve tether-like latency.

    The other possibility is remote streaming, where your VR content is rendered in the cloud and delivered over the internet to your router, and then to your headset. In this case there would be more latency on top due to the cloud-to-home transmission.

    Nvidia experimented with light reprojection only method. Game rendered normally (natively), but the computationally heavy lighting (dynamic global illumination, like Lumen in Unreal Engine 5) would be streamed, which means the actual experience has zero additional latency. Also works for smartphones.

    Further down the road, if remote VR streaming becomes viable, standalone headsets could evolve into VR headset ‘thin clients’ which would be entirely reliant on cloud-rendered VR content.

    If that becomes the common standard it will be a computational dystopia. We won’t own anything.
    XR has the potential to be the final platform, not just a mobile gadget like smartphones and tablets or a gaming console. The closed-platform duopoly we have in smartphones is nothing compared to the consequences of a fully controlled XR future. Currently PCs are workstations and high-performance machines. With reliance on the cloud this may change.

    Facebook will be asking for a $199 monthly virtual plastic surgery subscription (officially called “avatar customization,” obviously) and people will be borrowing money to pay for it… In an open computational platform everyone can be pretty for free (or you pay an artist for their work if you want to be something else); in cloud-based locked platforms you have to pay the corporation for it. Why do you think FB invests so much in hyper-realistic avatars and 4D scanning? They literally had a tongue-physics-simulation job opening. How do you think they can make billions on something like tongue-lips physical collisions?

    Choose your future.

    • brandon9271

      “Nvidia experimented with light reprojection only method. Game rendered normally (natively), but the computationally heavy lighting (dynamic global illumination, like Lumen in Unreal Engine 5) would be streamed, which means the actual experience has zero additional latency”

      Do you remember what that tech was called? It sounds really interesting, but I couldn’t find anything about it.

    • Kevin White

      Very interesting post especially toward the end on the dangers of cloud-everything in a dominant closed system. I have similar fears. I think “owning nothing” is where we’re headed in most things, and it might be preferred in some areas, but it also means “controlling nothing” and potentially being at the mercy of the tech gods.

      I’m pretty sure there was a book made into a movie about one company owning the entire digital metaverse (and streaming to ‘thin client’ headsets).

      • Ad

        I legitimately don’t know why that book is so popular with oculus people. Isn’t Mark practically identical to the bad guy in that?

    • benz145

      Thanks for this insightful comment.

      When I said “on top of whatever else your VR rendering pipeline looks like,” I meant (but I suppose failed to convey) the end-to-end pipeline, which in a streaming solution would include encoding and decoding. I just wanted to make the point that Qualcomm’s “3ms” is just the transmission part, so people wouldn’t think they’d be streaming VR content with 3ms of total latency.

    • valkolton

      Do you mean Tongue simulation or stimulation physics?!! Hahahah

    • GUEST

      I have the perfect name for it: Virtual Reality Minimal Latency (VRML), that is, history repeating itself over and over. Idiots!

    • Ad

      We need to choose our future and kick Facebook out. People need to refuse to work with them, refuse to republish their marketing, refuse to integrate with their systems, we need to actually give a damn about ethics. Look at Free Basics, that’s the future they want for everyone.

    • Mradr

    I brought this up in the forums as well. An always-connected device will still be shied away from by most people, while devices that offer a middle ground, like the mobile Quest, will be able to offer a multi-connection device with a wider range of value and control. While this sounds amazing on paper, it will take time for 5G and Wi-Fi 6 to make it out onto the market, thus a small percentage will move to wireless, but the majority will still most likely be on a cable or a short throw from another device local to the user. Even still, there are many walls games face on streaming services even today that will be even worse for VR. Thus, these devices will have very limited streaming ability. For example, 4K still comes at the cost of compression/artifacts, higher cost, and higher bandwidth. Resolution for VR increases very quickly compared to flat-screen gaming as well, with 2 render views.

    • DevilBlackDeath

    And that’s exactly why I buy my music on Bandcamp. Cause at least I do own something. Sure, it’s literally digital bits, but I do own them in that I can download them and reproduce them in any way, shape, or form I wish and put them on completely disconnected devices! Though to be fair, cloud-only or streaming-only music solutions are not nearly as bad as what a completely closed cloud-only VR and PC future would be :S

  • Sven Viking

    Wow, I was just talking about something like this exactly.

  • Foreign Devil

    Great! Hope these chips didn’t get released too late for Quest 2.

  • PJ

    HUGE news