
NVIDIA’s Greg Jones: The 5G hype is justified, it’s going to be a huge advantage for XR

I’ve had the enormous pleasure of having a chat with Gregory Jones, who works as Senior Manager of Global Business Development and XR at NVIDIA. He’s the man managing the CloudXR SDK, the solution that lets you have cloud rendering for virtual and augmented reality, so he has great insight into how and when we can have an XR headset that offloads everything to the cloud.

I spoke with him about AR, VR, 5G (on which he’s very confident), cloud, and all the technologies we love the most. We’ve also talked about the latencies that we can expect today from cloud rendering. It’s been a very informative talk, and as always, I provide it to you both as a video and as a textual transcript (with time tags so that you can listen to Greg’s answers to only the questions that you are most interested in).

Have a nice reading/watching!

Please, first of all, introduce yourself: What do you do in your daily life at NVIDIA? (0:35)

Yeah, so, I’m Greg Jones and my overarching role is global manager for business development for XR, and then, within that role, I’m the CloudXR product manager. I’ve been with NVIDIA for a little over two and a half years, and I came out of a research group at the University of Utah, where I managed a 200-person research group that focused on high-performance computing, simulation, and most importantly visualization of high-end scientific data.

It was that focus on visualization… we’ve been trying for 20 years… we’ve been looking at how people interact with their data, how you can start having a human-data interaction instead of a human-computer interaction, and that just really marries well with the idea of XR, so that’s what brought me to NVIDIA.

I know you work on the CloudXR solution. Can you explain better what it is? (1:40)

Cloud XR is an early product: we launched this as an early access program back in October and we’ve just released our 1.0 version a few weeks ago. CloudXR is our first library to help stream XR from remote servers and really, even though CloudXR is a brand-new product… if you think about our work with streaming… the GeForce Now team has been working on streaming games for you know 5 or 6 years now… and we’ve taken a lot of that expertise and embedded it in the cloud XR SDK. So it’s a new product with a long legacy of expertise in streaming graphics across the internet, the networks and such.

Does this solution also include the server part, a preconfigured server configuration? (2:55)

The SDK is literally just the SDK, and what it does, what the exciting part of it is, is that it’s an OpenVR interface, so any OpenVR application that you have will automatically work with CloudXR without any alteration. And with just a little bit of alteration, any OpenVR application can also stream AR, and what I mean by a little alteration is that you just need to be able to expose an alpha channel.

So it’s a really nice SDK: if you have a server and you’re running an OpenVR application, we create a SteamVR driver that the OpenVR application just sees as a normal SteamVR driver. The application thinks it’s handing a frame to an HMD, instead we take that frame and our server driver encodes that frame and transports it.

Then, on the device, on the client side, we have another instantiation of our derivative of SteamVR that accepts that frame and then basically hands it to the HMD. So it’s a really straightforward library, and the application providers, the ISVs, don’t have to do anything to their applications to use CloudXR.
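To make the split Greg describes a bit more concrete, here is a toy sketch of my own (none of these names come from the real CloudXR SDK): the app hands a frame to what it believes is a SteamVR headset driver, the server-side driver encodes and transports it, and the client-side driver decodes it and presents it to the real HMD. I use `zlib` as a stand-in for the hardware video encoder and a deque as a stand-in for the network.

```python
from collections import deque
import zlib

# Toy transport standing in for the LAN/5G link between server and client.
network = deque()

def server_driver_submit(frame_rgba: bytes) -> None:
    """What the app sees as 'handing a frame to an HMD': the driver
    intercepts the frame, encodes it, and ships it across the network."""
    network.append(zlib.compress(frame_rgba))

def client_driver_poll() -> bytes:
    """Client-side driver: receive the packet, decode it, and return the
    frame ready to be presented on the physical headset."""
    return zlib.decompress(network.popleft())

frame = bytes(range(256)) * 4          # fake 1 KB frame
server_driver_submit(frame)
assert client_driver_poll() == frame   # the app never knew it was streamed
```

The point of the sketch is the last line: the application’s rendering path is unchanged, which is why unmodified OpenVR applications work with this kind of streaming.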

Many people are skeptical about cloud XR streaming. What are the performances that we can have now with the CloudXR SDK? (4:38)

I use it on my home network: I have basically a workstation and I’ve streamed to an Oculus Quest or a Vive Focus Plus, or a Vive Pro and I use the Windows client for the Vive Pro and of course the Android clients for the Oculus and the Focus Plus, and it’s just like being in native XR.

Me playing Half-Life: Alyx with the Vive Focus Plus via Wi-Fi streaming. Playing it tether-free was an amazing experience

The keys to having a good experience, of course, are frame rate and latency, and we manage the frame rate within the CloudXR package to match that of the end device we’re streaming to. The latency… that is the big question, right? There’s a combination of things that create that latency: one is the render time, obviously, and then when you’re going to stream, you have to encode and decode and go across the network, so you add a bunch of latency.

Everything to reduce that latency is important: having a really fast network and also a relatively robust network. The way we manage for instance jitter on our CloudXR is: we buffer against the jitter, but creating a buffer induces latency, so if it’s a really noisy network you’re gonna increase the latency. And if you have a really slow network, so your ping time is a lot of milliseconds, that may be too much latency.

Part of it is getting a system with low latency, and that requires a pretty fast network and a pretty noise-free network. Then the real question is how to manage all that latency, because you don’t have a lot of milliseconds before you induce VR sickness, that kind of stimulus-system confusion that creates the motion sickness. The key is using a combination of technologies like late latch or asynchronous timewarp, those types of strategies that manage that latency or hide that latency. That’s really the key of CloudXR, managing for you all the factors: your compression, your encode-decode, your jitter buffer, all those things together… managing them kind of holistically or globally, to give a nice smooth frame rate on the device.
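The jitter-versus-latency trade-off Greg keeps coming back to can be put in numbers with a back-of-the-envelope model (mine, not CloudXR’s actual buffering logic): a buffer deep enough to absorb N milliseconds of jitter must hold whole frames, and every buffered frame adds one frame-time of latency.

```python
# Toy model (illustrative assumptions, not CloudXR internals): buffering
# absorbs network jitter, but each buffered frame adds one frame-time of
# latency, so a noisier network forces a deeper buffer and a laggier stream.

def jitter_buffer_latency_ms(jitter_ms: float, frame_rate_hz: float) -> float:
    """Latency added by a buffer just deep enough to absorb the given jitter."""
    frame_time_ms = 1000.0 / frame_rate_hz
    frames_needed = max(1, -(-jitter_ms // frame_time_ms))  # ceiling division
    return frames_needed * frame_time_ms

quiet = jitter_buffer_latency_ms(jitter_ms=2, frame_rate_hz=72)   # 1 frame, ~14 ms
noisy = jitter_buffer_latency_ms(jitter_ms=30, frame_rate_hz=72)  # 3 frames, ~42 ms
assert noisy > quiet  # noisier network -> deeper buffer -> more latency
```

This is why Greg says a crowded Wi-Fi 5 floor with lots of interference doesn’t just drop frames, it actively increases end-to-end latency.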

We think that using our streaming technology, plus the way we’ve built CloudXR using our SteamVR driver and SteamVR client, we’ve managed to balance all those competing factors into a really nice streaming solution, and it works better on lower-latency networks and 5G.

If you think of 5G, people often think of just the edge on 5G, so that’s the edge, and the compute will be, you know, less than 10 milliseconds away, and that makes for a really low-latency network. But actually 5G will also have better management of the streams to the users, it will be a really big deal for this… but also 5G will have an overall better network.

So not just the edge network will be better, but the overall network will be better and have lower latency. Right now, we build a frame on the server and then we transport that frame, but eventually, you will have different compute going on with XR workloads… like, if I’m doing AR, a lot of what I’m going to do is contextualize the data that I’m seeing in front of me, and I’m going to have some kind of inference happening somewhere in the network, contextualizing and putting more information about what I’m seeing into my view. That information doesn’t necessarily need to have a latency of twenty milliseconds, it could maybe be a hundred milliseconds, so that can happen somewhere else in that network. What’s really cool about the whole 5G thing is that all these networks, and how they’re orchestrated, will give us different compute at different latencies that will fit different parts of the XR environment we’re creating.

CloudXR right now is just giving a great XR experience by managing latency, managing bandwidth, and all those things will be useful to create a global solution […] The 5G hype is justified, it’s gonna be a huge advantage for XR.

There are always lots of debates about 5G, if it is useful or not. Personally, I think it’s great, but how do you envision 5G being useful for XR? (10:10)

If I’m doing CloudXR over Wi-Fi, for instance, I do it in my home, and I make sure there are not too many devices on and make sure my router is close by… and all that makes an environment where I have a pretty low-jitter network. My latencies are short and CloudXR streams VR in a great way, so that’s a great solution.

If I’m in an environment where there are more users, like, let’s say… someone wants to run a manufacturing floor with a hundred people wearing AR devices connected through Wi-Fi 5, let’s say. I’m gonna have a lot of interference in that network, I’m gonna have a lot of jitter. And the way you’re going to have to manage that jitter is with buffering, and that’s going to take latency, so you’re going to induce more latency in the signal because you have a network with a lot of interference.

Now if you think of Wi-Fi 6, and you think of 5G, both with their new beamforming, their multiplexing-type technologies… you’re gonna see better signal management to each user, and that management, plus the low latency, plus the bandwidth, will be what makes 5G really useful for XR. So yeah, I think it’s going to make a big difference on how those signals are managed from the cell, to the person, to the device.

5G will make all our lives more connected
Some people say that since Wi-Fi 6 is coming, we don’t need 5G. How do you answer this? (12:08)

I think both are going to be important and we’ll start seeing use cases, and it’s going to come down to empirical data and what works best. I think both will be important: part of the magic of 5G is the whole telco network that will be behind the 5G signal, it will be a really enabling network but yeah… I can see really nice use cases where you get 5G to your homes so you have access to that whole network and then in the home you’re running Wi-Fi 6… and that’s where you’ll be streaming XR from your home device to your tetherless headset. I think the combination of Wi-Fi 6 and 5G is going to be extremely important.

You can also think of cable companies that are already delivering high-bandwidth and low-latency networks to the homes, and that hook to Wi-Fi 6 will be important. Welcome all the different frequencies to the party. Just being able to use the entire network is the promise of 5G, with an in-home and on-prem fast Wi-Fi 6 connection letting that work in specific locations. That combination will be really astounding.

[…]

The demo we ran at Mobile World Congress in October… we had a server in the Verizon data center in LA, and then we streamed a McLaren Senna rendered in Autodesk VRED to a 5G cellphone, and did a nice AR demo. Jensen actually showed this in his keynote at Mobile World Congress. That demo worked flawlessly, and I think we had a network latency of five milliseconds, and that was from an LA datacenter down to the floor of a convention center, a big noisy convention center, on a cellphone and on a production 5G network, and that demo was gorgeous. So that’s an indicator of what’s coming with 5G, and I’m really excited about it.

Returning to the CloudXR SDK, what is the overall latency that you obtain in your lab settings? (14:48)

If you think about it, a lot of it depends on the render time. So let’s say we have a 20-millisecond render time, and you’re gonna add anywhere from 3 to 7 milliseconds for encode, same thing for decode, so at maximum you’re up to, let’s say, 34 milliseconds. If the network adds 10, you’re up to 44 milliseconds, and then if the network has some jitter, you have to add some buffering time. So let’s say you have 60 milliseconds of latency and, you know, if I have a latency of 60 milliseconds and I hand that frame to the headset, the headset asks me for a warp of the frame to match its current position… and if I warp that frame on the client-side GPU, then I’ve hidden that 60 milliseconds of latency and I have essentially a 3-millisecond image in front of my eyes.

So let’s say 60 milliseconds is where we sit with a relatively heavy render, and that was with 10 milliseconds of ping time on your network and a really quiet network, so I’m not gonna add any jitter, so take 10 milliseconds. If you have a 20-millisecond network, then you’re gonna have 70 milliseconds of latency; if you have a 30-millisecond network, you’re gonna have 80 milliseconds of latency… and then in that space, once you start getting high enough, it really depends on the app.
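Greg’s budget can be written down as simple arithmetic. The per-stage numbers below are his illustrative figures from this answer, not measured CloudXR values; I’m assuming the worst-case 7 ms for both encode and decode and about 16 ms of jitter buffering to land on his quoted ~60 ms total.

```python
# Motion-to-photon budget from Greg's walkthrough. Stage values are his
# ballpark figures; the 16 ms jitter-buffer term is my assumption chosen
# so the components sum to his quoted ~60 ms total.

def motion_to_photon_ms(render=20, encode=7, decode=7,
                        network_rtt=10, jitter_buffer=16):
    return render + encode + decode + network_rtt + jitter_buffer

assert motion_to_photon_ms() == 60                # heavy render, quiet 10 ms network
assert motion_to_photon_ms(network_rtt=20) == 70  # 20 ms network
assert motion_to_photon_ms(network_rtt=30) == 80  # 30 ms network
# Client-side reprojection (late latch / asynchronous timewarp) then hides
# most of this, leaving an effectively ~3 ms-old image in front of the eyes.
```

Note that only the network term scales with connection quality here, which matches Greg’s point that each extra 10 ms of ping adds 10 ms straight onto the total.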

You can imagine that if you’re playing a game with a lot of hand motion and I’m using late latch or asynchronous timewarp to hide my latency for the visuals, my hands are gonna start lagging, so a professional gamer is going to feel a lag, but a gamer like myself or an industrial-use-case person won’t even notice the difference between the visuals and the hands. So you get into cases of “should we start rendering the hands on the device so there’s no lag in that?” and that type of stuff, but for the question you were asking, 50-60 milliseconds seems like the minimum latency we have in the whole system right now if you have a pretty significant render in your application.

Since I’m an XR developer and consultant myself, I want to ask you: how can we try this CloudXR SDK and what is the price? (17:39)

The CloudXR SDK is free; it’s an SDK and, like most of our SDKs, it is free of charge. We’ve got a gated release right now, so what you would do is go to our website, our dev zone, search “CloudXR” and that should get you to the application page. Put in an application, shoot me an email, and I’ll see your application.

If your email explains what you’re doing, then I can consider it for the early gated release, but in a few months we’ll open it up more. It’s available now at no charge. You just have to know that your server-side GPU, where the driver sits, needs to be a Pascal or later-generation graphics card, and as for clients right now, we support a Windows client for the Vive Pro and Android clients for the Oculus Quest and the Vive Focus Plus. The client-side SDK that comes with the Vive Focus Plus contains some open source: it comes with source code that shows how we built that APK, so if people have devices they want to build an APK for, there’s a template right there. They can use that source code to make their own APKs for their own devices. Any HMD manufacturer that wants to write an APK for this is welcome, and we welcome working with them.

As a VR user (an NVIDIA user), I would also like to ask you about the future plans for NVIDIA graphics cards, like the teased RTX 30 series… can you tease us something? (20:08)

I don’t know the secrets behind the new graphics cards, they don’t tell me what is coming, if I say something it would probably be wrong.

Is it possible that in the future NVIDIA targets directly standalone headsets (like Qualcomm) or is it something outside your scope? (21:02)

A few weeks ago we announced our collaboration with Qualcomm and Ericsson to build products into the Boundless XR solution that Qualcomm is doing. So I think that for the foreseeable future, what I see is really partnering with innovators like Qualcomm on the client side, and really optimizing the CloudXR solution to interact with those SoC solutions that Qualcomm is doing. The XR2 is a quite exciting chip, so I think powering the ecosystem through it is probably our focus for quite some time.

Talking about the future, there are lots of critics because playing games in streaming has never taken off. NVIDIA has invested a lot in it… why do you think it has never succeeded, and why do you think it is going to succeed now, also with XR? (22:08)
Services like NVIDIA GeForce Now and Google Stadia are surely the future, but they haven’t taken off yet (Image by NVIDIA)

First of all, it’s really hard, it’s a really hard problem, because networks are complex, the topologies of networks are complex, so how to get signals across an entire network, where you have so many heterogeneous networks out there, is a difficult problem. That’s why GeForce Now has worked at it for so many years, but I think with the evolving 5G networks, the evolving networks in general, and the streaming solutions that the GeForce Now team has come up with… I think gaming is there. CloudXR is a first-step, early-product SDK to address the XR market… the thought of working with Viveport and other groups like that, which can embed CloudXR in their platforms and make it available for them to stream XR from their platforms, is really the direction we’re taking.

We think that’s an exciting place to play, so as the technology gets better, and GeForce Now is already a great experience, you’re gonna see the gaming market open up and we will be successful in that space because, you know, as our CEO Jensen says, everyone’s a gamer. A lot of the gamers will really enjoy GeForce Now and streaming XR and such. They just haven’t had that taste of gaming yet, and as soon as they do, they’ll switch on. So yeah, I see nothing but growth and success in the market from here on… and GeForce Now in my mind has led the way for that, and certainly their technology is leading the way for what we do in streaming CloudXR and again, we want to embed that in other platforms.

Everyone is dreaming about an AR headset that works by only being connected to the cloud. Playing the prediction game, when do you envision brain-less XR headsets? (24:50)

I saw that Ericsson publication that said 5G would reach roughly 50% of the US market in 2023, and I think it will be a little faster than that; and then of course the pandemic that there is now is going to affect the growth rate… but the rollout of 5G will probably take, you know, five to ten years.

That’ll be really instrumental in how fast this all comes about. The evolution we see: new HMDs from each of the HMD manufacturers come probably once every three years, so we’re probably two or three generations away from the lightweight, fashionable headsets we want to wear. We’ve got some graphics technology we still have to create to get lightfields and holograms and such into those headsets, so I think in 10 years we’re going to have a really nice solution that is really ubiquitous, and in five years we’re going to have some great early solutions that a lot of people will use, especially in the enterprise space.

People will be using them to collaborate, to see their models before prototyping… so we’ll see the enterprise lead out: people will find use cases that are really valuable, and that’s going to happen inside the next five years, and then another five years for that to go to the consumer market in a robust way.

That will just dovetail with the HMD manufacturers, the device manufacturers, and the rollout of the 5G network… so that’s my prediction. I’m a very conservative guy.

I’ll probably retire in ten years… so I want to make sure… I think that it is 10 years just so I can see it :)

Timeline for the headset of the future by Qualcomm. They’re in line with the predictions by NVIDIA (Image by Qualcomm)
Your job is working with enterprises on XR solutions. From your experience, how can XR and 5G help companies and enterprises the most? (27:29)

One of the places companies see XR really helping… there are so many good answers to that… let’s say that I’m building a hospital, and one of the things that happens now is we build an operating room, the doctors and nurses go into the operating room and they say “this doesn’t work for us, we need all these different changes”, and all those different changes take months, they’re really expensive because you have to remodel real rooms… So one of the use cases that’s really compelling is taking a big warehouse where you build your environments; then the doctors and nurses come in, they work in the rooms in VR, they come up with all those modifications, and before the hospital is ever built, you know what you have to build to work for those doctors and nurses… there are those types of examples.

There’s also the example of… let’s say we’re building a new Tesla, and my Chinese engineers want to make some design changes specific to China, but they want to work with the California engineers to make that happen. So you go into XR, you bring up the Tesla model, and the engineers don’t have to fly from China to the US or from the US to China: they can just go into XR, work around that model, and make changes.

That collaboration piece is a huge benefit, but then probably the most important place is taking that same operating room (we talked about before) and if I put a virtual patient in there, and I put the doctors and nurses in there, and I let them train in that operating room, then when it’s finally time that operating room is built and it’s time to go in, I already know how to work in that operating room. The training piece of XR in environments, where you want to already be trained before you step into the environment… maybe it’s a dangerous environment and you want to make sure you don’t make mistakes…  that training is probably the most valuable use case there is out there.

But collaboration is not far behind, and then testing stuff before it’s ever built is right up there also. I’d say training is right now the leading use case, and it is really important.

What do you love the most about AR and VR? (30:37)

For most of my adult career, I and the research teams I’ve worked with have been trying to figure out how not to use a keyboard and how not to use a screen, but to immerse ourselves in our data entirely, all the way in.

Let’s take a healthcare case: I want to look at the electrical fields around the head and where those electrical fields are coming from, and I really want to understand the anatomy of the brain and how it generates those electrical fields, so I can help that patient with strategies to avoid seizures. Being able to jump into the data, go all the way in, and not look at something on a small screen or try to click on my keyboard to go in, and actually tear that data apart with my hands. That interaction with your data, whether it’s a car model, a hospital room, or a training environment, the ability to immerse completely in the data, the digital model, is what excites me about VR and XR. That’s been my career, trying to figure out that human-data interface and how to make it so you don’t see stuff around you, you just work with your data. You see that in examples like Minority Report… and all those applications about how to get through data quickly… that’s the dream, and it’s what I love about it.

If someone wants to do a job like yours, what advice would you give to them? (32:28)

It’s a great question… my career has been completely accidental, but my training is in math and physics and biomedical science and so that training really just represents a really broad technical and scientific training. I’ve of course had to code along the way just because every discipline requires you to code, so I would say good maths skills, good science background, good fine arts background… I’m a musician… stuff like that.

Just go where your passions lead and you’ll end up in a job like this. I’ve always taken jobs that are entertaining and fun to do, and that’s been the requirement: that they entertain me… and this is part of the leadership of XR at NVIDIA.

What are your plans for the next year? (33:41)

Really to bring streaming XR to the world, to see how soon we can do that, what pieces of the ecosystem we have to push on to get XR at the edge sooner… so everything at NVIDIA is about… we see the future, I say 10 years, so “Can we somehow help the ecosystem create platforms at NVIDIA that will reduce that 10 years to four years?”

Everything we think about is how do we get to that future that looks so interesting, how do we get there sooner, and how do we bring it to more people… and so for me that’s streaming XR… and my biggest goal is to democratize streaming XR for everybody, trying to see if I can get it there faster than it should be.

Is there anything else that you want to say to my readers? (34:40)

I think we’ve covered it all. I am excited about this idea of wireless headsets that you just slip on and you’re automatically in whatever data environment you want to be in, and our products are part of the path to getting there.

I just really appreciated chatting with you, it’s been fun… so, no big message or super message… just check out CloudXR, and we look forward to working with everybody.

Thanks for your time Greg, and I wish you a great day!

Thanks, appreciated!


It’s been a great talk, and I really want to thank Greg for the time he has dedicated to me, and Kasia for having helped in organizing this interview.

Some key points that I want to highlight:

  • The CloudXR SDK is completely free and cross-platform, and you can access it by applying for it on the NVIDIA website;
  • Enterprises will be the first to take advantage of the mix of XR and 5G;
  • XR cloud rendering is awesome and works like black magic, but it still has a long road ahead: Greg claims a 50-60ms overall system latency, which is still a bit too much. It’s good that warping and reprojection can take the perceived latency down to 3ms, though;
  • 5G is really an enabling technology for XR, because it improves the overall networks and reduces the problems of network saturation and interference;
  • The full rollout of 5G will require 5-10 years;
  • 5G and Wi-Fi 6 are not in competition: it’s their synergy that can boost our possibilities in interacting with the cloud;
  • In 3 generations of XR headsets, we may get to the final design of the lightweight glasses we hope for. This will take at least 10 years.

What do you think about it? Let me know your impressions here in the comments or contacting me on my social media channels! And if you like these interviews, subscribe to my newsletter and donate to me on Patreon!

(Header image by NVIDIA)


Disclaimer: this blog contains advertisement and affiliate links to sustain itself. If you click on an affiliate link, I'll be very happy because I'll earn a small commission on your purchase. You can find my boring full disclosure here.
