
Oculus Quest Passthrough MR Hands-On Impressions

Now that Venice VR Expanded has come to an end, I have finally had the time to experiment with Oculus Quest AR Passthrough. I’ll publish a tutorial about it next week (I know, you can’t wait to make a cube in passthrough AR), but today I want to tell you my opinion about it.

Mild interest from the community

Having worked on passthrough AR two years ago on the Vive Focus Plus, where I created not only some interesting experimental apps but also the open-source plugin that enables AR on that device, I was pretty excited about the news of AR also coming to Oculus Quest 2. I couldn’t wait to get my hands on it, and I expected all the other devs to be pretty excited too. Actually, the reception from the dev community has not been as enthusiastic as I envisioned.

The proof is that if you look for passthrough content on SideQuest, you just find a bunch of demos (like Cactus Cowboy and Crazy Kung Fu), and nothing more. On social media, we had some popular videos from the creators of Cubism and Gravity Lab, and also a cool AR demo that “teaches” you how to play the piano, which was very nice, but… I expected something more. I mean, when hand tracking launched, my social media feed was full of experiments and tests with hand tracking on Quest; there were crazy ideas, like the ones by Daniel Beauchamp, that went incredibly viral; the fantastic experience Hand Physics Lab was released; and so on. The community immediately grasped the potential of hand tracking and embraced it fully. With passthrough, it has not been the same: it seems that the community needs more time to understand its potential, so the process of implementing it will be slower.

One of the great experiments on hand tracking by Daniel Beauchamp

I think that there are some well-known issues with passthrough AR on Quest that are hindering its adoption.

The problems of passthrough AR

Passthrough AR is flagged by Facebook as an “experimental feature“, so it should be clear to every one of us that it is in an alpha stage and is just offered as a preview for developers to start experimenting with. This means we shouldn’t be too harsh in judging it: this is just the first step on a long road towards mixed reality.

Anyway, analyzing its current state, you can find the following problems:

  • The passthrough is blurred and black and white, so it doesn’t offer a believable augmented reality. Virtual elements are colored, and the passthrough is monochrome: the two parts do not blend well together;
  • Its features are incredibly limited: basically, it just offers you passthrough vision of the environment and nothing more. This greatly limits what developers can do with it (more on this later on);
  • Apps with passthrough won’t be approved on the Oculus Store (and, I think, on App Lab either): so if you develop them, you can’t publish them yet. This ban will be lifted in a few months, according to Facebook, but in the meantime, it makes little sense for developers to invest in this kind of experience that they can’t publish;
Cactus Cowboy AR – Passthrough Techdemo was the first AR demo published on SideQuest. Such an app wouldn’t be allowed on the official store yet (Image by Cactus)
  • You can still distribute them for sideloading (e.g. via SideQuest), but this means that users must have a Quest configured for sideloading. Plus, every time they turn on the Quest, users have to remember to run a command-line command (also available through the SideQuest graphical interface) to activate Experimental Mode (and so passthrough) on the device (see the commands after this list). This creates a lot of friction and frustration: I myself forget to activate it a lot of the time;
  • You can’t make videos of your apps via the Quest’s integrated video recording feature, so you have to record via scrcpy or SideQuest on a nearby computer to make a video of your passthrough app and share it on your social media. This creates further friction for those who want to share content on social media and make passthrough go viral.
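
For reference, these are the two commands involved. The setprop one is the documented way to enable Experimental Mode; the scrcpy crop values are the ones commonly circulated for Quest 2, so take them as an assumption that may change with future firmware or software updates:

```sh
# Enable Experimental Mode (and so passthrough); must be re-run after every reboot
adb shell setprop debug.oculus.experimentalEnabled 1

# Record the headset view from a nearby PC with scrcpy
# (1440:1550:0:0 is the crop commonly used to isolate one eye on Quest 2)
scrcpy --crop 1440:1550:0:0 --record my-passthrough-video.mp4
```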

These problems are hurting the usability and virality of passthrough AR. As I said, the feature is still experimental, so this was to be expected.

Passthrough AR hands-on impressions

As a user, I have mixed feelings about the current state of Passthrough AR on Quest.

The first thing I notice when I use it is that it appears blurred, noisy, and black and white. Of course, this can’t compete at all with the proper high-quality color AR passthrough that I’ve tried, for instance, with the Vive Cosmos XR. The colored virtual elements that you add to the scene don’t blend well with the black and white vision, and there’s no AR magic where the virtual elements seem like real ones in your environment. Plus, there are other visual problems: when you move your head fast, the passthrough suffers from motion blur, and if you move your hand in front of you, you can see the surrounding passthrough areas deforming as if they were underwater. This “trippy” effect can be good for some artistic experiences, but it’s pretty weird in normal ones.

It’s weird seeing visuals distorting in front of me

As with many other MR glasses, if a virtual element finds itself inside a real one (e.g. the enemy of your game ends up inside your kitchen table), your brain can’t parse its position well, and when you move your head, you see the virtual element moving in an incoherent way. It is therefore ideal to play passthrough AR on Quest in an empty room, so that you don’t run this risk.

What is impressive, on the positive side, is the stability of the tracking: we all know that Oculus Insight is the best-in-class tracking solution among VR headsets, and in AR this is even more important. If you put a virtual object in a certain position in the real world, even if you move, that object appears fixed in that exact position, as if it were a real object in your real space. It doesn’t move, it doesn’t tremble… and this adds a lot of realism to the AR proposed by the Quest. Big kudos to Facebook for that. If they just improve the cameras (maybe on the Quest Pro?), adding RGB color and more resolution, then with this tracking accuracy the proposed MR can become really believable.

Another positive characteristic is that the passthrough is smooth and has little latency, so it looks believable and doesn’t introduce nausea.

The final big problem is that, unluckily, passthrough vision only works if you have the Guardian activated. For some weird reason, in the OS the class controlling the Guardian is also the one controlling access to the cameras’ stream, so if you disable the Guardian, you can’t have MR anymore. This also means that you can’t properly use the Quest as AR glasses everywhere you go: AR is amazing if it works wherever you are, if you can wear it in the streets while you walk (ok, maybe not with the Quest), while with the Guardian you are limited to a 15m x 15m space at most, or however big you have set your Guardian boundaries (usually a few meters in every direction). You can only use AR in that limited area, and this disappoints me a bit, even if 15m x 15m is enough for most experiments.

Quest 2 vs Vive Focus Plus

Our Mixed Reality game on Vive Focus Plus in 2019

As I’ve said, we worked with mixed reality on a VR headset two years ago, on the Vive Focus Plus. But how do the two solutions compare? Well, since our solution worked at the Unity level (so at a very high level), it couldn’t be optimized: its framerate was not high enough, and it consumed a lot of resources that were then unavailable to the application itself. People sensitive to motion sickness reported a bit of nausea after having used it. Oculus is working at the OS level on a much more powerful chipset, so it easily achieves better performance than we did: the passthrough is smooth and has almost no latency.

But our system had better clarity and fewer distortions: since the Focus Plus had two front cameras, which were more or less in the position of the user’s eyes, the camera streams were already usable for passthrough as-is and had little noise and few distortions. The Quest has 4 cameras in the corners, so the system has to perform many computer vision tricks to show you a video from the point of view of your eyes, and all these operations introduce noise, blurring, and visual distortions. In fact, when I try our HitMotion: Reloaded game on the Focus Plus even today, with its passthrough vision from 2019, I am surprised to see that at first glance it looks better than mixed reality on Quest in 2021. It is as if the visuals were cleaner (even if, as I said, performance was a huge issue with it). Furthermore, the ecosystem on the Focus Plus was much more open, and as developers, we could do many more things with it.

It would be great to have the performance of AR on Quest together with the clarity of vision we had on the Focus Plus. I think that if we want to be serious about AR/MR, we need two front cameras on headsets again (maybe high-resolution RGB ones), in addition to the other 4 used for tracking: this is the only way to have clean and undistorted frames for the users.

Vive Focus and Vive Focus Plus
Vive Focus and Vive Focus Plus had two front cameras. This was not ideal for tracking, but it was good for mixed reality

The limitations for us developers

As a developer, I’ve found the implementation of passthrough on Quest very limiting. At the moment, you can basically only do the following (a code sketch follows the list):

  • Add a passthrough underlay or overlay
  • Highlight its edges in a color that you want
  • Colorize the passthrough by providing a mapping from every gray shade to a color that you want (color mapping from black/white to RGB)
    • In the editor, this is provided through sliders that let you change the brightness, contrast, posterization, and colorization of passthrough
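
To give you an idea, here is a minimal Unity sketch of the features above, using the OVRPassthroughLayer component of the Oculus Integration SDK. Since the feature is experimental, member names may differ between SDK versions, so treat this as an illustration rather than reference code:

```csharp
using UnityEngine;

// Minimal sketch: configuring the experimental passthrough features from code.
// OVRPassthroughLayer ships with the Oculus Integration SDK; the member names
// below reflect the SDK at the time of writing and may change in the future.
public class PassthroughSetup : MonoBehaviour
{
    void Start()
    {
        var layer = gameObject.AddComponent<OVRPassthroughLayer>();

        // 1. Show passthrough as an underlay, i.e. behind your virtual content
        layer.overlayType = OVROverlay.OverlayType.Underlay;

        // 2. Highlight the edges of the real environment in a color you want
        layer.edgeRenderingEnabled = true;
        layer.edgeColor = Color.cyan;

        // 3. Colorize the grayscale stream via contrast/brightness/posterization.
        //    The SDK also exposes a gradient-based gray-to-color mapping, which
        //    I omit here because its exact signature varies between versions.
        layer.SetColorMapControls(0.2f /* contrast */, 0.1f /* brightness */, 0f /* posterize */);
    }
}
```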

And that’s it. There’s no environment reconstruction, no environment understanding (e.g. plane detection, so that you could put virtual elements on a desk, for instance), no meshing of the room around you. You can’t grab a frame and analyze it yourself by running computer vision or AI algorithms on it. You can’t even provide a custom shader for the passthrough. Literally, you can just add the passthrough, and this is really limiting.

These are the only passthrough settings you can change inside Unity

I spent some hours today trying to hack their SDK, but it is impossible. I had this idea: if I can see the passthrough in my app, maybe I can somehow grab its “texture” in the scene (finding the GameObject that renders it and getting its texture), or at least shoot a picture of the app with a secondary camera and use this picture (which contains the passthrough) as I wish, for instance sending it to some AI system for analysis. Well, I’ll spare you the effort: there’s no way of getting the passthrough frame in your app. Literally no way.

Passthrough is not rendered in the Unity scene: it is rendered as a special OVR Overlay. Long story short: it never gets rendered inside your Unity app, it never reaches the eye buffer, so you literally have no way of accessing it through your Unity project. The Unity scene that you create, with all the virtual elements, gets rendered to a frame that has a special alpha layer specifying, for every pixel, the final transparency of the virtual scene in that pixel (e.g. the parts of the scene with virtual elements are opaque, the parts with no elements are fully transparent, and the parts with semitransparent elements have an intermediate value). This frame then gets sent to the compositor (the render engine of the Quest).

The compositor takes the frame of the passthrough directly from the OS (properly projected on a special mesh) and puts it below your Unity frame (technically, it can also be in front, if you specified it as an overlay, but let’s stick to the most common case), as if they were layers in Photoshop, with your virtual frame as a layer on top of your passthrough frame. The resulting image is thus the composition of the two, with the appropriate transparencies. The passthrough doesn’t get rendered in Unity: it is composed directly by the OS with your virtual elements, as if they were two separate images put one on top of the other. Beware that this may also cause some visual glitches in your Unity application with some semitransparent objects.
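
Conceptually, the composition works per pixel more or less like this (my own illustration of the idea, not actual compositor code):

```csharp
using UnityEngine;

static class CompositorMath
{
    // Conceptual per-pixel blend for a passthrough underlay: the alpha of your
    // Unity eye buffer decides how much virtual content covers the passthrough.
    public static Color Compose(Color virtualPixel, Color passthroughPixel)
    {
        float a = virtualPixel.a; // 1 = opaque virtual content, 0 = pure passthrough
        return new Color(
            virtualPixel.r * a + passthroughPixel.r * (1f - a),
            virtualPixel.g * a + passthroughPixel.g * (1f - a),
            virtualPixel.b * a + passthroughPixel.b * (1f - a),
            1f);
    }
}
```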

At runtime, your application generates an additional GameObject with an OVROverlay behavior, with some flags set that tell the OS to provide the textures itself, so you have no way to access or see them

This means that you have no way to get or analyze the frames of the cameras; you can’t even modify them, so you can only use the few features offered by Facebook. This has been done for privacy reasons (so that no developer can maliciously record your surroundings), but of course it greatly limits the applications of AR/MR: with no environment understanding, no plane detection, and no frame analysis, AR just becomes a background for an app and nothing more. This is much less than what you can do with ARKit, ARCore, 8th Wall, and all the other most common AR systems, but also much less than what is possible with Vive SRWorks or with MRTK on HoloLens.

The only super-hacky way I have been able to think of to let you analyze the passthrough frames in your Unity app is the following: you connect the Quest to the PC with a cable and launch your application; then, via Wi-Fi, your Unity app sends a command to a companion app on your PC, which triggers a screenshot of the Quest app via ADB; the PC companion app analyzes that picture and then sends the result back to the Quest. Too hacky, too unstable, too annoying to configure… not worth the effort. But it should work anyway.
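
Just to give an idea, the screenshot step of this hypothetical pipeline could be a single ADB command run by the PC companion app (the file name is, of course, just an example):

```sh
# Capture what the Quest is currently displaying (which includes the composited
# passthrough) and save it on the PC for analysis
adb exec-out screencap -p > quest_frame.png
```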

When we unlocked mixed reality on the Vive Focus Plus, we could modify the frames: for instance, I created the Matrix vision filter you can see below. We could also get the frames and send them to OpenCV, and I used this to make an ArUco marker detector and a QR code reader (I sketch this kind of detection in code below the video). I invite Facebook to open up passthrough a bit more and offer more features to us developers for when it exits the “experimental” phase, so that we can really unleash its full potential.

This video’s resolution is not ideal to see it, but all these green squares that you see are actually letters and numbers flowing like in The Matrix
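
To show what this kind of frame access enables, here is a rough sketch of ArUco detection on a grabbed frame, written with the OpenCvSharp .NET bindings (an assumption made for illustration; not necessarily the exact setup we used back then):

```csharp
using OpenCvSharp;        // OpenCvSharp .NET bindings for OpenCV
using OpenCvSharp.Aruco;

static class MarkerDetector
{
    // Detect ArUco markers in a grabbed camera frame. This is only possible
    // when the platform hands you the raw frames, as the Focus Plus did.
    public static int[] Detect(Mat frame)
    {
        var dictionary = CvAruco.GetPredefinedDictionary(PredefinedDictionaryName.Dict4X4_50);
        CvAruco.DetectMarkers(
            frame, dictionary,
            out Point2f[][] corners, out int[] ids,
            DetectorParameters.Create(), out Point2f[][] rejected);
        return ids; // IDs of the markers found in the frame
    }
}
```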

Creativity is still possible

It is still possible to make interesting things with passthrough AR on Quest, though. The piano demo is cool and shows that if you go through a real-world/virtual-world calibration stage, you can still do amazing things. This is one of the possibilities in which I see the most potential: letting the user perform a calibration setup and then putting virtual elements on top of real ones to interact with them in AR. For instance, if you specify with the controller where a real wall is, you can then throw a virtual ball towards it and see the ball bounce off it (see the sketch below).
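
Here is a hypothetical Unity sketch of how such a calibration could work (names, sizes, and input mappings are my own assumptions, not code from the piano demo):

```csharp
using UnityEngine;

// Hypothetical calibration sketch: the user points the right controller at a
// real wall and pulls the trigger; we spawn an invisible physics slab there,
// so virtual balls thrown afterwards bounce off the "real" wall.
public class WallCalibrator : MonoBehaviour
{
    public Rigidbody ballPrefab;   // a sphere prefab with Rigidbody + Collider
    public Transform rightHand;    // e.g. the RightHandAnchor of the OVRCameraRig

    void Update()
    {
        // Trigger pressed: place an invisible collider where the controller is
        if (OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.RTouch))
        {
            var wall = GameObject.CreatePrimitive(PrimitiveType.Cube);
            wall.transform.SetPositionAndRotation(rightHand.position, rightHand.rotation);
            wall.transform.localScale = new Vector3(3f, 3f, 0.02f); // thin slab
            Destroy(wall.GetComponent<MeshRenderer>()); // keep only the collider
        }

        // "A" button pressed: throw a ball along the controller's forward axis
        if (OVRInput.GetDown(OVRInput.Button.One, OVRInput.Controller.RTouch))
        {
            var ball = Instantiate(ballPrefab, rightHand.position, rightHand.rotation);
            ball.velocity = rightHand.forward * 4f; // it will bounce off the slab
        }
    }
}
```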

Another big potential lies in the colorization and the edge rendering of the passthrough, and I think it has been exploited very little until now (I sketch the idea in code after the video below). I took the Beat Reality program that we at New Technology Walkers made with Enea Le Fons of UXR.Zone on the Focus Plus and made a quick experiment emulating it on the Quest 2, and I noticed that it was very cool on this device too. Beat Reality shows you the colored edges of the environment around you, and they pump following the rhythm of the music you are listening to, changing color depending on the intensity of the music. It is an audio-visual experience like no other, something akin to a portable discotheque for you and your friends… and what I love about it is that it gives life to passthrough. Passthrough is not only a background in this application: it is the most important part of the experience, it is alive.

Watch this short video and you’ll understand what I’m talking about
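
If you want to play with something similar, this is a minimal sketch of the idea, again assuming the OVRPassthroughLayer member names mentioned before; the loudness estimate is a crude RMS of the audio output, not what Beat Reality actually does:

```csharp
using UnityEngine;

// Crude "Beat Reality"-like sketch: pump the passthrough edge color with the
// loudness of whatever audio is playing. The RMS estimate and the color
// mapping are my own simplifications, not the original algorithm.
public class AudioReactiveEdges : MonoBehaviour
{
    public OVRPassthroughLayer passthrough; // assumed member names, see above
    private readonly float[] samples = new float[256];

    void Update()
    {
        // Estimate current loudness (RMS) from the audio output
        AudioListener.GetOutputData(samples, 0);
        float sum = 0f;
        foreach (float s in samples) sum += s * s;
        float rms = Mathf.Sqrt(sum / samples.Length);

        // Edges only, pumping in hue and brightness with the music
        passthrough.edgeRenderingEnabled = true;
        passthrough.edgeColor = Color.HSVToRGB(
            Mathf.Repeat(rms * 4f, 1f),        // hue follows intensity
            1f,
            Mathf.Clamp01(0.3f + rms * 5f));   // brightness pumps with the beat
    }
}
```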

I also had fun removing all the real-world visuals altogether, leaving just the echolocation view, and I played with seeing myself this way inside a mirror…

…and today I plan to make some experiments with hand tracking together with passthrough vision.

And then, of course, there is HitMotion: Reloaded, which was born as a mixed reality experience and… well… I can’t say much…

What I want to say is that, even with its limited features, it is still possible to do original things with passthrough AR on Quest. I invite you all to think outside the box and create something yourself. And of course, tag me (@skarredghost) when you share your creations, because I’m very curious to see what you will develop!


And that’s it with my thoughts on passthrough AR on Quest! Get ready for next week, when there will be my tutorial on how to implement it… and some news about some game…

(Header image by Facebook)


