
All you need to know about Application SpaceWarp in Unity: how-to and review

At Facebook Connect 2021, Meta announced Application SpaceWarp, a new solution to boost the framerate of Oculus Quest 2 applications by up to 70%. These days I have finally managed to try it, so I can tell you everything I have learned about it:

  • What is it
  • How does it work
  • How to implement it in your Unity application
  • Pros and cons

Get ready because it will be a very interesting post if you are a developer!

Application SpaceWarp on Quest 2 – A Video with all you need to know

I have made a very long and detailed video on Application SpaceWarp: it’s probably my longest video tutorial to date. Inside it there is everything I’ll write in this article, including a step-by-step tutorial in which I show how to create, from scratch, a Unity experience that integrates Application SpaceWarp. If you like videos, take a comfortable seat and enjoy it. I’m probably not super enthusiastic in this video (sorry, I’ve been quite tired lately), but I think I packed a lot of interesting content into it.

If you prefer the written word, don’t worry, I’ve got you covered here below too. I’ll just assume that you are a bit more of a pro, and I’ll write the instructions for the integration in Unity without explaining every tiny detail that should already be obvious to you. If you are a Unity beginner, I would advise you to watch the video. If words like Package Manager and Git don’t scare you, you will be fine with the textual explanation.

(And whatever you’ll choose, please support me on Patreon so that I can keep writing these in-depth tutorials!)

What is Application SpaceWarp?

Application SpaceWarp is a projection technique that boosts the framerate of your application. Thanks to Application SpaceWarp, which can be shortened to AppSW, your Oculus Quest 2 experience can run at half the frame rate without impacting how the user sees it. That is, if you want the experience to be perceived at 72 FPS by the user, your application can actually render at just 36 FPS, and the Oculus runtime will take care of synthesizing the missing 36 frames per second, so that the user actually sees 72 frames per second. Notice that it doesn’t work only at 72 FPS, but also at 90 and 120 FPS; in those cases, the application renders at 45 and 60 FPS respectively.

Rendering times of a frame without and with Application SpaceWarp. Notice that with AppSW, the time budget is much bigger (Image by Meta)

Since the engine of the application must render just half of the frames, it has double the time budget to render each frame, so it can handle an application that is much more complex than the usual Quest app. Actually, it is not exactly double the budget, because synthesizing the missing frames takes some time itself, so the total gain can "only" be up to +70%. This is anyway a huge gain, a boost comparable to a generational leap of headsets: we imagined getting +70% more performance maybe with Quest 3 or 4, while it seems we can have it already now, increasing the graphical quality of Quest apps. That’s why, when the news was revealed, all magazines and YouTubers talked about a disruptive innovation in all-caps sentences.

But is it actually this good? And how complex is implementing it? Let me answer these and many other questions in the remainder of this article.

How does it work?

Application SpaceWarp works by reconstructing frames and reprojecting them. To do this, it requires the running application to perform the following operations:

  • It activates AppSW. AppSW differs from other warping techniques in that it is not the runtime that triggers it, but the developer who requests it. While, for instance, ASW was triggered automatically when the application couldn’t render at full framerate (so the runtime rendered half of the frames and synthesized the other half), AppSW can never be triggered autonomously by the runtime. It’s always the developer who asks the runtime to activate it in the application, and he/she can choose to enable and disable it at any time. Even if you just render a cube, if you want to activate AppSW, you can.
  • It provides the Oculus runtime with the depth texture. The depth texture is a special texture that stores, for every pixel, how far the object that the main camera sees through that pixel is from the camera. It is a distance map of the world as seen from the main camera. The depth texture is easy to calculate in many game engines: in Unity, it is just a flag in the properties of the scriptable render pipeline asset (you can also toggle it from code, as shown in the small sketch after the figure below). The depth texture can be at a lower resolution than the eye texture (the actual color frame you see in front of your eyes), because you don’t need it to be super detailed;
  • It provides the Oculus runtime with the motion vector data. This is a bit tricky: motion vectors define how much the pixels in an image have moved with respect to the previous frame. Let’s make an example: you have just a cube in your scene, and in the previous frame, the cube was rendered as a square at the origin of your frame, so at coordinate (0, 0). You are moving sideways, so in the current frame, the square has moved right and is now at coordinates (100, 0). The motion vector data will be an image in which all the pixels of the square in the current frame contain the value +100, because they have moved 100 pixels to the right with respect to the representation of the same object in the previous frame; all the other pixels of the motion vector image will be zero (in the scene there is only a cube, so the rest is background, which doesn’t move, so its motion vector is 0). If you are into video compression, you will recognize that some video compression algorithms perform similar operations, trying to understand how blocks of pixels move during the playback of the video. The logic here is similar: the system reports how the various parts of the image have moved with respect to the previous frame. The motion vector texture can also be at a lower resolution than the eye texture.
Example rendered frame (right) with the depth texture (lower left) and motion vector image (lower right). (Image by Meta)
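
Speaking of the depth texture, this is a minimal sketch (my own example, not Meta’s code) of how that same URP flag can also be toggled from a script, assuming your project uses URP:

using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Minimal sketch: enables the URP "Depth Texture" flag from code.
// This is the same setting you can tick in the URP asset inspector.
public class EnableDepthTexture : MonoBehaviour
{
    void Awake()
    {
        var urpAsset = GraphicsSettings.currentRenderPipeline as UniversalRenderPipelineAsset;
        if (urpAsset != null)
        {
            urpAsset.supportsCameraDepthTexture = true;
        }
    }
}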

All these data, together with the current rendered frame, are fed to the compositor. Using the motion vector data, and given the current frame, the system can predict what the next frame will likely look like. Using the example above, if the cube has moved from position 0 to 100 in this frame, at the next frame it will probably be at position 200. If we have to synthesize the predicted frame that sits halfway between two consecutive rendered frames, the cube will probably be at position 150. Remember that the Quest renders at high framerates, so the movements between consecutive frames are very likely small, and a linear prediction should work most of the time. Applying this kind of reasoning to all the rendered objects, the system is able to more or less synthesize the missing frames that the engine is not rendering.
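
If it helps, here is the same reasoning written as a tiny, purely illustrative C# snippet (this is not how the compositor is implemented; it just shows the arithmetic of the cube example above):

using UnityEngine;

// Purely illustrative: the arithmetic of the cube example above.
public static class MotionVectorExample
{
    public static Vector2 PredictHalfwayPosition()
    {
        Vector2 previousPos = new Vector2(0, 0);    // cube position in the previous rendered frame
        Vector2 currentPos = new Vector2(100, 0);   // cube position in the current rendered frame

        Vector2 motionVector = currentPos - previousPos;        // (100, 0): what the motion vector texture stores
        Vector2 halfwayPos = currentPos + 0.5f * motionVector;  // (150, 0): position in the synthesized in-between frame

        return halfwayPos;
    }
}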

These synthesized frames also have to be shown in front of the eyes of the user, and the user’s head keeps moving while all these frames are being produced. So we have a synthetic frame, but it has been generated using the information we had at the previously rendered frame, that is, with the camera at the position of the previous frame. We should distort this frame to simulate how that rendered image would be seen from the new position of the head. Luckily, IMUs have very high sampling rates, so the system knows the head position at the instant of the synthesized frame, and it can reproject the computed frame to the new head position. It also uses the depth data (which tells it how far away the objects in the synthesized frame are) to make a more accurate reprojection, estimating how a user at the new head position would see the objects of the frame that was synthesized for the previous position.
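
Just to make the idea of depth-based reprojection more concrete, here is a purely conceptual snippet (this is not what the runtime does internally; it only shows the principle, using Unity’s camera API):

using UnityEngine;

// Purely conceptual: the principle of depth-based reprojection.
public class ReprojectionExample : MonoBehaviour
{
    public Camera previousCamera; // camera pose used when the frame was synthesized
    public Camera currentCamera;  // camera pose at display time (known thanks to the IMU)

    // Given a pixel and its depth as seen from the previous camera pose,
    // find where that same point lands on screen for the current camera pose.
    public Vector3 Reproject(Vector2 pixel, float depth)
    {
        Vector3 worldPos = previousCamera.ScreenToWorldPoint(new Vector3(pixel.x, pixel.y, depth));
        return currentCamera.WorldToScreenPoint(worldPos);
    }
}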

Long story short: using all these data, the runtime computes synthetic frames and shows them to you in the best way possible. It does some magic to fill the gap you leave by rendering only half of the frames.

If you like technical details, I can also add that AppSW exploits other optimization features already announced by Oculus. Copying from the AppSW announcement blog post:

To reduce latency when using AppSW, we delivered a full pipeline optimization with the following technologies:

  • Phase Sync: Coordinates the application frame and makes it start at the right time according to the application’s workload. HMD and controller sensor reading is performed as late as possible to reduce both HMD and controller latency. Please see our Phase Sync blog post for more details.
  • Late Latching: Further delays the sensor reading time to the end of the CPU render frame, which saves one frame of latency. Please see our Late-Latching blog post for more details.
  • Positional TimeWarp (PTW): Asynchronous TimeWarp can correct HMD rotational error right before displaying, which is enabled in every Quest application. PTW can use depth buffer data to further correct HMD translation latency, which will be automatically enabled with AppSW applications. From a pure HMD latency point of view, we can even say AppSW apps have better latency than full FPS applications without AppSW when using PTW.

These techniques operate at different timings during the rendering of each frame, and you can see when they are applied in the following diagram.

Here you can see when the various technologies used by Application SpaceWarp are applied in the rendering process (Image by Meta)

All of these are interesting tech details, but for sure you are left with the same question that I had in my mind while reading all the info on the Oculus blog post: how the hell do I calculate motion vectors in my app?

How to calculate motion vectors

I have bad news for you: there’s no easy way to calculate motion vectors. In the development guide for AppSW, Meta provides suggestions on how you can implement them, for instance by modifying every single shader in your project so that it computes the difference in the position of the rendered pixels between the current and the previous frame. It even goes as far as providing the pseudocode for how you can actually integrate it. It is a lot of work on the development side, because it requires you to modify by hand every single shader you have in the project… including the standard ones provided by Unity o_O.

And even if you compute motion vectors, it is still not clear how you can pass them to the Oculus OpenXR runtime.

My reaction while reading the requirements for AppSW

Luckily, Meta gives us a helping hand in understanding how to integrate the motion vector logic, and also how to test AppSW out of the box without going crazy. Meta engineers have worked on a custom version of the Universal Render Pipeline with custom versions of its main shaders (URP/Lit, URP/Unlit, etc…) that already implement all the motion vector and depth texture stuff out of the box. This means that if you use this custom version of URP in your project and you stick to its standard shaders, everything works without you having to change anything.

If you use another pipeline (e.g. HDRP), you should analyze what Meta has changed in its custom version of the pipeline with respect to the original one and port these modifications into your project, modifying your render pipeline. The same holds for your custom shaders: by analyzing how Meta has modified the Lit and Unlit shaders of URP, you can understand how to modify your own. Of course, this may be a hell of a lot of work if you have many custom shaders, and you should discuss with your team whether the effort is worth the computational advantage you may gain by using AppSW. Ah, and at the moment, AppSW doesn’t work out of the box with Shader Graph shaders: Meta is thinking about how to solve this in the future, but at present, it is another big limitation.

AppSW Unity prerequisites

Shaders and pipelines are not the only things to consider when deciding whether to implement AppSW in your project. At the time of writing, AppSW works ONLY with Unity 2020.3 LTS from patch version 22 onwards (that is, from 2020.3.22 on). It doesn’t work with Unity 2019, and it doesn’t work with Unity 2021.

This is a pity, because I was investigating it to possibly implement it in our game HitMotion: Reloaded, to give it a boost and make it render at 90 or 120 Hz, but since HitMotion is still made with Unity 2019 at the moment, we decided to postpone a possible implementation to 2022 or beyond. This is another thing you have to discuss with your team: can you afford a Unity version change to integrate this feature? If not, you have to consider other optimization techniques like fixed foveated rendering.

(But since we are talking about it, could you download our game HitMotion and support it with a positive review if you liked it?)

How to integrate AppSW in your Unity project

Enough with the theory. Let’s get our hands dirty by developing an actual Unity application that integrates Application SpaceWarp! And it will be an epic app: The Unity Cube with SpaceWarp!

Actually activating AppSW in your project is very easy: it is just one single line of code. The hard part is fulfilling all the prerequisites and configuring your project properly. But don’t worry, I’ll spare you some headaches by telling you how I solved all the issues I found during the process.

(As I’ve told you before, this step-by-step written guide is very easy to follow, but it assumes you are not a total beginner… if you need further guidance, watch the video above, where you can see me developing the project live in front of you).

Let’s start by downloading Unity 2020.3 LTS from the Unity LTS page. At the time of writing, the latest LTS version is Unity 2020.3.24, but you should get the most recent one you can find. When you install it via the Unity Hub, don’t forget to also install Android support, with the Android SDK and NDK, or you won’t be able to develop for the Quest. Also, if you can, don’t install Unity in a directory that needs Admin privileges to be modified (like C:\Program Files), otherwise updating the Android SDKs becomes complicated.

After Unity is installed, create a new Unity 2020.3 project, call it AppSWCube, and be sure to select Universal Render Pipeline as the project template. Confirm and let Unity create the project. We choose URP because Meta gives us the modified URP version that makes things work out of the box, so it’s convenient for us.

Once Unity has finished creating the project, go to the Build Settings (File -> Build Settings…) and switch to the Android platform. Let Unity do its stuff.

Let’s activate XR Plugin Management. In the Build Settings window, select Player Settings… and in the Settings window, choose the “XR Plugin Management” tab and select Install XR Plugin Management. After the installation is complete, you will see some new options to select the XR Plugin of your interest, for both PC and Android. Select Oculus (for Android, and if you want, also for PC), and let the system do its stuff.

Oculus XR is activated in the project settings

We have just installed Oculus support for Unity XR, but the problem is that to have AppSW, we need version 2 of the plugin, while at the time of writing, Unity installs just version 1. So, open your project folder with File Explorer, and go to <AppSWCube Project Folder>\Packages\. Open the file manifest.json with a text editor, and locate the line that mentions “com.unity.xr.oculus”. If the version number after it starts with 1, substitute it with “2.0.0-preview.1”. In the end, that line should appear as

"com.unity.xr.oculus": "2.0.0-preview.1",

After you have done that, save the file and return to Unity, which should now spend some time updating the Oculus package. You can tell that you have the right version of the Oculus package because, if you go to the Oculus options inside XR Plugin Management, you now see a checkbox for Application SpaceWarp, which is absent in version 1 of the plugin.

If you select Oculus under “XR Plug-In Management”, you see that now there is an option saying “Application SpaceWarp”. This is what we were looking for

It’s now time to download the custom version of the URP pipeline created by Meta that already implements all the motion vector magic. It lives in a special branch of Meta’s fork of URP, which you can find here. You have to download it to your disk, but beware that if you do that by clicking “Download ZIP”, it may make a mess with the LFS (Git Large File Storage) files in the repo, so my suggestion is to clone the repo that Meta has created. To do that, use your Git client, which must also have Git LFS installed (if it doesn’t, please install it), and clone the branch “2020.3/oculus-app-spacewarp”. These are the settings I used with my TortoiseGit client to perform the cloning operation:

Notice how I am cloning the main fork from Oculus, and I am specifying to clone the branch “2020.3/oculus-app-spacewarp”. I am also checking support for LFS

If you are not much into Git, read an online guide about it… it’s pretty easy to use it just to clone repositories 🙂
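
For reference, if you prefer the plain command line over TortoiseGit, the clone looks more or less like this (replace the placeholder with the URL of Meta’s URP fork linked above; the branch name is the important part):

git lfs install
git clone --branch 2020.3/oculus-app-spacewarp --single-branch <URL_OF_META_UNITY_GRAPHICS_FORK>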

Downloading the repo takes some time (it’s around 5 GB). When the download is over, verify that the LFS files have been downloaded correctly by trying to open some of the image files contained in the directory <URP_REPO_CLONE_DIRECTORY>\Unity-Graphics\com.unity.render-pipelines.core\Editor\LookDev\Icons. If they open correctly, you have cloned the project the right way; if, when opening them, Windows says that the file is corrupted or in a bad format, you have not downloaded the LFS files correctly. Make sure that LFS is installed on your system and that you have activated it for this cloning operation. Use some guide on the web if you need help with LFS.

Once you have verified the download, let’s continue the configuration of the project by installing the Oculus Integration from the Unity Asset Store. I prefer doing this before actually integrating the URP version that we have downloaded, also because this way I can work on the Oculus integration while URP is still downloading, to save some time. Go to the Asset Store, which in Unity 2020 is actually online only, look for “Oculus”, and select Oculus Integration. Click to import it into the project; at this point, the browser will ask you to open the package with Unity. Confirm, and the Package Manager opens: inside it, check that you have Oculus Integration v34 or above, and if that is not the case, update the package. Let the Unity Package Manager import the Oculus Integration into your project. When the popup asks you which contents of the package to import, select only the “VR” directory and the OculusProjectConfig.asset file. All the rest is not necessary; import it only if your project needs it. For our cube, we don’t need other special features, so we’ll stick to the simple VR directory.

The Package Manager through which I have imported the Oculus Integration package. Notice that only the VR directory is actually needed to work with AppSW

If the system asks you to update the Oculus plugin, say yes; if the plugin asks you whether to activate OpenXR, say yes (“Use OpenXR” button); if you are asked to restart, say yes. Remember that all new Oculus features require OpenXR, so we must have OpenXR enabled.

When Unity has restarted, we have to modify some of the project settings to enable AppSW. Open Edit -> Project Settings… .

Select the “Player” tab. Change the “Company Name” of the app as you wish. In the “Other Settings” section, verify that Color Space is set to Linear. Then, in “Graphics APIs”, remove OpenGL ES and add “Vulkan”, because AppSW only works with Vulkan. This change of graphics API may require a reimport of all your assets, so it can take some time. Check that Multithreaded Rendering, Static Batching, and Compute Skinning are selected. Change the Minimum API Level to 26. For this test, you can keep the Mono scripting backend, but remember that for production builds, you have to switch to the IL2CPP backend and the ARM64 architecture. (If you prefer setting these values from an editor script, see the sketch below the screenshot.)

Your Settings window should look like this
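
If you prefer, most of these Player settings can also be applied from an editor script. This is just a convenience sketch of mine (not something required by AppSW); put it in an Editor folder and run it from the menu item it adds:

using UnityEditor;
using UnityEngine;
using UnityEngine.Rendering;

// Convenience sketch: applies the Player settings described above from code.
// Static Batching and Compute Skinning are easiest to verify directly in the Player UI.
public static class AppSWPlayerSettings
{
    [MenuItem("Tools/Apply AppSW Player Settings")]
    public static void Apply()
    {
        PlayerSettings.colorSpace = ColorSpace.Linear;

        // AppSW only works with Vulkan
        PlayerSettings.SetUseDefaultGraphicsAPIs(BuildTarget.Android, false);
        PlayerSettings.SetGraphicsAPIs(BuildTarget.Android, new[] { GraphicsDeviceType.Vulkan });

        PlayerSettings.SetMobileMTRendering(BuildTargetGroup.Android, true);
        PlayerSettings.Android.minSdkVersion = AndroidSdkVersions.AndroidApiLevel26;

        // For production builds: IL2CPP backend and ARM64 architecture
        PlayerSettings.SetScriptingBackend(BuildTargetGroup.Android, ScriptingImplementation.IL2CPP);
        PlayerSettings.Android.targetArchitectures = AndroidArchitecture.ARM64;
    }
}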

Select the “XR Plug-in Management/Oculus” subtab. You should see some options for the Oculus XR implementation. Set “Stereo Rendering Mode” to Multiview, then activate “Phase Sync”, “Late Latching (Vulkan)”, and “Application SpaceWarp (Vulkan)”. The last two options may be in a category called “Experimental” that you have to expand.

The Oculus Settings to set to activate AppSW

OK, now we are almost done. We “just” need to integrate the custom URP that Oculus has prepared for us. To do that, we have to import some packages from disk: importing packages from disk means instructing the Package Manager to get a package from a directory we downloaded, so not from the Unity Registry, but from somewhere on our disk. Be careful: if you share your project with other team members, their Unity installations will expect to find the packages at the same disk paths you have specified. That’s why, for shared projects, it’s better to have a folder called Dependencies (or a similar name) in the root of the repository, containing the packages that are provided from disk: this way, all team members will have them at the same relative location.
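
To give you an idea of what I mean, if your Unity project sits at the root of the repository next to a Dependencies folder containing the cloned packages, the entries in Packages\manifest.json could use relative file: references like these (hypothetical paths, adapt them to your layout):

"com.unity.render-pipelines.core": "file:../Dependencies/Unity-Graphics/com.unity.render-pipelines.core",
"com.unity.render-pipelines.universal": "file:../Dependencies/Unity-Graphics/com.unity.render-pipelines.universal",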

Open the Package Manager (Window -> Package Manager), click the little “+” in the upper left corner of the window, and select “Add package from disk…”. Then, in the Open File popup window, select <URP_REPO_CLONE_DIRECTORY>\Unity-Graphics\com.unity.render-pipelines.core\package.json. When it’s finished, import the following packages with the same procedure, in this order:

  1. <URP_REPO_CLONE_DIRECTORY>\Unity-Graphics\com.unity.shadergraph\package.json
  2. <URP_REPO_CLONE_DIRECTORY>\Unity-Graphics\com.unity.visualeffectgraph\package.json
  3. <URP_REPO_CLONE_DIRECTORY>\Unity-Graphics\com.unity.render-pipelines.universal\package.json

If, during the process, you see your scene turning magenta, don’t worry: at the end of the import of all the packages, the scene should go back to normal. And if there are some errors in the Console, hitting Clear should delete them all. If you are an expert reader, you may wonder why I am importing “visualeffectgraph”, which is an HDRP-only package… well, believe it or not, when I didn’t do it, I got some compiler errors, so as long as it works, I import it even if it shouldn’t be necessary.

If you ask Unity to show the Packages in your project, you should see these four ones. Notice the “+” sign in the top left that is the one that you use to import the packages

If you have some Console errors that don’t go away, the first thing you can do is close and re-open Unity and see if they disappear. If they don’t, go to the Package Manager, remove all four URP packages imported above (going in reverse order, from the last one to the first one), and then re-import them using the same “Add package from disk…” procedure.

At this point, you should be done. All the prerequisites are set, and your project is ready to run AppSW! But remember that AppSW requires the developer to trigger it, so let’s create a scene, and a script to toggle AppSW on and off.

Your Sample Scene should contain all the sample objects put there by Unity. Delete them all to obtain an empty scene. Put an OVRCameraRig prefab inside it (you can find it in Assets\Oculus\VR\Prefabs) and make sure it is at the origin. Then add a cube to the scene and put it at position (0, 0, 2), so slightly in front of the camera. Now create a script called AppSWActivator.cs, and put this code inside it:
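
Here is a minimal sketch of what the script can look like (the toggle is bound to the index trigger via OVRInput, as described below; feel free to bind it to any other input):

using UnityEngine;

// Minimal sketch: toggles Application SpaceWarp every time the user presses the index trigger.
// Requires the Oculus Integration (OVRManager, OVRInput) imported as described above.
public class AppSWActivator : MonoBehaviour
{
    private bool m_spaceWarpEnabled = false;

    void Update()
    {
        // Index trigger of the primary Touch controller
        if (OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger))
        {
            m_spaceWarpEnabled = !m_spaceWarpEnabled;

            // The only line actually needed to turn AppSW on or off
            OVRManager.SetSpaceWarp(m_spaceWarpEnabled);
        }
    }
}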

This code calls OVRManager.SetSpaceWarp(true), which is the only line needed to actually activate AppSW, as soon as the user presses the trigger button. It then calls OVRManager.SetSpaceWarp(false) when the user presses it again. Notice how easy using AppSW is: you just need one line of code to turn it on or off! The mess is configuring the project, but once it is configured, enabling and disabling it is very easy.

Now, a weird thing may happen when you create the script. Visual Studio may complain that some projects (the ones related to URP) were not imported successfully, either by showing you a popup or by listing one or more projects as not loaded in the Solution Explorer view. If that happens, close Visual Studio, head back to the Package Manager, remove all the URP packages (select them one by one and click Remove), and then import them again. After that, opening Visual Studio again should show all the projects imported correctly. I don’t know why this sometimes happens, but solving it is pretty straightforward.

Visual Studio showing that the projects of Unity.RenderPipelines.* are not available (“non disponibile” means “not available” in Italian). Reimporting all the packages should fix your issues

Once you have successfully created the AppSWActivator.cs script, attach it to the OVRCameraRig (or to whatever other GameObject you want; it is not important). You are done: you can finally build and run your experience!

Testing the app

Before actually launching the experience, I suggest you install and run the OVR Metrics Tool on your Oculus Quest 2. OVR Metrics Tool can activate a persistent overlay on your Quest that constantly shows your framerate, and with its latest version, it can also show the effect of AppSW on your application. I always use it to optimize my Quest experiences, and you should do that too.

If you run your AppSWCube experience, you should see the standard Unity Cube, and the Metrics Tool reporting that the app is rendering at 72 FPS, with ASW TYPE = 0. This means that you are rendering at 72 FPS with AppSW disabled.

Typical aspect of the OVR Metrics Tool. Notice how FPS (first column) is at 72 and ASW TYPE (second column) is at 0, so disabled

Press the index trigger on your left Touch controller (or on the right one, depending on which is considered the primary controller): you should see the framerate on the left drop to 36 FPS, while ASW TYPE is now 1 and ASW FPS is now 72. It means that the Quest is rendering at 36 FPS, but AppSW is now active, so you are still perceiving 72 FPS. Big success!

The FPS has now dropped to 36, but ASW is now active and it is outputting 72 FPS

AppSW and Unity XR Interaction Toolkit

What if you don’t want to use the OVRCameraRig, but you rely on the Unity XR Interaction Toolkit? Well, don’t worry, you can use AppSW in this case too. You have to do exactly the same things as before, but be sure to check the “Disable Eye Anchor Cameras” flag on the OVRCameraRig script of the OVRCameraRig GameObject. If you do that, when you add an XR Origin/XR Rig to the scene (be sure it has a camera tagged MainCamera among its children), that camera becomes the main camera, and the OVRCameraRig is just used to activate AppSW. This means that both the XR Origin and the OVRCameraRig must stay in the scene.

AppSW considerations and caveats

Of course, the cube is a dull example: there is no point in using AppSW for it, because it can reach 72 FPS natively. You should use AppSW for those apps that render at around 36 FPS, so that they can be perceived at the full Quest framerate even if they are computationally heavy. To make a more meaningful test, I took the complex Notre Dame model we used for the concert Welcome To The Other Side, which was PC-only. I imported it, assigned it the URP/Simple Lit shaders, and activated Occlusion Culling. Then I added a realtime Directional Light with shadows enabled, to make everything more computationally heavy. I positioned the camera in the middle of the church and built the application. Without AppSW, the app rendered at 20-40 FPS and was unusable. With AppSW, most of the time it was above 50 FPS, and sometimes even at the full 72 FPS! There was only one point of view from which the framerate dropped a lot, but apart from that, the visuals looked incredibly fluid. This is fantastic, considering that the model was absolutely unoptimized for Quest and that the lighting was fully realtime! I’m sure that with some optimization work, we could reach a full 72 FPS with that scene on Quest 2, something that until a few weeks ago we considered impossible. AppSW really enables a new kind of experience on Quest: when it works, it is impressive, and it can bring unthinkable graphics to standalone VR. Kudos to the Meta team for having managed to develop this feature.

The choppiness that you see is due to the mirroring. Inside the headset, the application appeared fluid

Anyway, there are some caveats when using AppSW, and you have to keep them in mind:

  • You must fulfill all the prerequisites described above, especially the fact that you have to use Unity 2020.3 LTS and you have to modify your shaders to provide the motion vectors. This may be a problem if you have many custom shaders;
  • When testing AppSW applications, I noticed that the world around me reacted “faster” to my movements. I don’t know how to describe it, but it is as if the world overshot my movements, as if the visuals were reacting in a way that was a bit different from usual. It is a sensation, not something I can quantify. It is the opposite of the sensation I have when I increase the framerate from 72 to 120 Hz and everything looks more fluid and responsive. For this reason, I suggest activating AppSW only if it is strictly needed: a native 72 FPS framerate looks better than a synthesized 72 FPS framerate;
  • If there are transparent objects, or textures with particular patterns (e.g. very tiny black lines on a white background), AppSW may generate visual artifacts. Transparent objects always create complications in the rendering process, and in fact they also make the calculation of motion vectors more difficult. I made a test with a scene containing transparent objects with opaque objects behind them, and the artifacts I saw were duplications of the edges of the objects while I was moving. It was as if the edges of the opaque objects behind the transparent ones were mismatched between how the left and right eyes saw them, so I had a sensation of blurred double vision while I moved. This was noticeable for close objects and less noticeable for distant ones, so you can still use transparent objects if they are not close to the user;
  • The same sensation of double vision happened when I used objects rendered with custom shaders not modified to provide motion vectors. I could see them correctly when I was still, but when I moved, I saw them doubled. Again, this was less noticeable for distant objects, so using custom shaders for objects in the background may still be OK;
  • If you use AppSW, mirroring of the application to an external display may show fish-eye distortion. The Meta team is working on fixing this issue;
  • Abrupt changes of position (e.g. teleporting) may show visual artifacts, because of course the system can’t predict the next frame after an abrupt change;
  • Application SpaceWarp occupies some memory to run, so if your application already uses a lot of memory, this can be an issue;
  • AppSW running your application at 36 FPS means that the input from the controllers gets sampled at half the frame rate. If your application needs to be very reactive to input commands, you may have problems with this.
In any case, Application SpaceWarp always renders beautiful cubes 🙂

AppSW is very powerful, but it is not a silver bullet that solves all your problems: you should use it only if the pros outweigh the cons. A great suggestion from the Oculus blog is to try it inside your project to see whether it works well or not: if you see too many artifacts, or there are problems with input sampling, then AppSW is not for you. If the visuals look great, then it can help you achieve much better graphics. Only actually testing it in the field can give you an answer.

AppSW is also not an excuse not to optimize your application: do everything you can to increase your framerate by using the standard optimization techniques (occlusion culling, lightweight shaders, baked lights, etc…) because the more frames you can render, the better.

But again, if used the proper way, it can bring to Quest 2 beautiful, rich experiences with graphics that were unthinkable until a few weeks ago. I can’t wait to see what creators will be able to craft with it.

Further references

Official announcement post of Application SpaceWarp: https://developer.oculus.com/blog/introducing-application-spacewarp/

Full documentation on how to implement Application SpaceWarp in Unity: https://developer.oculus.com/documentation/unity/unity-asw/


And that’s it for this super long tutorial about Application SpaceWarp! I hope it has been useful for you, and if that is the case, support me in some way, so that I can keep writing informative posts for the community. Donating on my Patreon, subscribing to my newsletter, and sharing this post on your social media channels would all be very appreciated gestures 🙂

And don’t forget to let me know what you are going to do with AppSW and to ask questions if you have any doubts!


Disclaimer: this blog contains advertisement and affiliate links to sustain itself. If you click on an affiliate link, I'll be very happy because I'll earn a small commission on your purchase. You can find my boring full disclosure here.
