Addressing the first challenge of designing AR experiences: the user environment

Simulating user environments with my own Adobe XD plugin

--

Introduction

Imagine yourself in your kitchen. The kitchen is the environment and you are the user. Our job as Augmented Reality designers and developers is to enhance that environment with virtual content that is useful to the user. How will the user visualize that virtual content? Through a smartphone’s camera or an Augmented Reality headset.

While thinking about the user’s environment, the following questions should constantly arise in our minds —

  • Where is the user?
  • What sort of experience can we design that can suit the user’s environment?
  • What kinds of surfaces might be present in the user’s environment?
  • Will there be sufficient lighting in the room to visualize the user’s environment through a phone’s camera?
  • Is the user sitting or standing?
  • How can I reward the user in order to encourage movement?

Answering the above questions can be challenging because the answers are often unpredictable. Because of this unpredictability, issues may arise depending on how you have designed the AR experience. Designers at major tech companies have tackled this challenge by introducing techniques that help explore the user’s environment. Let’s take a look at a few of them —

Surface Plane Detection

One of the many ways to understand the user’s environment is to let the user scan it and look for planar surfaces. The detected planes can then serve as a foundation for AR experiences: once a plane is detected, you can place virtual objects on top of it.

Planes can be either horizontal or vertical. A wooden floor, a table or a mattress are examples of horizontal planes. On the other hand, the walls of a house are perfect examples of vertical planes. These two categories of plane detection have their own use cases.
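Plane-detection APIs typically report each detected plane with a normal vector, and the horizontal/vertical distinction follows from how far that normal tilts away from the world’s “up” axis. Here is a minimal Python sketch of that idea; `classify_plane` and its tolerance value are hypothetical, not taken from any particular AR SDK:

```python
import math

def classify_plane(normal, tolerance_deg=10.0):
    """Classify a detected plane as horizontal or vertical from its
    normal vector (x, y, z), with +y pointing up in world space."""
    nx, ny, nz = normal
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    # Angle between the plane's normal and the world "up" axis.
    angle = math.degrees(math.acos(abs(ny) / length))
    if angle <= tolerance_deg:
        return "horizontal"   # floor, table, mattress
    if abs(angle - 90.0) <= tolerance_deg:
        return "vertical"     # wall
    return "other"            # ramps, slanted ceilings, etc.

print(classify_plane((0.0, 1.0, 0.0)))   # a floor
print(classify_plane((1.0, 0.0, 0.0)))   # a wall
```

A real SDK does considerably more work (feature-point clustering, tracking, plane merging), but the final horizontal/vertical label boils down to a check like this one.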


Horizontal planes can be used to place a variety of objects. Take a look at your own surroundings and notice which objects rest on a horizontal plane: a pedestal fan, a chair, a table, a bed, notebooks, fruits, etc. Ikea takes advantage of this very technique by letting users of its app place furniture around them. This helps users understand how the objects might look if placed in a certain corner of a room or in the middle of it. Additionally, they can play around by moving, resizing or rotating the objects in AR.

Horizontal Plane Detection

Vertical planes can be used to place picture frames or TV screens, because screens (like a monitor or an advertisement board) are typically viewed vertically; we are used to seeing them that way. You can even place lamps, ribbons or balloons and experiment with the decorations for an upcoming wedding ceremony or a birthday party, or play around with the texture of a wall to see which wallpaper suits your living room.

Light Estimation

When light falls on an object, you see a shadow around it. We are so used to this that we don’t think about it deeply, yet our eyes cannot see any object without light. Without light there are no shadows, and without shadows, depth perception becomes a challenge. Similarly, light plays a crucial role in any AR experience.

3D designers such as modellers and motion graphics professionals play with light and shadow so well that it becomes almost impossible to differentiate between the real and the virtual. When they create 3D assets (for example, a 3D representation of a chair), they begin with no lighting and concentrate on the structure first. You can relate it to drawing: when you start sketching an object, such as a fruit, you create the outline first and then improve upon it with paint or colors that enhance the realism.

Think of a box. The general structure of that box is a cube. So you create a cube first and then improve its look and feel, by implementing details such as a shadow or a reflection, to make it look like a gift box or a wooden crate.

Always use wrapping papers

Game designers and artists are experts at creating such experiences thanks to their passion, patience and hard work. Game environments look gorgeous due to the implementation of nearly accurate lighting.

Let’s take a look at two basic forms of light —
1. Point light — light emits uniformly in all directions.
2. Spot light — light emits in one direction through a cone. The width of the cone angle controls the area it illuminates.
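As a rough illustration of the difference, a point light illuminates everything around it, while a spot light only illuminates points that fall inside its cone. The following Python sketch (with a hypothetical `in_spotlight_cone` helper) checks that condition by comparing the angle between the cone’s axis and the direction to a point:

```python
import math

def in_spotlight_cone(light_pos, cone_dir, cone_angle_deg, point):
    """Return True if `point` falls inside a spot light's cone.
    `cone_dir` is the unit direction the spot light faces."""
    to_point = tuple(p - l for p, l in zip(point, light_pos))
    dist = math.sqrt(sum(c * c for c in to_point))
    if dist == 0:
        return True  # the point sits exactly at the light
    # Angle between the cone axis and the direction to the point.
    dot = sum(d * t for d, t in zip(cone_dir, to_point)) / dist
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    return angle <= cone_angle_deg / 2.0

# A spot light at the origin pointing down the z axis with a 30° cone:
print(in_spotlight_cone((0, 0, 0), (0, 0, 1), 30, (0, 0, 5)))  # True
print(in_spotlight_cone((0, 0, 0), (0, 0, 1), 30, (5, 0, 1)))  # False
```

Widening `cone_angle_deg` grows the illuminated area, which is exactly the control mentioned above.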

Some other forms of light are directional lights, area lights, volume lights and ambient light.

Objects closer to the light will be brighter and objects further away will be darker. This is a basic principle of light. Artists use the right mix of lights to create visually appealing and realistic game environments. The same concepts can be followed while designing AR experiences.
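The “closer is brighter” principle is commonly modeled with inverse-square falloff, where brightness drops with the square of the distance from a point light. A minimal sketch, with a hypothetical `brightness` helper and an assumed simple point-light model:

```python
def brightness(intensity, distance, min_distance=0.01):
    """Inverse-square falloff: perceived brightness drops with the
    square of the distance from a point light."""
    d = max(distance, min_distance)  # avoid division by zero up close
    return intensity / (d * d)

# Doubling the distance quarters the brightness:
print(brightness(100.0, 1.0))  # 100.0
print(brightness(100.0, 2.0))  # 25.0
```

Real engines usually let artists blend several attenuation curves, but this inverse-square behavior is the physical baseline they approximate.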

Sound Design

Ambient sound in AR experiences is as important as the visuals. Ambient sound refers to the background noise present at a given scene or location (the user environment). This includes, but is not limited to, noises such as rain, crickets chirping, a tiger’s roar, etc.

Controlling audio can be challenging. Applications right now simply use the phone’s speaker to produce audio cues. Consider an experience where the user places a virtual speaker in his or her environment. Once the user moves away from the speaker, the sound should naturally become softer. It is important to understand that the virtual speaker, not the phone, is the audio source emitting sound in the user’s environment.
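One simple way to model this is distance-based rolloff around the virtual speaker, similar in spirit to the attenuation settings game engines expose. A hedged Python sketch with a hypothetical `speaker_volume` helper that fades the volume linearly between a near and a far distance:

```python
def speaker_volume(speaker_pos, listener_pos, max_volume=1.0,
                   min_dist=1.0, max_dist=10.0):
    """Linear rolloff: full volume within `min_dist` of the virtual
    speaker, silent beyond `max_dist`."""
    dist = sum((s - l) ** 2 for s, l in zip(speaker_pos, listener_pos)) ** 0.5
    if dist <= min_dist:
        return max_volume
    if dist >= max_dist:
        return 0.0
    # Fade linearly between min_dist and max_dist.
    return max_volume * (max_dist - dist) / (max_dist - min_dist)

# Walking away from the virtual speaker softens the sound:
print(speaker_volume((0, 0, 0), (0, 0, 0.5)))   # 1.0
print(speaker_volume((0, 0, 0), (0, 0, 5.5)))   # 0.5
print(speaker_volume((0, 0, 0), (0, 0, 12.0)))  # 0.0
```

The key design point is that the attenuation is computed from the virtual speaker’s position in the user’s environment, not from the phone that happens to be playing the sound.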

Combining visuals and audios can create unique experiences like these👇🏻

Source: Reddit

For all AR developers, it’s a challenge to create different user environments just to test the experiences they have developed. It is monotonous to first build the scene in Unity and then test the AR experience on a real device. Much of it is boilerplate work that slows down the overall design and development process. Generating a build and iterating on the content over and over again is a pain.

Unity MARS

Unity MARS addresses the above problems by simulating different environments conveniently and quickly. The tool allows Unity developers to test content against synthetic or recorded world-tracked environments, such as rooms with surfaces and other features. MARS places virtual content in the real world by detecting those surfaces.

You can even test the app from the perspective of a simulated camera using a sub-mode of the Simulation view called Device View. You can simulate environments such as a backyard, a bedroom, a billboard, a dining room, a factory or a kitchen, and view those environments from any point inside the room. It is similar to holding a phone and viewing the environment around you through its camera. This means you get a good idea of how your content will look when deployed in a real environment such as your own bedroom or kitchen. Won’t this prove incredibly helpful?

I haven’t had the opportunity to play with Unity MARS yet because it was released while I was planning this article. I will write an article as soon as I test it out. You can go ahead and try it out as well. Here is the link to Unity MARS.

WebVR Exporter for Adobe XD

Adobe XD is widely used by designers to design user interfaces for smartphones and the web. So I designed a user interface for an AR Movies concept where the user can browse top movies and play trailers in AR. This gives the user the freedom to watch a trailer without any restriction on the width and height of the screen. Below are some of the environments, with different lighting conditions and colors, which I have used to show what the interface might look like when placed in different surroundings.

Notice how the “Top Movies” label is visible in a well-lit environment but not in a dark one

Adobe XD is great indeed, but I wanted to design the user interfaces and test them out immediately in an environment, even a simulated one. To do this, I could leave Adobe XD and launch Unity, start developing the UI there, build it out and test it on a real device, but I found that process slow. The questions I asked at this point were the following —

  • Why should a designer working in Adobe XD even think of opening Unity?
  • What if I could let the designer test within Adobe XD itself?

So I developed a plugin that lets you place those user interface elements and anything that you design in a user environment of your choice. See demo below👇🏻

Demo for testing UI elements in a User Environment for AR and VR

The movie interfaces can be placed in a similar way. You can export any artboard in Adobe XD and test it out in real time right inside your browser. The experience is the same as a VR experience where you are the user, and you can literally pick any high-resolution picture to create an environment quickly and test your 3D assets. You may follow this detailed guide on how to use my plugin to understand this further. I tweeted about this plugin and it got a fairly good response. I also asked some designers, and they felt it should be pretty useful.

Tweet

By the way, I developed this plugin using a bit of HTML, CSS, JS and the Adobe XD plugin API. I mostly work in Unity, but I am working on some front-end stuff as well (learning from freeCodeCamp! It’s great). So feel free to improve the plugin (much needed) by contributing to the project. This should prove very useful for designers and the AR/VR community in general :)

Tip — While working with user interfaces in AR/VR experiences, take care not to overload the user’s environment with data. Show only what is required. Keep the UI minimalistic, pleasurable and user-friendly.

Conclusion

Augmented Reality offers you the flexibility of controlling the user’s environment virtually. This plays a crucial role in improving the user’s sense of immersion. With the advent of AR glasses in the near future, things will improve even further. There is a lot more that goes into designing and developing AR/VR experiences, and turning good AR experiences into great ones is essential. I will be writing more articles on this topic. If you wish to get notified about the articles or other updates that I share with my connections, consider following me on Twitter, LinkedIn or here on Medium.

Thanks for reading🥳.

Side note

Interaction Design Foundation provides top-quality online design courses in collaboration with top universities and companies. My yearly subscription to the Interaction Design Foundation (where I have learnt UX design) is ending soon. If you want to learn about UX design, you can enroll using my URL and get 2 months of free membership: https://www.interaction-design.org/invite?r=rajat-kumar-gupta.
