This article features the latest episode of The AR Show. Through a new collaboration, episode coverage now joins AR Insider’s editorial flow, including narrative insights and audio. See past and future episodes here or subscribe.


One of the challenges in spatial computing — besides getting over the consumer adoption hump — is building things that are truly native to the technology. As history suggests, we tend toward “habit creep,” conceptualizing new technologies within the mental models of current ones.

That’s not such a bad thing, as those metaphors can be cognitive training wheels to grasp new paradigms. But eventual success in building products around new tech also requires a native mindset. What are the new interactions the technology affords, and how should they be built?

This is a school of thought that Timoni West has mastered (see our past interview). With a solid tenure in interface design, she now leads XR research for Unity — a seemingly perfect gig for the UX maven. She unpacks her philosophies with Jason McDowall on The AR Show (audio below).


Understanding UX design in spatial computing first requires knowing the fundamentals of how humans interact with computers. Just like language, computing interfaces are an abstraction layer. And sometimes their command logic is too abstracted from user intent or expectations.

“I think language is a very interesting medium insofar as it’s always an abstraction…  And it’s true of programming languages [but] they have to be so very precise to work at all. And that’s not necessarily the case for almost any other type of language that you use regularly… When computers fall short today, they usually do so because they’re using an unfamiliar or unusual or poorly applied abstraction or metaphor. They are unforgiving in a way that is not obvious to the user.”

With that underlying challenge in mind, a few computing milestones over the past 15 years have provided a better toolset for human-intuitive interactions. The first was mobile: not just portability but location awareness, which opened the door to contextually relevant actions.

“The computer knew where it was and where it was going. The advent of networked, geo-located devices was a pretty tremendous shift, because that not only attaches itself to basic stuff like maps, but then also could allow for permission sets based around where you were and what you were doing — contextual triggers. That was sort of the beginning of really moving computers into being aware of the world itself in a way that was never done before with traditional desktops.”
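To make that concrete, here’s a minimal Python sketch of the kind of contextual trigger West describes: a hypothetical geofence that fires an action when a networked, location-aware device reports a position inside a named region. The `Geofence` class, coordinates, and callback are illustrative assumptions, not any real platform’s API.

```python
import math
from dataclasses import dataclass
from typing import Callable

# Hypothetical geofence trigger: an action fires only when the device
# reports a position inside a named region. Illustrative, not a real API.

@dataclass
class Geofence:
    name: str
    lat: float
    lon: float
    radius_m: float
    on_enter: Callable[[], None]

    def contains(self, lat: float, lon: float) -> bool:
        # Equirectangular approximation; fine for small radii.
        k = 111_320  # meters per degree of latitude (approx.)
        dx = (lon - self.lon) * k * math.cos(math.radians(self.lat))
        dy = (lat - self.lat) * k
        return math.hypot(dx, dy) <= self.radius_m

fences = [Geofence("office", 37.7793, -122.4193, 100,
                   lambda: print("Unlock work apps"))]

def on_location_update(lat: float, lon: float) -> None:
    for fence in fences:
        if fence.contains(lat, lon):
            fence.on_enter()

on_location_update(37.7794, -122.4194)  # inside radius -> "Unlock work apps"
```

The permission sets West mentions follow the same shape: instead of printing, the callback would grant or revoke capabilities based on where the user is and what they’re doing.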

The second shift was the Moore’s Law-driven proliferation of sensors. That started with smartphones, then moved into IoT. It also lays the foundation for key components of the spatial computing stack, such as positional tracking, computer vision and biometrics.

“We have a variety of small portable networked devices that have a tremendous amount of sensor information. So all of the cameras, all of the photoreceptors, all of the microphones, all of the gyroscopes. I think there’s even a barometer in some phones. Extending this out to head-mounted displays, having pass-through cameras, having inward-facing cameras… that allows computers to know more about what’s happening in the world, but also allowing computers to know more about users themselves… that profoundly changes what we are then able to do with computers.”
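As a rough illustration of why that sensor inventory matters, below is a classic complementary filter: it fuses gyroscope and accelerometer readings into a single pitch estimate, the kind of primitive that positional tracking builds on. The function name and sample values are assumptions for the sketch.

```python
import math

def complementary_pitch(pitch: float, gyro_rate: float,
                        accel_y: float, accel_z: float,
                        dt: float, alpha: float = 0.98) -> float:
    """Blend integrated gyro rate (smooth, but drifts) with the
    accelerometer's gravity-based pitch (noisy, but drift-free)."""
    gyro_pitch = pitch + gyro_rate * dt          # integrate angular rate
    accel_pitch = math.atan2(accel_y, accel_z)   # tilt from gravity vector
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

# Feed in a couple of (gyro, accel) samples at 100 Hz.
pitch = 0.0
for gyro_rate, ay, az in [(0.1, 0.05, 0.99), (0.1, 0.06, 0.99)]:
    pitch = complementary_pitch(pitch, gyro_rate, ay, az, dt=0.01)
print(f"estimated pitch: {pitch:.4f} rad")
```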

That brings us to spatial computing, which could be another inflection point in human-computer interaction. As you can tell, West partly defines these inflections by how much they thin the abstraction layer and engender human-intuitive interfaces. Some of that comes down to the display.

“We have these pieces of hardware that we carry around with us and look at and touch and interact with… A counterexample would be a projector, where the digital object is not where the projector is, it’s 20 or 40 or 60 feet away… Head-mounted displays for AR do require hardware, but you interact with them a little bit more like a projector in that where you perceive the digital objects to be has nothing to do with where the hardware is. I think this is a natural next step to feeling as if digital objects are in the real world and have a sense of stability and autonomy [that] the digital world doesn’t have today, when it seems like all of the digital objects are sort of trapped behind the screen. I think this is another crucial step towards making computers feel more accessible and realistic.”
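One way to picture that projector-like property: the virtual object is anchored in world coordinates, and each frame the system re-derives where it sits relative to the headset. A minimal sketch follows, assuming a simple yaw-only pose; the names are illustrative, not any real engine’s API.

```python
import numpy as np

def world_to_device(anchor_world: np.ndarray,
                    device_pos: np.ndarray,
                    device_yaw: float) -> np.ndarray:
    """Transform a world-space point into the device's local frame."""
    c, s = np.cos(-device_yaw), np.sin(-device_yaw)
    rot = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])  # yaw about z
    return rot @ (anchor_world - device_pos)

anchor = np.array([2.0, 0.0, 1.5])  # virtual object fixed in the room

# As the headset moves, the object's device-relative position changes,
# but its world position (the anchor) never does.
for pos, yaw in [([0, 0, 1.6], 0.0), ([1, 0, 1.6], 0.3)]:
    print(world_to_device(anchor, np.array(pos, float), yaw))
```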

But of course, there’s a lot more to creating these intuitive interactions. Apps need to be rethought, says West, from their current siloed nature where the data layer is tied to the app container layer. Decoupling them is a step towards simulating how we shift between tasks in reality.
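Here’s one sketch of what that decoupling might look like in code: a shared data store that lives outside any single app, with apps acting as views over it. All class names here are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class SharedStore:
    """The data layer, owned by no single app."""
    notes: list[str] = field(default_factory=list)

class NotesView:
    """One 'app container' writing to the shared layer."""
    def __init__(self, store: SharedStore):
        self.store = store
    def add(self, text: str) -> None:
        self.store.notes.append(text)

class AssistantView:
    """A different app reading the same data, with no export/import step."""
    def __init__(self, store: SharedStore):
        self.store = store
    def summarize(self) -> str:
        return f"{len(self.store.notes)} note(s) on record"

store = SharedStore()
NotesView(store).add("Call Timoni about XR input research")
print(AssistantView(store).summarize())  # -> "1 note(s) on record"
```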

And then there are inputs, and the ability to infer human intent from things like voice, touch and gestural actions. West believes we’ll have to customize inputs through per-user onboarding, given the nuances in communication across cultures and individuals. It can’t be one-size-fits-all.
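As a toy illustration of that onboarding idea, the sketch below calibrates a pinch-gesture threshold from a user’s own samples rather than shipping a global constant. The numbers and function names are assumptions, not a real recognizer.

```python
from statistics import mean, stdev

def calibrate_pinch(samples_mm: list[float]) -> float:
    """Derive a per-user pinch threshold: sample mean plus a margin."""
    return mean(samples_mm) + 2 * stdev(samples_mm)

def is_pinch(distance_mm: float, threshold_mm: float) -> bool:
    return distance_mm <= threshold_mm

# Two users pinch with different finger separations during onboarding;
# each gets a threshold tuned to their own samples.
for user, samples in {"a": [8, 9, 10], "b": [14, 16, 15]}.items():
    t = calibrate_pinch(samples)
    print(user, round(t, 1), is_pinch(12, t))  # same input, different verdicts
```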

But despite being a realist about all these challenges, West is far from pessimistic. She believes we’ll realize spatial computing’s promise — which will take decades or longer — if we’re positive about how it will impact our lives versus the dystopian route often invoked (if it bleeds, it leads).

“Start thinking about what it is that you want… I find that I do not often meet people who have a clear and articulate view of why they want what they want, or what they think is so cool about [AR]. It’s much easier to go to a dystopia of what we don’t want… It doesn’t have to be a big thing, and it doesn’t have to be across a series of decades, but think about if you had the ability to tie together all the devices in your home, or have AR glasses able to create a virtual space, what would you want? And just think of one use case or one thing that you’d really love to do. And hold on to that and think about how you can bring that to life.”

Listen or subscribe to the full episode at The AR Show or below, and see our archive of past and future episode coverage here.



For deeper XR data and intelligence, join ARtillery PRO and subscribe to the free AR Insider Weekly newsletter. 

Disclosure: AR Insider has no financial stake in the companies mentioned in this post, nor did it receive payment for its production. Our disclosure and ethics policy can be seen here.