
This Tech Lets You Move Virtual Objects Like A Jedi

These are definitely the VR powers you’re looking for…

It was only a matter of time. In fact, I’m a bit surprised it took this long, but researchers at Virginia Tech have put their brains to good use and come up with a way for ordinary mortals to replicate the telekinetic powers of a Jedi in VR. 

The recently developed technique is called “Force Push,” and it gives users the ability to move faraway objects with “Yoda-like calm, nuance, and focus.” It’s a bare-handed approach to remote object manipulation in VR: a natural gesture-to-action mapping lets users push, pull, or twirl distant objects with subtle hand gestures.

“You basically push the object in the direction you want it to move to, just like in Star Wars when the Jedi masters try to move an object that’s placed remotely, they can push or pull it,” said Run Yu, a Ph.D. candidate in the Department of Computer Science and the Institute for Creativity, Technology, and the Arts. Yu also authored a recently published paper in Frontiers in ICT detailing the research.

And although the fun factor and Star Wars association are more than enough to justify any number of grant hours dedicated to such pursuits, there are plenty of practical applications that also make this technology rather attractive.

What we’re currently stuck with in most forms of VR is some version of what is effectively a 3D mouse. This is something I find incredibly awkward at times, and there has been many an occasion when I couldn’t get my virtual hand to do what I wanted or go where I willed it to, or when an object stubbornly refused to move because I’d forgotten the right combination of buttons to push in order to grab or release it.

By contrast, I find the HoloLens interface, which works through a mixture of voice and gesture, rather more intuitive. The technology always did make me feel a bit like a wizard (or witch) in a Harry Potter movie, using my own two hands to cast incantations and prompt holograms to do my bidding.

The researchers wanted an interface that was not only more playful, entertaining, and fun, but also genuinely functional, so they tweaked the push-and-pull physics and speed variations until they felt “realistic.” By that they mean the interface responds to the speed and magnitude of hand gestures, accelerating or decelerating objects in a way that users can grasp much more intuitively.

This ability to respond to more nuanced hand movements comes from the technique’s physics-driven algorithms: dynamically mapping rich features of the input gestures to properties of the physics-based simulation made the interface controllable in most cases. According to the release, with Force Push it’s just as easy for users to apply the gentlest of nudges to an object as it is to throw a heavy object across the room.
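To make that mapping a bit more concrete, here is a minimal sketch of the general idea. This is my own illustration rather than the researchers’ actual code, and every name and constant in it (FORCE_GAIN, DAMPING, the toy crate) is an assumption chosen purely to show how gesture speed could scale the applied force.

from dataclasses import dataclass

# Toy illustration (not the Force Push source code): map a tracked hand
# gesture's speed to a force on a remote object, then step a simple physics
# integration so small flicks nudge and big pushes send it flying.

FORCE_GAIN = 4.0   # hypothetical gain: how strongly gesture speed becomes force
DAMPING = 0.98     # hypothetical per-step damping so objects eventually settle

@dataclass
class VirtualObject:
    mass: float
    velocity: float = 0.0
    position: float = 0.0

def apply_push_gesture(obj, hand_speed, direction, dt):
    """Scale the applied force by how fast the hand moved (gesture magnitude)."""
    force = FORCE_GAIN * hand_speed * direction
    obj.velocity += (force / obj.mass) * dt   # F = ma, integrated over one frame

def step(obj, dt):
    obj.velocity *= DAMPING
    obj.position += obj.velocity * dt

# A gentle flick barely nudges the crate; a fast shove sends it much farther.
crate = VirtualObject(mass=2.0)
apply_push_gesture(crate, hand_speed=0.2, direction=1.0, dt=1 / 90)
for _ in range(90):
    step(crate, dt=1 / 90)
print(f"after gentle push: {crate.position:.3f} m")

apply_push_gesture(crate, hand_speed=3.0, direction=1.0, dt=1 / 90)
for _ in range(90):
    step(crate, dt=1 / 90)
print(f"after hard push:   {crate.position:.3f} m")

The takeaway is in the last two calls: the same gesture logic produces either a tiny nudge or a long throw depending purely on how fast the hand was moving.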

“We wanted to try and do this without any device, just using your hands, and also do it with gestures in a way that’s more playful,” said Doug Bowman, the Frank J. Maher Professor of Computer Science and director of the Center for Human-Computer Interaction.

To test the technique, the team used an Oculus Rift CV1 for display and a Leap Motion controller for hand tracking. The virtual environment was built in the Unity game engine, and Unity’s native physics engine drove the physics-based simulation behind the Force Push interface.
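For readers wondering how those pieces fit together each frame, here is a rough, language-agnostic sketch of that pipeline. The real project is built in Unity with Leap Motion hand tracking; the tracker class and function names below are invented stand-ins for illustration, not anything from the team’s code or the Leap Motion SDK.

# Illustrative per-frame pipeline (not the team's Unity project): read tracked
# hand data, detect a push gesture, and hand the result to the physics step.

class FakeHandTracker:
    """Stand-in for a real hand tracker (the researchers used a Leap Motion)."""
    def poll(self):
        # A real device would report palm position and velocity every frame.
        return {"velocity": (0.0, 0.0, 1.5)}   # meters per second, pushing forward

def detect_push(sample, threshold=0.5):
    vx, vy, vz = sample["velocity"]
    speed = (vx * vx + vy * vy + vz * vz) ** 0.5
    if speed <= threshold:
        return None   # hand is roughly still: no gesture this frame
    return speed, (vx / speed, vy / speed, vz / speed)

def frame(tracker, apply_force):
    gesture = detect_push(tracker.poll())
    if gesture:
        speed, direction = gesture
        # In the real system, Unity's native physics engine takes over here.
        apply_force(speed, direction)

frame(FakeHandTracker(), lambda s, d: print(f"push: speed={s:.2f} m/s, dir={d}"))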

“Every week we kind of tweak something different in order to make the experience feel right,” adds Bowman. “But now it feels really cool.”

No kidding. You had me at “Jedi.”

Image Credit: Virginia Tech

About the Scout

Alice Bonasio

Alice Bonasio runs the Tech Trends blog and contributes to Ars Technica, Quartz, Newsweek, The Next Web, and others. She is also writing VRgins, a book about sex and relationships in the virtual age. She lives in the UK.
