
What Apple’s Dual Camera iPhone Means For Augmented Reality

Is this a hint of Apple’s future plans for VR and AR?

The rumored iPhone 7 Plus with a dual-lens camera system finally made its debut at Apple’s September product presentation Wednesday. With Phil Schiller taking the stage to unveil the latest iPhone camera update, it was clear that Apple is serious about its “best camera we have ever made.”

Although Apple didn’t make drastic changes to the iPhone’s physical design (minus that old thing called a headphone jack), its unveiling of a dual-lens camera points to a world where millions of consumers could one day carry a powerful augmented reality device in their pocket.


It’s no secret that Apple has been exploring the virtual reality (VR) and augmented reality (AR) space for years, an exploration that has included everything from headset patents to industry hires and key acquisitions. The most notable of Apple’s recent purchases was its 2015 acquisition of LinX, an Israeli firm specializing in squeezing high-definition images out of relatively small multi-lens configurations.

It’s likely that the dual camera system on the iPhone 7 Plus is largely based on LinX’s advanced camera technology, which is capable not only of delivering superior camera performance but also of intelligently sensing depth, ideal for 3D scanning or augmented reality app experiences.


A phone camera that can accurately gauge depth, and in turn create depth maps (like the early LinX example below, generated from a single exposure of a person’s face), may be the missing link Apple needed to open the iPhone’s lens up to a world in 3D, one where computer-generated images can be overlaid on top of real-world objects.

An early LinX depth map generated from a single exposure.
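For the curious, the principle behind dual-lens depth sensing is stereo triangulation: the same point lands at slightly different horizontal positions in each lens’s image, and that shift (the disparity) is inversely proportional to distance. Here’s a minimal sketch of the math; the focal length, baseline, and disparity numbers are illustrative stand-ins, not Apple’s actual hardware specs.

```swift
import Foundation

/// Estimates the distance to a point seen by both lenses of a stereo pair:
/// depth = (focalLength * baseline) / disparity
/// - focalLengthPixels: lens focal length expressed in pixels
/// - baselineMeters: physical separation between the two lenses
/// - disparityPixels: horizontal shift of the same feature between the two images
func stereoDepth(focalLengthPixels: Double,
                 baselineMeters: Double,
                 disparityPixels: Double) -> Double? {
    // Zero disparity means the point is effectively at infinity.
    guard disparityPixels > 0 else { return nil }
    return (focalLengthPixels * baselineMeters) / disparityPixels
}

// Illustrative numbers only: a ~1 cm lens separation and a feature
// shifted 12 pixels between the wide and telephoto images.
if let meters = stereoDepth(focalLengthPixels: 3000,
                            baselineMeters: 0.01,
                            disparityPixels: 12) {
    print(String(format: "Estimated distance: %.2f m", meters)) // ~2.50 m
}
```

Notice how the depth estimate degrades as disparity shrinks: distant objects shift by only a pixel or two, which is why a wider lens baseline makes for better long-range depth sensing.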

With a huge iOS developer base all looking to create the next Pokémon Go, you can expect developers to take advantage of the camera hardware update when it becomes available and begin building augmented reality apps that are actually useful in daily life.

So what kinds of apps could we expect to see from developers utilizing depth-sensing iPhone technology?

The ability to judge the distance of objects in the real world is something we humans take for granted. You and I know how far away the wall in front of us is, but a computer, or in this case your iPhone, has a harder time sensing depth. That’s why opening up a world of depth could help in creating applications for visually impaired people, guiding them around obstacles in a room with audio feedback about the location of nearby objects.
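A rough sketch of how such an app might work, assuming the phone exposes a per-pixel depth map (the `depthMap` grid here is a hypothetical stand-in for that data): scan the frame for the nearest obstacle and speak a warning using iOS’s built-in speech synthesizer.

```swift
import AVFoundation

let speech = AVSpeechSynthesizer()

/// Finds the nearest point in a depth map and announces it if it's close.
/// `depthMap` is a hypothetical row-major grid of distances in meters,
/// standing in for whatever depth data the dual camera ultimately exposes.
func warnAboutObstacles(in depthMap: [[Double]], threshold: Double = 1.5) {
    // Nearest valid (positive) distance anywhere in the frame.
    let nearest = depthMap.flatMap { $0 }.filter { $0 > 0 }.min()

    guard let distance = nearest, distance < threshold else { return }

    let message = String(format: "Obstacle %.1f meters ahead", distance)
    speech.speak(AVSpeechUtterance(string: message))
}

// Example frame: a 3-by-4 grid with one object about 1.2 m away.
let frame: [[Double]] = [
    [4.0, 4.1, 3.9, 4.2],
    [3.8, 1.2, 1.3, 3.7],
    [3.5, 3.6, 3.4, 3.3],
]
warnAboutObstacles(in: frame) // speaks "Obstacle 1.2 meters ahead"
```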

Then there is the potential of mapping a room with your iPhone’s camera, letting you take more precise measurements of the space you’re standing in. Imagine shopping for your next sofa on the iPhone and being able to see whether or not it will fit in your living room, overlaying a 3D model of the furniture piece on a real-world view of your space.
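Once the camera can measure a space, the fit check itself is simple geometry. Here’s a sketch, with hypothetical dimensions, of the comparison an app like that might run under the hood:

```swift
/// A simple footprint in meters, as a room-scanning app might measure it.
struct Footprint {
    let width: Double
    let depth: Double
}

/// Would this piece of furniture fit in the measured space?
/// Also tries rotating the piece 90 degrees for the other orientation.
func fits(_ item: Footprint, in space: Footprint) -> Bool {
    let asIs = item.width <= space.width && item.depth <= space.depth
    let rotated = item.depth <= space.width && item.width <= space.depth
    return asIs || rotated
}

// Hypothetical numbers: a 2.2 m sofa against a nook the camera
// measured at 2.0 m wide by 2.5 m deep.
let sofa = Footprint(width: 2.2, depth: 0.9)
let nook = Footprint(width: 2.0, depth: 2.5)
print(fits(sofa, in: nook)) // true, but only in the rotated orientation
```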

Of course, there are also the entertainment and photo app implications of depth mapping, where Pokémon hide behind cars that you have to walk around to catch, or Snapchat filters and video capture bring friends to life in 3D.

But one other potential capability of the dual-camera technology is gesture recognition. With a camera system that can already sense objects and depth in front of you, why not track the hands in front of you, too? Gesture recognition plus depth sensing gets us that much closer to an augmented reality world like that of Microsoft’s HoloLens, where a finger tap lets you interact with virtual objects. Gesture recognition could also be used for virtual reality experiences: snap your iPhone into a VR headset like the Samsung Gear VR and you could track your hands in front of you, maybe even picking up objects in a 3D immersive world.
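As a sketch of what that hand tracking could feed into, here is the kind of pinch test a HoloLens-style air tap might reduce to, assuming hypothetical fingertip positions supplied by depth-based hand tracking:

```swift
import simd

/// A fingertip position in camera space (meters), as hypothetical
/// hand tracking built on the depth camera might report it.
typealias Fingertip = SIMD3<Float>

/// Detects a pinch: thumb and index tips closing within a couple of centimeters.
/// This is the sort of primitive an air-tap gesture could be built on.
func isPinching(thumb: Fingertip, index: Fingertip,
                threshold: Float = 0.02) -> Bool {
    simd_distance(thumb, index) < threshold
}

// Fingertips about 1.5 cm apart, roughly half a meter from the camera.
let thumb = Fingertip(0.10, 0.02, 0.50)
let index = Fingertip(0.10, 0.035, 0.50)
print(isPinching(thumb: thumb, index: index)) // true
```

In practice a gesture recognizer would track these positions over many frames and debounce the transitions, but the core signal is just distances in a depth map.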

iPhone 7 Plus depth sensing demo during Wednesday’s Apple keynote presentation.

Virtual and augmented reality are two related but different takes on head-mounted or smartphone display technology. With virtual reality, a headset fully immerses you in a 3D environment. Augmented reality, on the other hand, overlays computer-generated images on top of real-world objects, something the world is already familiar with from the AR-lite experience of hunting Pokémon.

Even though the iPhone’s dual camera may not ship with advanced features like 3D depth mapping yet, dual camera adoption will expand, driving future app development and technology improvement. The logical evolution of the device in your pocket is a tool that augments the important tasks in your life, bringing you one step closer to a cyborg future.

About the Scout

Jonathan Nafarrete

Jonathan Nafarrete is the co-founder of VRScout.
