Future iPhones could have improved AR thanks to rear-facing 3D sensors
http://www.techradar.com/news/future-iphones-could-have-improved-ar-thanks-to-rear-facing-3d-sensors
Apple is hard at work designing rear-facing 3D sensors for future iPhones, which could mean great things for Apple’s ambitions in augmented reality, according to a new report from Bloomberg.

The limitations of current AR technology as we see it on smartphones are clear. As I discovered when I tried to use Amazon’s new AR tool that lets you “place” items in your house before you buy them, the traditional camera-based approach does a good job of identifying straight lines for floors and walls but gets stumped when an object like a chair or a bed gets in the way.
Rear-facing 3D sensors could help with that, as they’d deliver an experience more akin to what we find on dedicated devices like the Microsoft HoloLens. As things stand, Apple’s ARKit (and Google’s ARCore) does a great job of easing the pathway for augmented reality development, but a depth sensor on the back of the phone would allow development to jump miles ahead.
The building blocks are already there, of course, as the current TrueDepth sensor on the iPhone X works by spraying its target — your face, usually — with 30,000 infrared laser dots in order to make an instant 3D image. Expand that kind of power to a rear-facing sensor, and we’re looking at technology that could potentially map out an entire room.
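To make the idea concrete: structured-light systems like TrueDepth recover depth by triangulation — the projected dot pattern shifts sideways in the camera's view by an amount that depends on distance. Here's a minimal sketch of that relationship; the baseline, focal length, and disparity values are hypothetical illustrations, not Apple's actual hardware parameters.

```python
def depth_from_disparity(baseline_m: float, focal_px: float, disparity_px: float) -> float:
    """Classic triangulation used by structured-light depth sensors:
    depth = (projector-camera baseline) * (focal length) / (observed dot shift).
    Closer surfaces shift the dots more, so a larger disparity means less depth."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_m * focal_px / disparity_px


# Hypothetical numbers: a 5 cm baseline, 600 px focal length, and a dot
# displaced by 60 px corresponds to a surface half a metre away.
print(depth_from_disparity(0.05, 600, 60))  # 0.5
```

The key limitation this sketch exposes is range: depth precision falls off quickly as disparity shrinks toward a pixel or less, which is part of why a face-distance sensor doesn't trivially scale to mapping a whole room.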
The rear-facing sensors wouldn’t quite work like that, though. Instead of projecting a dot pattern, Apple plans to use “time of flight” technology, which calculates how long lasers take to bounce back off the objects they encounter as they shoot out of the phone. Numerous companies, including Sony and Infineon, already make these kinds of sensors, and Apple is reportedly already in talks over partnerships.
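The time-of-flight principle itself is simple physics: light travels at a known speed, so half the round-trip time multiplied by that speed gives the distance to the object. A minimal illustration (the nanosecond timing value is a made-up example, not a measured figure from any sensor):

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second, in a vacuum


def distance_from_round_trip(round_trip_s: float) -> float:
    """Time-of-flight ranging: the pulse travels out and back,
    so the one-way distance is half the round trip times light speed."""
    return SPEED_OF_LIGHT * round_trip_s / 2.0


# A pulse returning after roughly 20 nanoseconds implies an object
# about 3 metres away -- comfortably room-scale.
print(round(distance_from_round_trip(20e-9), 2))  # 3.0
```

The nanosecond-scale timings this requires are why time-of-flight needs dedicated sensor hardware of the kind Sony and Infineon build, rather than an ordinary camera.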