Apple has applied for a patent to do exactly that, according to a post on Patently Apple. The patent also includes details that point to Google Goggles-style visual search functionality. Just as people can use Google's mobile visual search feature to look up recognizable objects and barcodes, the iPhone's camera could one day pull up relevant local data about an intersection or landmark. For example, if you're navigating the streets of a new city and hold your phone up in front of you, the screen would display a live video feed of what's in front of you, overlaid with information about what's nearby as well as your next move toward your destination.
Presumably the feature would also work on iPads, whose bigger screens might be even better suited to displaying several graphical visualizations of data about nearby places and objects.
Of course, there are already turn-by-turn GPS apps for iOS, Android and other operating systems, but an augmented reality-based navigation system that's native to the phone would be pretty novel. And while other vendors may come out with apps offering similar functionality, having this feature baked into such a widely used device by default could push this futuristic-sounding successor to the map and compass to near-mainstream status as quickly as a new version of the iPhone can sell.
Image courtesy of Patently Apple.