The patent suggests that the system would rely on GPS, Wi-Fi, and Cell ID, along with motion sensors, to transmit information about the live feed to a network service, which in turn could send back 3D imagery of the recognized objects for the user to navigate in real time. Apple was also quoted as saying:
“For example, gyroscopes, magnetometers and other motion sensors can provide angular displacements, angular rates and magnetic readings with respect to a reference coordinate frame, and that data can be used by a real-time onboard rendering engine to generate 3D imagery of downtown San Francisco. If the user physically moves device, resulting in a change of the video camera view, the information layer and computer-generated imagery can be updated accordingly using the sensor data.”
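To make the mechanism in that quote concrete, here is a minimal, purely illustrative Python sketch of how angular-rate data from a gyroscope could keep an information layer aligned with the camera view. All function names and parameters here are hypothetical; the patent does not specify any API.

```python
# Hypothetical sketch: integrate a gyroscope's yaw rate over time so the
# AR information layer tracks the camera heading, then place a landmark
# label on screen based on its compass bearing. Names are illustrative.

def update_heading(heading_deg, yaw_rate_deg_s, dt_s):
    """Dead-reckon the camera heading from one angular-rate sample."""
    return (heading_deg + yaw_rate_deg_s * dt_s) % 360.0

def overlay_offset(heading_deg, landmark_bearing_deg, fov_deg=60.0):
    """Horizontal screen position (0..1) of a landmark label,
    or None if the landmark is outside the camera's field of view."""
    # Signed angular difference, wrapped to the range [-180, 180).
    delta = (landmark_bearing_deg - heading_deg + 180.0) % 360.0 - 180.0
    if abs(delta) > fov_deg / 2:
        return None  # landmark not visible; hide its label
    return 0.5 + delta / fov_deg

# Example: the user pans the device 30 deg/s to the right for one second,
# sampling the gyroscope at 10 Hz.
heading = 90.0  # initially facing east
for _ in range(10):
    heading = update_heading(heading, 30.0, 0.1)
print(round(heading, 1))             # 120.0
print(overlay_offset(120.0, 135.0))  # 0.75 (label right of center)
```

In a real system this integration would be fused with magnetometer readings to correct gyroscope drift, which is presumably why the patent lists both sensor types.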
The patent further suggests that the user could drop a "pushpin" on a building in the distance, and the software would then provide directions on how to get there. It would be pretty cool if, through the use of augmented reality, the software could tell you to cut through a building (if you're walking) to reach your destination quicker.
It sounds like a pretty cool piece of technology, although we don't think we'll be seeing it in the next iPad; the iPad 4, perhaps?
source: Ubergizmo