
When Apple announced iOS 15 almost four years ago, one of the features added to Apple Maps was immersive walking directions using augmented reality. It’s a neat feature: after you start walking directions in the Maps app, you can just hold up your iPhone, and once the camera recognizes the buildings around you, the screen switches to a live view of the outside world with the directions superimposed on top. That makes it crystal clear which way you need to walk, and it is much better than looking down at a 2D map and trying to figure out which way you are currently facing.
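Apple doesn’t publish the details of how Maps matches what the camera sees against its map data. For what it’s worth, ARKit does expose a related capability, called geo-tracking, to third-party apps, and it is limited to supported locations in much the same way. Purely as a hypothetical illustration (the coordinate below is just a point in downtown New Orleans that I picked for the example), a developer could ask whether geo-tracked AR is available at a given spot like this:

```swift
import ARKit
import CoreLocation

// Hypothetical coordinate in downtown New Orleans, chosen only for illustration.
let spot = CLLocationCoordinate2D(latitude: 29.9511, longitude: -90.0715)

// Ask ARKit whether geo-tracked AR (localizing the camera against Apple's
// map data) is available at that coordinate.
ARGeoTrackingConfiguration.checkAvailability(at: spot) { isAvailable, error in
    if isAvailable {
        print("Geo-tracked AR is available at this location.")
    } else {
        print("Geo-tracked AR is not available: \(error?.localizedDescription ?? "unsupported location")")
    }
}
```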
When this feature first launched, only a few cities were supported: London, Los Angeles, New York, Philadelphia, San Diego, San Francisco, and Washington, D.C. I never had both a reason and an opportunity to try the feature in any of those cities, so I hadn’t used it. But as I noted this past Friday, Apple recently enhanced its Apple Maps data in New Orleans, which is where I live, so I have now had a chance to try out the feature. It works really well.
This past weekend, I was with my kids at the Audubon Aquarium and Insectarium, which is right next to the Mississippi River. When we were done, my daughter requested ice cream, so I used Apple Maps to find a store that was close and started walking directions. Then I held up my iPhone, and I could see exactly which way to walk thanks to large arrows and street names superimposed on the world around me:

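A brief aside for the developer-minded: the hand-off I did by hand (pick a destination in Maps, then start walking directions) is something an app can also trigger programmatically through MapKit. Here is a minimal sketch, using a made-up coordinate to stand in for the ice cream shop:

```swift
import MapKit

// Made-up coordinate standing in for the ice cream shop; not the real address.
let shop = CLLocationCoordinate2D(latitude: 29.9510, longitude: -90.0650)

// Wrap the coordinate in a map item and hand it off to the Maps app,
// requesting walking directions to that destination.
let destination = MKMapItem(placemark: MKPlacemark(coordinate: shop))
destination.name = "Ice cream shop"
destination.openInMaps(launchOptions: [
    MKLaunchOptionsDirectionsModeKey: MKLaunchOptionsDirectionsModeWalking
])
```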
Once we got close enough that the store was just across the street, the Maps app showed me exactly where the store was located, making it unnecessary to figure out the precise street address:

And the pin remained there as we got closer:

I realize that this feature is not new to Apple Maps, and I realize that Google Maps has a similar augmented reality feature called Live View. But this was the first time that I had tried it, and it was nice to use.
While using this feature, I couldn’t help but think about how incredible it would be to have it built into a future generation of the Apple Vision Pro, one compact enough to be the size of a normal pair of glasses. Let’s call it Apple Glasses™. It would be great to look at the world around me just like I always do while also seeing relevant information, such as Apple Maps directions, superimposed on my view. I have no doubt that this is coming; it is just a question of when, and how much it will cost.