I think most of us have had this experience, especially in a big city: you step off public transit, check Google Maps to figure out which way to walk… and then somehow end up two blocks in the wrong direction.
Maybe the little blue dot hadn't caught up to your actual location yet. Maybe your phone's compass got confused and pointed you the wrong way because you were surrounded by 30-story towers of metal and other things that wreak havoc on it.
Google Maps' still-unfinished augmented reality mode is built for exactly this scenario, drawing arrows and markers over your camera's view of the street so you can be extra, super sure you're headed the right way. It compares that camera view against a huge set of Street View imagery to work out exactly where you are and which way you're facing, even when your GPS and/or compass are a bit off. The feature is currently in alpha testing, and I spent some time with it this morning.
Here's a quick look at what it looks like in action:
Google first announced AR walking directions about nine months ago at its developer conference, but has been fairly quiet about the feature since then. Much of that time, it seems, was spent refining the subtleties of the user interface. When early versions drew a specific route along the ground, testers tried to walk directly on top of the line, even when doing so wasn't necessary or safe. When the team tried using particle effects floating in the air to represent routes and turns (seen below in an early prototype), a Google UX designer says one user asked why they were "following floating garbage."
The Maps team also learned that no one wants to hold their phone up for very long. The whole experience has to be quick, designed for short bursts of use — in fact, if you hold the camera up for too long, the app will prompt you to stop.
Getting into AR mode feels just like starting any other Google Maps trip: enter a destination, tap the walking directions button… but then tap the new "Start AR" button instead of the standard "Start."
The screen shows the view from your camera, and the app asks you to point it at the street. As you do, a bunch of dots appear as the app recognizes building features and landmarks that can help it pinpoint your location. Soon — within seconds, in our tests — the dots disappear, and arrows and markers appear to show you the way. A small cutaway view at the bottom tracks your current location on the map, which does a good job of making the transition from camera mode to map mode less jarring.
Lower the phone to a more natural position — closer to parallel with the ground, the way you might hold it while reading texts as you walk — and Google Maps shifts back to the standard 2D map view. Hold the phone up again, as if you're taking a portrait photo of what's in front of you, and AR mode returns.
In our short test (about 45 minutes), the feature worked as promised. It works better in some scenarios than others, though; if you're close to the street, where the camera can get a good view of the buildings across the way, it pins down your location quickly and accurately. If you're in the middle of a wide-open plaza, it can take a few seconds longer.
Google's decision to build this as a tool for quick, few-second bursts is the right one. No one wants to navigate a city primarily through their phone's camera lens, whether that makes them an easy target for phone thieves or just means walking into light poles. I can see myself using it for the first step or two of a trip to make sure I'm starting off on the right foot, then glancing back at the standard map from time to time. It's meant to help you feel more confident, not to hold your hand the entire way.
Google didn't dive too deep into how the tech works here, but in short: the app looks at the feed from your camera and sends a compressed version to the cloud, where its unique visual features are analyzed. Google already has a rough idea of where you are from your phone's GPS signal, so it can narrow down which Street View data to compare against; it looks for things it believes should be nearby — the outlines of buildings, statues, or other permanent structures — and works backwards to a more precise location and direction. There's also a bunch of machine-learning voodoo involved to ignore things that may be prominent but aren't necessarily permanent (e.g. trees, large parked vehicles, and construction).
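To make the general idea concrete, here's a toy sketch — emphatically not Google's actual pipeline, and all the names and data here are made up — of the matching step: features detected in the camera frame are matched against a small database of landmarks near the coarse GPS fix, and the disagreement between where the compass says those landmarks are and where the database says they should be yields a heading correction.

```python
# Toy illustration of refining a compass heading by matching camera
# features against known landmarks. Descriptors here are hand-made
# 3-number vectors; real systems use learned descriptors and estimate
# full position and orientation, not just a heading offset.
STREET_VIEW_DB = {
    "cafe_sign":    {"desc": (0.9, 0.1, 0.0), "bearing_deg": 40.0},
    "church_spire": {"desc": (0.1, 0.8, 0.2), "bearing_deg": 95.0},
    "stone_arch":   {"desc": (0.0, 0.2, 0.9), "bearing_deg": 150.0},
}

def match(desc, db):
    """Nearest-neighbour match by squared Euclidean distance."""
    return min(db, key=lambda name: sum(
        (a - b) ** 2 for a, b in zip(desc, db[name]["desc"])))

def heading_correction(observations, db):
    """Estimate how far off the compass is, in degrees.

    `observations` pairs each camera-detected descriptor with the bearing
    the phone *thinks* that feature sits at. Comparing against the bearing
    the database says the matched landmark should have gives the compass
    error; we average the errors over all matches.
    """
    errors = []
    for desc, observed_bearing in observations:
        expected = db[match(desc, db)]["bearing_deg"]
        # Wrap the difference into (-180, 180] so 359 vs 1 reads as 2.
        diff = (expected - observed_bearing + 180.0) % 360.0 - 180.0
        errors.append(diff)
    return sum(errors) / len(errors)

# A compass reading ~10 degrees low makes every landmark appear 10
# degrees away from where the database expects it.
obs = [((0.88, 0.12, 0.05), 30.0),  # noisy cafe_sign, seen at 30 not 40
       ((0.12, 0.79, 0.18), 85.0)]  # noisy church_spire, 85 not 95
print(round(heading_correction(obs, STREET_VIEW_DB), 1))  # → 10.0
```

The averaging step is also where filtering out non-permanent objects matters: a matched feature on a parked truck that has since moved would contribute a wildly wrong error term, which is presumably part of what the machine-learning layer is there to suppress.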
For now, Google is testing the feature with Local Guides — an opt-in group of users who contribute reviews, photos, and place details, helping Google verify location information in exchange for early access to features like this.
Unfortunately, Google tells us it has no idea when the feature will roll out beyond that group.