In response to Apple’s Live Text, Google has introduced a new feature for its visual search tool called Places. The feature identifies monuments and displays related information directly in the camera view.
Places uses Google Earth’s 3D map assets and image recognition to identify locations. Once a user activates the Places filter, Google Lens recognizes the landmark and overlays a hotspot on it. Tapping the hotspot brings up information about the landmark.
“Google Lens is now used over three billion times per month by people around the world. We hope Lens makes rediscovering and learning about your city even more enjoyable.” – Lou Wang, Group Product Manager for Google Lens and AR.
The Places launch is a timely reminder from Google that it holds a significant lead over Apple not only in visual and virtual search but also in visual positioning services.
Google used landmarks in London to demo Places. London is also one of a handful of cities that will support Apple Maps augmented reality navigation and Location Anchors at launch.
But being first does not always guarantee success. Google introduced the Measure app for measuring objects in augmented reality before Apple launched its own app of the same name, yet Google is now shutting its version down, while Apple’s has improved with the arrival of LiDAR on iPhone Pro and iPad Pro models.