At Google’s annual developer conference, held this week near its Mountain View headquarters, the company showed off some of the best practical applications of AI and machine learning I’ve seen yet.
Here’s what stood out for me.
1. Google Lens
The app uses image recognition to identify objects appearing in your camera's view in real time. It means you can point a smartphone at a flower and be told exactly what it is. Google didn't have a date for when Google Lens would be available, but it did say it would be part of its Assistant and Photos apps at first – though it seems to me the most useful way of offering it would be to integrate it straight into the camera app.
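To give a flavour of the flow being described – camera frame in, confident label out – here's a toy sketch. The classifier is a stub with invented labels; a real system like Lens would run a trained vision model on the frame.

```python
def classify(image_pixels):
    """Stand-in for an image-recognition model.
    Returns a list of (label, confidence) pairs; values are invented."""
    return [("oxeye daisy", 0.91), ("chamomile", 0.06), ("unknown", 0.03)]

def identify(image_pixels, threshold=0.5):
    """Surface the top label only if the model is confident enough."""
    predictions = classify(image_pixels)
    label, confidence = max(predictions, key=lambda p: p[1])
    return label if confidence >= threshold else None

print(identify(object()))  # -> oxeye daisy (with the stubbed model)
```

The confidence threshold matters in a product like this: showing a wrong flower name is worse than showing nothing.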
2. A standalone Daydream headset
Google announced its Daydream virtual reality (VR) platform here last year, along with a nice-looking (but, for me at least, uncomfortable) headset into which you could slip a smartphone to create a budget VR experience. There were a couple of big announcements on the Daydream front. First, the new Samsung Galaxy devices will work with Daydream – an interesting development, because until now Samsung devices worked only with Gear VR, an alternative headset powered by Facebook-owned Oculus. Second, Google announced standalone Daydream headsets that won't need a smartphone slotted in at all, with the first devices coming from HTC and Lenovo.
3. Very clever photo tools
Google Photos now has 500 million users, its secret sauce being the use of machine learning to sort through your pictures and understand what they contain – such as spotting a birthday cake and grouping pictures from the same day as a “birthday party”. The next step is to help you share those pictures more easily: during the keynote, Google noted that people often take a lot of pictures but then never do anything with them.
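The grouping idea can be sketched in a few lines: photos from the same day that carry an "event" label get filed together. The labels below are hand-written for illustration; in the real product they come from Google's machine-learning models.

```python
from collections import defaultdict
from datetime import date

# Hypothetical photo library with pre-computed labels.
photos = [
    {"file": "img_001.jpg", "taken": date(2017, 5, 13), "labels": {"birthday cake", "people"}},
    {"file": "img_002.jpg", "taken": date(2017, 5, 13), "labels": {"people", "balloons"}},
    {"file": "img_003.jpg", "taken": date(2017, 5, 14), "labels": {"dog"}},
]

def group_events(photos, trigger="birthday cake", album="birthday party"):
    """If any photo on a given day carries the trigger label,
    file every photo from that day under the album name."""
    by_day = defaultdict(list)
    for p in photos:
        by_day[p["taken"]].append(p)
    albums = defaultdict(list)
    for day, day_photos in by_day.items():
        name = album if any(trigger in p["labels"] for p in day_photos) else str(day)
        albums[name].extend(p["file"] for p in day_photos)
    return dict(albums)

print(group_events(photos))
# {'birthday party': ['img_001.jpg', 'img_002.jpg'], '2017-05-14': ['img_003.jpg']}
```

Note that the balloon photo lands in the birthday album too – the cake in a sibling photo is enough to tag the whole day, which matches the "same day" grouping described above.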
4. VPS – visual positioning system
Most of us are familiar with GPS – the global positioning system – but that technology can only get you so far. Though terrific for navigating large areas outdoors, GPS has real limitations when you need something more accurate, particularly indoors. Google thinks VPS – visual positioning system – is the way to fill that gap. Using Tango, its 3D-sensing smartphone technology, VPS looks for recognisable objects around you to work out where you are, to an accuracy of a few centimetres.
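The geometry behind this is intuitive, even if Google's actual algorithm is far more sophisticated: if the camera recognises objects whose map positions are known, each sighting implies a camera position, and averaging several damps the noise. A minimal sketch, with an invented landmark map:

```python
# Hypothetical map of recognisable objects -> (x, y) position in metres.
LANDMARK_MAP = {
    "exit sign": (10.0, 2.0),
    "drill display": (4.0, 7.5),
    "paint mixer": (12.0, 9.0),
}

def estimate_position(sightings):
    """sightings: list of (object_name, dx, dy), where (dx, dy) is the
    object's position relative to the camera. Each sighting implies a
    camera position (object minus offset); return their mean."""
    implied = []
    for name, dx, dy in sightings:
        ox, oy = LANDMARK_MAP[name]
        implied.append((ox - dx, oy - dy))
    n = len(implied)
    return (sum(x for x, _ in implied) / n, sum(y for _, y in implied) / n)

pos = estimate_position([
    ("exit sign", 5.0, -1.0),      # implies camera at (5.0, 3.0)
    ("drill display", -1.1, 4.4),  # implies camera at (5.1, 3.1)
])
print(pos)  # -> roughly (5.05, 3.05)
```

The more distinct objects the system can match against its map, the tighter the estimate – which is why VPS suits cluttered indoor spaces full of recognisable things.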
5. A better Google Home (and Assistant on the iPhone)
Google Home, the company's standalone smart speaker, has made a modest start but still lags behind Amazon's Echo. Google announced a few new features designed to close that gap. First is calling – you can now make phone calls using the Home, and its voice recognition makes it possible for different family members to call from their own separate numbers through the same device.
The device will also now offer proactive information, rather than just answers to questions you have asked. The example given on stage was a warning about heavy traffic – by referencing Google Calendar the assistant was able to know that the user needed to be somewhere at a certain time, and that traffic on the way was heavy.
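The logic behind that traffic example is straightforward to sketch: combine a calendar event with a traffic-aware travel estimate, and speak up when the user needs to leave soon. The function names, times, and buffer values below are all invented for illustration.

```python
from datetime import datetime, timedelta

def leave_by(event_start, usual_travel_min, traffic_delay_min, buffer_min=5):
    """Latest departure time given current traffic and a safety buffer."""
    travel = timedelta(minutes=usual_travel_min + traffic_delay_min + buffer_min)
    return event_start - travel

def proactive_alert(now, event_start, usual_travel_min, traffic_delay_min):
    """Return a warning string if traffic is heavy and departure is near."""
    departure = leave_by(event_start, usual_travel_min, traffic_delay_min)
    if traffic_delay_min > 0 and now >= departure - timedelta(minutes=15):
        return (f"Heavy traffic: leave by {departure:%H:%M} "
                f"to make your {event_start:%H:%M} appointment.")
    return None  # nothing worth interrupting the user for

now = datetime(2017, 5, 18, 8, 10)
meeting = datetime(2017, 5, 18, 9, 0)
print(proactive_alert(now, meeting, usual_travel_min=20, traffic_delay_min=15))
# -> Heavy traffic: leave by 08:20 to make your 09:00 appointment.
```

The interesting product decision is the `return None` branch: a proactive assistant has to stay silent most of the time, or the warnings become noise.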
Source: http://www.bbc.com/news/technology-39958028