Google I/O highlights


During a non-stop, two-hour keynote address at its annual I/O developers conference, Google unveiled a barrage of new products and updates. Here’s a rundown of the most important things discussed:

Google Lens

Google CEO Sundar Pichai kicked off the keynote by unveiling a new computer-vision system coming soon to Google Assistant. As Pichai explained, you’ll be able to point your phone’s camera at something, and the phone will understand what it’s seeing. Pichai gave examples of the system recognizing a flower, a series of restaurants on a street in New York (automatically pulling in their ratings and information from Google), and the network name and password printed on the back of a wifi router—with the phone then automatically connecting to the network. Theoretically, in the future, you’ll search the world not through text or your voice, but by pointing your camera at things.

With Google Lens, your smartphone camera won’t just see what you see, but will also understand what you see to help you take action. #io17 pic.twitter.com/viOmWFjqk1

— Google (@Google) May 17, 2017

New AI hardware

Google announced a second generation of its specialty AI hardware, called tensor processing units (TPUs). While the first generation of TPUs was only for internal company use, the technology will now be available on Google’s burgeoning cloud service business. The move allows Google to offer a faster cloud service to companies, but what it really shows is how important the company thinks the cloud business is to its future. Google is betting that its AI tech can set the company’s cloud offerings apart from competitors’—and the company wants to ensure it won’t need to rely on third-party hardware vendors to keep its AI advancing.

Google Assistant updates

Through Google Lens, Google’s digital assistant can now understand the world as you see it. If you open it up and point the camera at the marquee at a music venue, it can recognize the band name, play their music, and even book you tickets to the concert.

Now available on 100M+ devices, your #GoogleAssistant is your own personal Google, always ready to help. #io17 pic.twitter.com/MwlKhtgL6M

— Google (@Google) May 17, 2017

Assistant is now available on the iPhone as of today (it had previously only been available on certain Android phones and the Google Home). Google said that it’s also launching an Assistant developer kit, so that third parties will be able to build devices that use Assistant for voice interaction. This coming holiday season, expect to see advertisements for all sorts of speakers, phones, and internet-of-things devices with Assistant built in.

Google is also rolling out Assistant in a range of new languages, including German and Korean, and will soon give developers the ability to build purchasing tools for the system. Google showed off an example: it placed an order for a Panera salad through Assistant, nearly as easily as one might when speaking to a real person at the restaurant, even asking for (and getting confirmation of) substitutions.

Google Home gets homier

At last year’s I/O, Google unveiled Home, a smart-home speaker and Google’s answer to the Amazon Echo. At this year’s event, it learned a few new tricks. Google announced that Home will soon be able to provide users with proactive notifications—for example, if you have your work address saved in Google Maps, it’ll notify you if you need to leave earlier than usual for your commute. The Google Home will light up when it has something to tell you, and you’ll just have to ask what it wants to say.

Coming soon to #GoogleHome: hands-free calling. Call businesses, friends and family in the US and Canada, even if you can’t reach your phone. #io17 pic.twitter.com/hvG6wtS9qf

— Google (@Google) May 17, 2017

The Home will also soon be able to make calls to any phone in the US and Canada, about a week after Amazon announced that its Echo speakers can call anyone else a user knows who also has an Echo. But Google takes Amazon’s effort further by allowing calls to any phone, and by using contextual understanding to know who’s talking to it. For example, if you ask a Home to call your mother, it will call her, but if your spouse asks the same thing, Home will know to call theirs instead.

You’ll also soon be able to view content you’ve asked for on a Home on other devices. For example, if you ask Home for directions to your next meeting, it’ll automatically send directions from Google Maps to your phone. You’ll also be able to view information you ask Home for on any screen connected to a Chromecast streaming device.
