Google Home Max, Nest Home Automation, Machine Learning: Can Anyone Beat Google?

The theme of Google's big press event last week was: Stuff just happens. No downloads. No "skills." New Google Home Max, potential Sonos buster, listens and learns for "smarter sound," and Nest home automation magically joins the ecosystem.
Published: October 10, 2017

Google launched a bunch of new products and features last week, including the Google Home Mini ($50), Google Home Max ($399), Pixel 2 phone, Pixel Buds earbuds, and ongoing machine learning that makes everything smarter, from voice recognition to image searches to home automation.

The big one for our purposes is Max, a direct Sonos competitor. Google promises superior hardware vis-à-vis Sonos (“We’re obsessed with bass”) – an argument we might challenge. It also promises smarter smarts than Sonos, which is hard to argue with.

In fact, when it came to intelligent devices and ecosystems, Google spread the jabs evenly across Sonos, Amazon Alexa, and all things Apple during its press event. It made a compelling case, again and again and again.

In one video, a user calls out, “OK Google, play the song that goes like this: [hum, hum, la, la, feigned instrumentals].”

In another (from the Sonos presentation, no less), a user requests, “OK Google, play me the Johnny Cash album with the black cover.”

Learning to Listen: Google’s Smarter Sound

The timing for the big Google press event couldn’t have been better: It took place right after the big Sonos event, where the speaker manufacturer introduced Sonos One, featuring six microphones, Amazon Alexa and (in the future) Google Assistant built in.

Google managed some pretty good digs against Sonos, not to mention Alexa and Apple, for good measure.

“We can do with two microphones what others need six or eight mics to do,” said Rick Osterloh, SVP Hardware at Google.

Just 30 minutes earlier, Sonos had boasted that its new Sonos One speaker had six microphones built in.

Related: Sonos One with Alexa and Google Assistant, Plus ‘More Open’ API

Osterloh explained that other manufacturers needed all those mics because they couldn’t learn on the fly the way Google can. Google continuously improves its listening capabilities and calibrates the system to improve speech recognition and other performance metrics.

Well, maybe Sonos wants all those microphones for room EQ. Sonos introduced its Trueplay speaker-tuning software two years ago, and boasted during last week’s press conference that the new product would double down on the technology.

Google, though, doesn’t need all those mics for calibrating speakers, thanks to the company’s new “Smart Sound” feature, powered by Google AI.

The technology draws from “thousands of different room configurations” that Google has trained into the machine, said Rishi Chandra, VP of product management. But the device keeps learning once it lands in the home.

“This is all done dynamically,” he said, “so if you decided to move Max a few feet, it’ll compensate within seconds.”

Over time, the product will adjust to “fit your context,” he said, “lowering the volume in the morning, raising the volume when the dishwasher’s running, or just adjusting the tuning based on the type of music you’re listening to.”

Mics? We don’t need more stinkin’ mics.

More Learning, Less Hardware

The same sentiment holds true for Google’s new Pixel 2 phone, which does more with less. The Pixel camera purportedly accomplishes what the iPhone does in Portrait Mode – sharpening the face while blurring the background – with just a single lens. The iPhone requires two.

Google’s device employs “dual-pixel sensor technology” that requires “just one camera and machine learning.” (To wild applause, Google announced the feature for front-facing cameras as well.)

Google knows what a face is, and what a background is, because the search-engine giant has examined billions of images over the years.


Exploiting these images – and the way users label them or doctor them or share them – has plenty more implications, as Google points out.

A product manager remarked: “Have you ever asked yourself, ‘What kind of dog is that?’”

Identifying animals … or landmarks … or homes for sale is a snap, with just a snap. Advanced computer vision and context bring these new use cases to life.

That’s one reason Google Photos allows free unlimited storage for images and videos – even 4K quality – snapped and uploaded by a Pixel phone. The context is golden.

Soon, Google will be able to label all of our content for us, including video – frame by frame – via cloud-based AI tools. My thousands of tradeshow images and hours of video uploaded to Google Photos will be scanned for familiar products and people. Signage will be translated to text with Google’s OCR capabilities.

What does this have to do with home automation? Plenty.

The same technologies are applied to cameras for facial recognition (who is that person, and are they happy or mad?), as well as to listening devices for sound recognition (who’s talking, is the baby crying, does that knock belong to the postman?).

Getting to Automation

Google is getting us that much closer to the “automation” part of home automation. We don’t need to punch buttons or get up from the couch, heaven forbid. We just need to, once in a while, accept or deny various intrusions when Google suggests them.

Just as magically as Google plucks my nephews and niece out of thousands of pictures, even as they age, the company also can control my home, even as habits change, and people come and go.

As Google builds out its ecosystem, all devices learn to get along together, seemingly out of nowhere … like when you get that first alert on your new Pixel phone, urging you to leave for the airport in 10 minutes based on traffic conditions and your flight departure time. Whoa!

Suddenly, your new Nest Hello video doorbell, employing facial recognition, is broadcasting “Susie is at the front door,” through all your Google Assistant devices.


Next: I’ve Seen the Future of Home Automation and it Starts with Google Now


And, while watching TV, you simply ask Google to “Show me the entryway,” and the camera image appears on-screen thanks to Google Chromecast.

This isn’t the future. This is a use case for today, as demonstrated at the Google press conference.

Now, take your Nest Cam IQ cameras, with Google Assistant built in, and tell it to unlatch the Yale Linus lock for Susie … all without having to touch the cloud thanks to Nest’s Thread and Weave smart-home protocols.

Speaking of which: There was no mention of Thread and Weave and the required 802.15.4 radio in the new Google Home Max or Mini. If it’s not in there … what a missed opportunity to communicate locally with Nest thermostats, smoke detectors, cameras, doorbells, the Nest Secure security system and sensors, and whatever Thread/Weave products might emerge in the future.

In all of this, stuff just happens.

Rishi Chandra said it, and most of his colleagues at the press event echoed (tee hee) the sentiment: “You don’t have to teach Google new skills. It just works.”


Strategy & Planning Series