Alise Branch

Google’s New Pixel 4 Imagines a World of Hands-Free Computing

New gesture- and voice-based controls promise the dawn of “ambient computing”

In the most recent Avengers movie, Tony Stark unlocks the secret to time travel the same way he invents all of his cool toys: by talking to his computer and waving his hands around in the air. If Google has its way, that’s how you’ll use its upcoming Pixel 4 phone. Minus the time travel bit.

The device will include a radar chip that lets you control your phone with hand gestures, no touching required. The feature, which Google calls Motion Sense, is based on Project Soli, one of many experimental research efforts in Google’s Advanced Technology and Projects (ATAP) division. A small chip in the front of the phone can detect tiny hand and finger movements; early versions of Soli that Google showed off in 2015 could measure sub-millimeter motion. Imagine rotating an invisible volume dial or sliding a thumb along your finger to fast-forward a video. That’s the kind of thing Soli was designed to do, and we don’t yet know how much progress Google has made in the years since.
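To make that concrete, here’s a minimal sketch of what a Motion Sense-style gesture API could look like to an app. Google hasn’t published a Soli SDK, so every name below (RadarGestureDetector, the Gesture enum, the callback) is a hypothetical stand-in for illustration, not a real interface.

```kotlin
// Hypothetical sketch only: Google has not published a Soli/Motion Sense SDK,
// so RadarGestureDetector, Gesture, and the callback are invented names.

enum class Gesture { SWIPE_LEFT, SWIPE_RIGHT, REACH, DIAL_TURN }

// On a real device the radar driver would classify raw reflections into
// gestures; here the detector simply forwards already-classified events.
class RadarGestureDetector(private val onGesture: (Gesture) -> Unit) {
    fun deliver(gesture: Gesture) = onGesture(gesture)
}

fun main() {
    val detector = RadarGestureDetector { gesture ->
        when (gesture) {
            Gesture.SWIPE_LEFT -> println("Previous track")
            Gesture.SWIPE_RIGHT -> println("Next track")
            Gesture.REACH -> println("Wake the screen")
            Gesture.DIAL_TURN -> println("Adjust volume")
        }
    }
    // e.g., a swipe over the phone skips a song without touching the screen
    detector.deliver(Gesture.SWIPE_RIGHT)
}
```

In a real phone the interesting work would happen below this layer, where the radar signal gets classified into gestures; the app would only ever see the clean events.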

The hand gestures are reminiscent of the way we’ve seen characters manipulate holograms and control computers in movies from Iron Man to Minority Report. The new gestures in the Pixel 4 are an early attempt at this, although they’re a bit less flashy and futuristic.

We don’t yet know the full extent of these gestures, but an early promo video shows a user swiping at a music app to skip tracks. Gestures also aren’t the only “hands-free” innovation to expect in this phone. Earlier this year, Google demoed a next-generation version of its voice assistant that could handle multiple commands in a row, understand follow-up questions, and control virtually everything on the phone. The ability to hold a continued conversation or handle several tasks at once already makes the Assistant stand tall over competitors like Siri, and running the voice assistant directly on the device will only make it faster.

For example, in the middle of replying to a text message, a Googler asked “show me my photos from Yellowstone,” then followed it up with “the ones with animals.” She then tapped on a picture and said “send it to Justin.” The Assistant texted the photo to Justin. And all of this without invoking the “Hey Google” wake words for each command. The company has shown lofty product demos at I/O before that took years to materialize, if they shipped at all, like the controversial Google Duplex demo that almost perfectly replicated a human voice. Still, Google says this version of the Assistant will come out on the Pixel 4 later this year.
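That Yellowstone exchange hinges on the Assistant keeping conversational context: a follow-up like “the ones with animals” narrows the previous answer instead of starting a new search. Here’s a toy sketch of that idea; the Photo and Conversation types and the keyword matching are assumptions for illustration, not Google’s implementation.

```kotlin
// Toy sketch of follow-up-query resolution. All names and matching logic
// are illustrative assumptions, not Google's actual Assistant internals.

data class Photo(val id: Int, val location: String, val tags: Set<String>)

class Conversation(private val library: List<Photo>) {
    // The previous answer doubles as the search scope for refinements.
    private var lastResults: List<Photo> = emptyList()

    fun ask(query: String): List<Photo> {
        // Crude heuristic: phrases like "the ones..." refine the last answer.
        val isFollowUp = query.startsWith("the ones", ignoreCase = true)
        val scope = if (isFollowUp) lastResults else library
        lastResults = scope.filter { photo ->
            query.split(" ").any { word ->
                word.equals(photo.location, ignoreCase = true) ||
                    photo.tags.any { it.equals(word, ignoreCase = true) }
            }
        }
        return lastResults
    }
}

fun main() {
    val photos = listOf(
        Photo(1, "Yellowstone", setOf("landscape")),
        Photo(2, "Yellowstone", setOf("animals", "bison")),
        Photo(3, "Paris", setOf("animals", "pigeons"))
    )
    val convo = Conversation(photos)
    println(convo.ask("show me my photos from Yellowstone")) // photos 1 and 2
    println(convo.ask("the ones with animals"))              // photo 2 only
}
```

The whole trick is that `lastResults` becomes the search scope for follow-ups, which is what keeps a refinement tied to the original question instead of the whole photo library.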

And all of this extends well beyond the smartphone, of course. Today, you can access versions of Google Assistant in your car, on smart speakers, on smart displays, and even on some TVs. In my own home, we have a few Google Home devices spread out enough that there are times I’m not even sure which one I’m talking to. I just say “Hey Google, turn off the kitchen lights,” and it happens.

This all converges on a concept Google calls “ambient computing.” Instead of a phone or a smart speaker, the “computer” is your environment. Want to set a timer? Just say so. Need to pause the music? Use your hands to tell your speaker to zip it. While Motion Sense is coming out first on the Pixel 4, the company has already shown how the underlying Soli tech could be used in other devices like speakers and smartwatches. In other words, Google wants this tech everywhere.

Of course, there’s still a lot we don’t know about how this will work. Early Project Soli demos showed people using very fine, arcane gestures to control things like volume sliders, but will people prefer those to the controls they’re already used to? And how far away from the phone will these gestures work? An inch? A foot? From across the room? Notably, in one very early Soli demo, the user had to get right in front of the screen to perform a gesture, and at that point, why not just touch the screen? Samsung has attempted similar gesture controls on its phones without a radar chip, but those gestures haven’t changed how we use our devices yet.

The new version of the voice assistant raises similar questions. Can you do everything with your voice, or only most things? Even in the demo at I/O, Google’s test user still occasionally had to touch the screen. That might make it hard to send an email from across the room or respond to a text without picking up your phone. It was also a controlled demo. In real life, these things are usually a lot more finicky than they are on stage. A big part of Google’s argument is that its new Assistant will work immediately and for most things on your phone. If the lag is even a hair too long, or if you still have to pick up your phone to complete basic tasks like sending an email, the whole thing falls down.

However, if Google can pull it off, or even get reasonably close, it could change how we use our phones in a way that’s as fundamental as the shift to touchscreens. It’s easy to forget, but there was a time not very long ago when touchscreens were slow, clunky, and inaccurate. They weren’t good enough to serve as the primary means of interacting with your phone, which might explain why people held onto their button-filled BlackBerrys for so long. That changed.

Of course, we still use keyboards and mice, and we’ll continue to use touchscreens. The point of the Pixel 4’s new features isn’t necessarily to replace how you use your phone entirely, but to turn the device into something that occupies the background of your life. Sometimes it’s convenient to touch it, but other times it’s easier to talk or gesture at it, say, when you’re working at your desk, cooking, or carrying groceries.

It’s a bold vision that we’ve seen in sci-fi for decades, but it’s never really materialized. Google is claiming that you will (almost) never have to touch its new phone. After more than a decade of interacting with our phones through a touchscreen, Google is ready to throw out the playbook and try something new. The question is: are we?