
Ingenious, Radical, and Exasperating: Google Pixel 4’s Gesture Controls Could Be the Next Siri

October 14, 2019, 11:00 PM UTC

Google’s Pixel 4 smartphone, to debut at a press event on Tuesday, may not generate the same public interest as a new Apple iPhone. But it still promises what could be a truly revolutionary feature: remote gesture-based controls that actually work.

Gesture controls are pitched as a more convenient way to interact with digital devices than pressing on-screen buttons or swiping with a finger. A teaser video that Google released in July, for instance, shows a Pixel 4 user ‘swiping’ the air above a phone to skip songs on a streaming music player. Concept videos from Google have also shown virtual volume knobs ‘turned’ by rubbing a thumb and forefinger together, or imaginary ‘buttons’ pressed in thin air.

But despite their prominence in sci-fi films like Minority Report, gesture controls have a spotty history in the real world. Device makers including Microsoft and LG have previously tried letting users control devices from a distance with hand gestures, but the technology has generally been clunky. Past systems, for example, relied on a phone’s camera to track users’ gestures—something that was difficult to do in dim light.

From camera to radar

The Pixel 4’s gesture detection, by contrast, will be based on a form of miniaturized radar, known as Project Soli. We won’t know for sure until the Pixel 4 debuts, but Google’s willingness to put the feature front and center on a flagship phone suggests the switch to radar may lead to a real breakthrough in usability.

“It is a shocking departure from all the other [gesture control] strategies out there,” says Dr. Karen Panetta, dean of graduate engineering at Tufts University.

Google’s Advanced Technology and Projects (ATAP) division publicly introduced Project Soli in 2015, when swapping cameras for miniature radar was a largely untried concept. The project is led by interface specialist Ivan Poupyrev.

Four years later, the experiment has yielded a marketable product. Project Soli’s radar system is now so tiny that Google says it fits on a single chip. The radar will be located at the top of the Pixel 4’s screen—a placement that involves some aesthetic compromises—near a camera and other sensors.

Gesture control has stymied many other tech companies. In 2013, Samsung’s Galaxy S4 included a feature called Air Gesture, which received a lukewarm reaction at best and was less prominent on subsequent Galaxy models. Similarly, Microsoft pulled support in 2015 for key features of its Kinect—a device largely designed to enable gesture-based interactions for video games—citing a lack of interest among users. More recently, the LG G8 ThinQ’s Air Motion touchless controls were described by tech news site Cnet as “rough around the edges” and “annoying to use.”

All of these troubled attempts tried to derive gesture control from camera or infrared data, which presents a variety of basic challenges. Past gesture controls were often unusable in low light or depended on very specific hand positioning. And it takes complex A.I.-driven processing for a device to infer the position of a hand from a two-dimensional camera image.

According to Panetta, radar delivers much richer positioning data than images, making it less reliant on A.I. processing. “Not only is it removing that step,” she says, “It’s giving you much more accuracy, under much more variable conditions.” Radar works well in the dark, for instance, and even through clothing.

Like other A.I.-based features, such as voice control, Soli will likely improve over time. But if it’s buggy out of the gate, consumers may reject the feature, as they did previous efforts by other companies. “The cost of doing business is, it has to be absolutely perfect,” says interface designer Chad Currie of consulting firm Slide UX. “All it takes is one time for it to misinterpret [a gesture], and you’re like, I’m done with this.”

Sci-fi, meet reality

Even if the Pixel 4 and Project Soli deliver on their promise, there are big questions about whether people will actually use gesture-based controls, and in what contexts. Currie questions whether you’ll see people waving fingers over their phones on the street, saying: “There’s kind of a social cost of doing weird stuff in public. I don’t think it’s going to be an everyday alternative to the way you use your phone now.”

The Pixel 4 teaser video suggests Google may be thinking along those lines. “Notice that the phone is set up away from the user, not in their hand,” Currie points out, signaling that it may be more useful as a kind of remote control at home.

Intuitive gesture controls could be used to adjust climate settings or lighting while you’re cooking, or to control video or streaming music from the couch. Those capabilities could help Google make inroads against Amazon and Apple’s smart-home and media ecosystems.

But Soli-based gesture control may end up having its biggest impact in smaller niches. Bill Konrad, co-CEO of the digital design and consulting firm Konrad, points out that gestures may be useful in environments that are either very quiet (offices) or very loud (factory floors), settings where voice controls are impractical for remote interactions. Currie speculates gestures could also become a more fluid, intuitive input method for creative tasks like 3-D modeling or music composition, and the possibilities for gaming seem plentiful.

According to Panetta, gesture controls could also power accessibility features for the disabled. Gestures may, for instance, accommodate users with less fine motor control than a mouse or touchscreen requires.

But even assuming Soli catches on, it’s impossible to predict exactly how it will be most transformative. “Very often where technology drives a ton of value is the thing you don’t see on day one,” says Konrad. Google, for its part, refers to the Pixel 4 as the “first device” with Soli, suggesting it’s poised to integrate the tech into other devices in the future. One of Poupyrev’s many patents shows gesture-recognition built into a smartwatch.

Google declined to answer questions about its plans.

Of course, the first test will be whether Soli works in the Pixel 4, both as a functional tool and as a marketing hook that improves sales. On Tuesday, we should get a deeper look on both fronts.
