The Google Pixel’s squeeze for assistant was a button without a button

The Pixel 2 is an almost five-year-old phone, but it introduced a feature that I miss more and more with each passing year. It was called Active Edge, and it let you summon Google Assistant just by giving your phone a squeeze. In some ways, it’s an unusual idea. But it effectively gave you something sorely lacking on modern phones: a way to physically interact with the phone to just get something done.

Looking at the sides of the Pixel 2 and 2 XL, you won’t see anything to indicate that you’re holding anything special. Sure, there’s a power button and volume rocker, but otherwise, the sides are sparse. Give the phone’s bare edges a good squeeze, though, and you’ll feel a subtle vibration as an animation plays and Google Assistant pops up from the bottom of the screen, ready to start listening. You don’t have to wake the phone, long-press any physical or virtual button, or tap the screen. You squeeze and start talking.

Looking at the sides of the Pixel 2, you’d never guess it’s actually a button.
Photo by Amelia Holowaty Krales / The Verge

We’ll talk about how useful this is in a second, but I don’t want to gloss over just how cool it feels. Phones are rigid objects made of metal and plastic, and yet, the Pixel can tell when I’m applying more pressure than I do just holding it. According to an old iFixit teardown, this is made possible by a few strain gauges mounted to the inside of the phone that detect the ever-so-slight bend in the phone’s case when you squeeze it. For the record, this is a change my human nervous system is incapable of picking up on; I can’t tell that the phone is bending at all.
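
Google has never published the detection logic behind Active Edge, but the general idea is easy to sketch: sample the strain-gauge signal, watch for a spike that crosses a press threshold, and require it to relax again within a short window so an ordinary tight grip doesn’t trigger it. The Kotlin sketch below is purely hypothetical; the SqueezeDetector class, the StrainSample type, and every threshold in it are invented for illustration, not taken from Google’s actual implementation.

```kotlin
// Hypothetical squeeze detection from a strain-gauge signal.
// All names, units, and thresholds here are made up for illustration;
// this is not Google's Active Edge code.

data class StrainSample(val timestampMs: Long, val strain: Float) // strain in arbitrary sensor units

class SqueezeDetector(
    private val pressThreshold: Float = 40f,   // assumed: reading that counts as "squeezing"
    private val releaseThreshold: Float = 15f, // assumed: hysteresis so sensor noise doesn't re-trigger
    private val maxHoldMs: Long = 600          // assumed: a squeeze must release quickly; a long grip is ignored
) {
    private var pressedAtMs: Long? = null

    /** Feed samples in time order; returns true on the sample that completes a squeeze. */
    fun onSample(sample: StrainSample): Boolean {
        val startedAt = pressedAtMs
        return when {
            // Rising edge: strain crosses the press threshold.
            startedAt == null && sample.strain >= pressThreshold -> {
                pressedAtMs = sample.timestampMs
                false
            }
            // Falling edge: strain drops back below the release threshold.
            startedAt != null && sample.strain <= releaseThreshold -> {
                pressedAtMs = null
                sample.timestampMs - startedAt <= maxHoldMs // quick press-and-release counts as a squeeze
            }
            else -> false
        }
    }
}

fun main() {
    val detector = SqueezeDetector()
    val readings = listOf(
        StrainSample(0, 5f), StrainSample(50, 48f),   // grip tightens past the press threshold
        StrainSample(200, 45f), StrainSample(300, 8f) // and relaxes again within 600 ms
    )
    readings.forEach { if (detector.onSample(it)) println("Squeeze detected -> launch Assistant") }
}
```

On a real phone, this kind of detection presumably runs in low-power sensor firmware so it keeps working with the screen off, and the thresholds would be calibrated per device; Active Edge’s sensitivity slider likely adjusts something along the lines of that press threshold.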

Whether you found Active Edge useful probably came down to whether you liked using Google Assistant, as illustrated by this Reddit thread. Personally, the only time I ever really used a voice assistant on a daily basis was when I had the Pixel 2, because it was literally right at hand. The thing that made it so convenient was that the squeeze basically always worked. Even if you were in an app that hid the navigation buttons or your phone’s screen was completely off, Active Edge still did its job.

While that made it extremely useful for looking up fun facts or doing quick calculations and conversions, I’d argue Active Edge could’ve been far more valuable had you been able to remap it. I enjoyed having the assistant a squeeze away, but if I’d been able to turn on my flashlight instead, I would’ve had instant access to one of my phone’s most essential features no matter what I was doing.

This version of the feature actually existed. HTC’s U11, which came out a few months before the Pixel 2, had a similar but more customizable feature called Edge Sense. The two companies worked together on the Pixel and Pixel 2, which explains how the idea ended up on Google’s devices. That same year, Google announced a deal to bring a large part of HTC’s smartphone team in-house.

Active Edge was not Google’s first attempt at providing an alternative to using the touchscreen or physical buttons to control your phone, either. A few years before the Pixel 2, Motorola was letting you open the camera by twisting your phone and turn on the flashlight with a karate chop — not unlike how you shuffled music on a 2008 iPod Nano. The camera shortcut came about during the relatively short amount of time that Google owned Motorola.

As time went on, though, phone manufacturers moved further away from letting you reach a few essential features with a physical action. Take my daily driver, an iPhone 12 Mini, for instance. To launch Siri, I have to press and hold the power button, which has become burdened with responsibilities since Apple got rid of the home button. To turn on the flashlight, something I do multiple times a day, I have to wake up the screen and tap and hold the flashlight icon in the bottom-left corner of the lock screen. The camera is slightly more convenient, accessible with a left swipe on the lock screen, but the screen still has to be on for that to work. And if I’m actually using the phone, the easiest way to reach the flashlight or camera is through Control Center, which involves swiping down from the top-right corner and picking out one specific icon from a grid.

In other words, if I look up from my phone and notice my cat doing something cute, he may very well have stopped by the time I actually get the camera open. It’s not that it’s difficult to launch the camera or turn on the flashlight — it’s just that it could be so much more convenient if there were a dedicated button or squeeze gesture. Apple even briefly acknowledged this when it made a battery case for the iPhone that had a button to launch the camera. A few seconds saved here or there add up over the lifetime of a phone.

Just to prove the point, here’s how fast launching the camera is on my iPhone versus the Samsung Galaxy S22, where you can double-click the power button to launch the camera:


There’s less thinking involved when you can just press a button to launch the camera.

Neither phone handles recording the screen while previewing the camera very well, but the S22 gets its camera app open before I’ve even tapped the camera icon on the iPhone.

Unfortunately, even Google’s phones aren’t immune to the vanishing of physical buttons. Active Edge stopped showing up on Pixels with the 4a and 5 in 2020. Samsung has also done away with a button it once included to summon a virtual assistant (which, tragically, happened to be Bixby).

There have been attempts to add virtual buttons that you activate by interacting with the device itself. Apple, for example, has a Back Tap accessibility feature that lets you tap on the back of your phone to launch actions or even your own mini programs in the form of Shortcuts, and Google added a similar Quick Tap feature to Pixels. But to be perfectly honest, I just haven’t found them reliable enough. A virtual button that barely ever works isn’t a great button. Active Edge, on the other hand, worked pretty much every single time for me, despite the fact that I had a beefy OtterBox case on my phone.

It’s not that physical controls on phones are completely gone. As I alluded to before, Apple lets you launch things like Apple Pay and Siri through a series of taps or presses on the power button, and there’s no shortage of Android phones that let you launch the camera or other apps by double-pressing the power button.

I’d argue, though, that one or two shortcuts assigned to a single button can’t give us easy access to everything we should have easy access to. To be clear, I’m not demanding that my phone be absolutely covered in buttons, but I think big manufacturers should take a cue from phones of the past (and, yes, from smaller phone makers — I see you, Sony fans) and bring back at least one or two physical shortcuts. As Google showed, that doesn’t necessarily require adding an extra physical key that has to be waterproofed. Something as simple as a squeeze can be a button that lets users quickly access the features that they — or, in the Pixel’s case, Google — deem essential.
