Dieter Bohn, The Verge:
But there was one line on Google’s support page for the update that caught my eye (emphasis mine): “In addition to long press, you can now firmly press to get more help from your apps more quickly.”
Tap your screen right now, and think about how much of your fingertip is getting registered by the capacitive sensors. Then press hard and note how your finger smushes down on the screen — more gets registered. The machine learning comes in because Google needs to model thousands of finger sizes and shapes and it also measures how much changes over a short period of time to determine how hard you’re pressing. The rate of smush, if you will.
I have no idea if Google’s machine-learning smush detection algorithms are as precise as 3D Touch on the iPhone, but since they’re just being used for faster detection of long presses I guess it doesn’t matter too much yet. Someday, though, maybe the Pixel could start doing things that the iPhone used to be able to do.
As of last year, the hardware-based version of 3D Touch no longer exists; new iPhones do not have the component that registers touch pressure, and iPads never did. It’s kind of interesting that Google decided that now was an ideal time to replicate in software the ability to detect pressure — something which, as far as I can figure out, iOS does not do. I do not believe features like the context menu measure anything other than how long a finger has been touching the screen; I don’t think there’s a smush algorithm in iOS.
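The "rate of smush" heuristic Bohn describes could be sketched roughly like this. This is purely illustrative — every function name, threshold, and unit below is invented; Google's actual implementation is a machine-learned model over many finger shapes, not a fixed cutoff:

```python
# Hypothetical sketch of a "rate of smush" heuristic: infer press firmness
# from how quickly the finger's contact area grows on the capacitive sensor.
# All names, thresholds, and units are invented for illustration.

def smush_rate(areas_mm2, dt_ms):
    """Average growth rate of contact area (mm^2 per ms) across the samples."""
    if len(areas_mm2) < 2:
        return 0.0
    total_growth = areas_mm2[-1] - areas_mm2[0]
    elapsed_ms = dt_ms * (len(areas_mm2) - 1)
    return total_growth / elapsed_ms

def classify_press(areas_mm2, dt_ms=10, firm_threshold=0.5):
    """Call a touch 'firm' if the area grows faster than the threshold."""
    return "firm" if smush_rate(areas_mm2, dt_ms) > firm_threshold else "light"

# A fast smush — area balloons from 20 to 60 mm^2 in 30 ms — reads as firm;
# a nearly flat series reads as a light touch.
print(classify_press([20, 30, 45, 60]))  # prints "firm"
print(classify_press([20, 21, 22, 23]))  # prints "light"
```

The point of the sketch is just the shape of the idea: a long press measures only elapsed time, while a firm press measures how the contact patch changes over time.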