From time to time the entire technology press corps gets together on Twitter, spends several hours live-tweeting the same event, and then writes a series of blog posts about how nothing important happened. This event is known as a Congressional hearing, and today we witnessed our final one of the year.
After months of polite deferrals, Sundar Pichai finally went before Congress on Tuesday, and over the course of three and a half hours, said as little as possible. The hearing before the House Judiciary Committee was defined, as had been the Facebook hearings before it, by the widespread befuddlement of our lawmakers.
It would be helpful to start from the premise that Google (and Facebook) already siphon more than enough information about people’s online actions and their habits in the real world. Ask Google to commit to collecting data only if people explicitly agree. (The default is often the opposite: user information, like people’s searches, their physical location over time, and the websites they visit, is collected by Google unless people explicitly tell Google to stop.) Ask Google (and Apple) to commit to auditing the data collection of all the apps people download on Android phones and iPhones, and to disclosing whether those apps sell location information. Let’s change the conversation from what tech companies do with personal information to what they need to stop or start doing with it.
To be clear, I don’t want to repeat the false idea that members of Congress are old Luddites who are unable or unwilling to understand how tech companies work. Some members of Congress asked great questions on Tuesday; some did not. This format, however, does not feel like a good way to decide public policy. The thorny topic of big technology companies’ power deserves much better than this from all sides.
Google CEO Sundar Pichai’s three-and-a-half-hour testimony before the House Judiciary Committee today — and the problem with congressional tech executive hearings — is perhaps best encapsulated by his brief exchange with Texas Rep. Ted Poe.
“I have an iPhone,” Poe said, brandishing the device for all to see. “If I go and sit with my Democratic friends over there, does Google track my movement?”
Pichai began to reply, explaining that the answer to Poe’s question really depends on a bunch of smartphone minutiae — location services, app settings, privacy configurations, etc. But before he could finish, Poe cut him off. “It’s a ‘yes’ or ‘no’ question,” he bellowed. (It wasn’t.)
The exchange is an exemplar of the disconnect, the frustrations, and the pointlessness of the past year’s parade of tech executive hearings. Congress calls for Silicon Valley to have its day in the DC hot seat; then the day comes, and instead we find it’s a booster seat. Or an opportunity for congressional yelling. Or executive evasiveness. And in any case, nothing much is accomplished.
Take Poe’s question. Its topic — data privacy and location tracking — is important, but the wording was inartful, and it immediately revealed a poor understanding of how the technology in question works. Pichai’s answer, in turn, seemed to purposefully ignore the spirit of the question, retreating into semantics instead of offering a reasonable reply. (For example: “While I don’t know the particulars of your device, yes, many Google apps track granular location information.”) The end result? Nothing worthwhile.
As usual for public performances like these, the most telling moments of Pichai’s testimony came in the form of what he did not say. For example, he didn’t give an explicit indicator of the status of the company’s work on a search engine specifically for users in China, only stating that it was “exploratory”. Virtually the entire running time of the hearing was characterized by members of Congress grandstanding on their issues of choice, rather than using their time to ask thoughtful questions.
The recently released iOS 12.1.1 update includes enhancements specifically for iPhone XR users. They can now change how long they must touch and hold for the Haptic Touch gesture to be invoked, and they can now display the detailed view of a notification by touching and holding one on the lock screen or in Notification Centre.
Prior to this release, a third-party developer could perfectly copy the Haptic Touch experience in their own apps by setting up a long press gesture recognizer that concludes with a haptic vibration. Now that users can adjust the duration in this new Haptic Touch menu, however, a third-party app has no way to stay in sync with the user’s preference.
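The pre-12.1.1 approximation described above might look something like this in UIKit. This is a sketch, not Apple’s implementation; the class name and the half-second threshold are my own assumptions, chosen to match the system default:

```swift
import Foundation

// Hard-coded press threshold — the crux of the problem: with no Haptic Touch
// API, an app can only guess at the user's system-wide duration setting.
let assumedPressDuration: TimeInterval = 0.5

#if canImport(UIKit)
import UIKit

// Hypothetical view that mimics Haptic Touch: a long press that concludes
// with a haptic "thud" once the gesture is recognized.
final class PseudoHapticView: UIView {
    private let feedback = UIImpactFeedbackGenerator(style: .medium)

    override init(frame: CGRect) {
        super.init(frame: frame)
        let press = UILongPressGestureRecognizer(target: self,
                                                 action: #selector(didLongPress))
        press.minimumPressDuration = assumedPressDuration // fixed; ignores the user's setting
        addGestureRecognizer(press)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) is not supported") }

    @objc private func didLongPress(_ gesture: UILongPressGestureRecognizer) {
        guard gesture.state == .began else { return }
        feedback.impactOccurred() // the concluding haptic vibration
        // ...present the app's own preview, menu, or action here...
    }
}
#endif
```

Matching the system’s default duration used to be good enough; now that the user can change it in Settings, that hard-coded value is exactly what falls out of sync.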
The supported API for 3D Touch allows apps to inherit the exact same behavior (including changes to 3D Touch Sensitivity) as Apple’s 3D Touch implementations, but an analogous system for Haptic Touch does not currently exist. We’ll be on the lookout to see if Apple adds a formal Haptic Touch developer API in the future.
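For contrast, the supported 3D Touch path looks roughly like this: the app registers once, and the system drives the press threshold, including the user’s 3D Touch Sensitivity setting. The view controller names here are hypothetical:

```swift
#if canImport(UIKit)
import UIKit

final class DetailViewController: UIViewController {} // hypothetical detail screen

// Sketch of the supported peek-and-pop API: the system, not the app,
// decides when a press is firm enough.
final class ListViewController: UITableViewController, UIViewControllerPreviewingDelegate {

    override func viewDidLoad() {
        super.viewDidLoad()
        // Only register on hardware that actually supports 3D Touch.
        if traitCollection.forceTouchCapability == .available {
            _ = registerForPreviewing(with: self, sourceView: tableView)
        }
    }

    // Peek: return a preview controller for the pressed row.
    func previewingContext(_ previewingContext: UIViewControllerPreviewing,
                           viewControllerForLocation location: CGPoint) -> UIViewController? {
        guard let indexPath = tableView.indexPathForRow(at: location) else { return nil }
        previewingContext.sourceRect = tableView.rectForRow(at: indexPath)
        return DetailViewController()
    }

    // Pop: the user pressed harder, so commit to the previewed screen.
    func previewingContext(_ previewingContext: UIViewControllerPreviewing,
                           commit viewControllerToCommit: UIViewController) {
        show(viewControllerToCommit, sender: self)
    }
}
#endif
```

Nothing comparable exists for Haptic Touch: there is no registration point an app can use to inherit the user’s chosen duration.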
I get why the iPhone XR has an LCD display, a single camera, and uses aluminum instead of stainless steel — all of these attributes seem like reasonable differences compared to the X and the XS line. But withholding 3D Touch is a confusing compromise.
3D Touch is far from ideal. It is horribly inconsistent and undiscoverable. Even Apple can’t seem to decide what it should do uniquely, per the iOS HIG:
Don’t make peek the only way to perform item actions. Not every device supports peek and pop, and some people may turn off 3D Touch. Your app should provide other ways to trigger item actions in situations like these. For example, your app could mirror a peek’s quick actions in a view that appears while touching and holding an item.
It is worth asking: if the same action is invoked by using 3D Touch as it is when the user simply taps and holds, then what is the clear and direct intent of 3D Touch?
However, I think it’s a feature that is made worse by its exclusion from the iPhone XR, where it is sort of replaced with Haptic Touch. Haptic Touch is like 3D Touch, except for all of the ways in which it is not. It works on the flashlight and camera buttons on the lock screen, invokes a trackpad from the onscreen keyboard’s space bar, and, as mentioned earlier, works on notification bubbles. But it does not work everywhere 3D Touch does: an app’s icon on the home screen does not display a menu when the user touches and holds it, and the peek and pop gestures are nowhere to be found. It also has no dedicated developer API, meaning there’s no way for apps to target it directly.
All of this means that Haptic Touch is perhaps even less discoverable than 3D Touch, and has very little in common with it.
Whatever the reason for 3D Touch’s elimination from the iPhone XR, the current iPhone lineup is quite strange as a result: the feature is present on the flagship XS and XS Max models, and even the 7 and 8 models that Apple is still selling sport 3D Touch displays. And it’s not just the iPhone that has suffered from poor uptake of depth-sensing features: recent versions of watchOS have scaled back the use of Force Touch, and the iPad has never received anything like 3D Touch, despite having some touch-and-hold features without haptic feedback.
I’ve long been a staunch defender of 3D Touch — I use its features all the time, and it now feels strange to me when an iPhone does not have it. I would rather see continued investment on that front to establish consistent guidelines for its use, and to make it a more obvious part of the system. But if 3D Touch is truly on its way out, it should be a clean kill across the board. A piecemeal approach, with a similar-but-not-quite-the-same feature on just one product, is a confusing distraction.