Haunted by Data, Television News Edition wsj.com

Joe Flint, Wall Street Journal (Twitter workaround for paywall):

In a game largely sanctioned by TV-ratings firm Nielsen, television networks try to hide their shows’ poor performances on any given night by forgetting how to spell.

That explains the appearance of “NBC Nitely News,” which apparently aired on the Friday of Memorial Day weekend this year, when a lot of people were away from their TVs. The retitling of “NBC Nightly News” fooled Nielsen’s automated system, which listed “Nitely” as a separate show.

Hiding the May 26 program from Nielsen dramatically improved the show’s average viewership that week. Instead of falling further behind first-place rival “ABC World News Tonight,” NBC News narrowed the gap.

According to Flint, this practice is not uncommon among broadcast networks, because higher average viewership numbers let them command higher prices for advertising.
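
To make the incentive concrete, here’s a back-of-the-envelope sketch in Python. The nightly figures are invented for illustration; they aren’t NBC’s actual numbers:

```python
# Made-up nightly viewership for one week of a newscast, in millions.
# Friday is the weak holiday night the network would rather not count.
week = {
    "Mon": 8.1, "Tue": 8.0, "Wed": 7.9, "Thu": 7.8,
    "Fri": 5.2,  # Memorial Day weekend: many viewers away from their TVs
}

honest_avg = sum(week.values()) / len(week)

# Misspell Friday's title and the ratings system files it as a separate
# show, so the weak night drops out of the weekly average entirely.
gamed = {day: viewers for day, viewers in week.items() if day != "Fri"}
gamed_avg = sum(gamed.values()) / len(gamed)

print(f"average with Friday counted: {honest_avg:.2f}M")  # 7.40M
print(f"average with Friday hidden:  {gamed_avg:.2f}M")   # 7.95M
```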

Recall Maciej Cegłowski’s “Haunted by Data” talk from the Strata + Hadoop World conference:

A more recent and less fictitious example is electronic logging devices on trucks. These are intended to limit the hours people drive, but what do you do if you’re caught ten miles from a motel?

The device logs only once a minute, so if you accelerate to 45 mph, and then make sure to slow down under the 10 mph threshold right at the minute mark, you can go as far as you want.

So we have these tired truckers staring at their phones, bunny-hopping down the freeway late at night.

Of course there’s an obvious technical countermeasure. You can start measuring once a second.

Notice what you’re doing, though. Now you’re in an adversarial arms race with another human being that has nothing to do with measurement. It’s become an issue of control, agency and power.

You thought observing the driver’s behavior would get you closer to reality, but instead you’ve put another layer between you and what’s really going on.
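
The arithmetic behind the bunny-hopping trick is easy to check. Here’s a toy simulation; the speeds and timings are illustrative assumptions, not the behaviour of any real electronic logging device:

```python
# The device samples speed once a minute. A driver who cruises between
# samples but dips below the 10 mph threshold exactly at each minute
# mark looks nearly stationary in the log. All numbers are made up.
CRUISE_MPH = 45  # speed held for ~55 seconds of every minute
DIP_MPH = 5      # speed at the minute mark, under the 10 mph threshold

distance = 0.0
logged = []
for minute in range(60):  # one hour of driving
    distance += CRUISE_MPH * (55 / 3600) + DIP_MPH * (5 / 3600)
    logged.append(DIP_MPH)  # the only value the device ever records

print(f"actual distance covered:  {distance:.1f} miles")  # ~41.7 miles
print(f"fastest speed in the log: {max(logged)} mph")     # 5 mph
```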

If they wanted to, Nielsen could address this problem with more data. They could run fuzzy matches against similarly named programs and manually verify viewership numbers for each show. But that only raises the number of steps a network has to take to inflate its averages: instead of “Nitely News”, the show becomes “Lester Holt’s Hour of Power”. Every time Nielsen adjusts its matching, the networks adjust their naming.
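For what it’s worth, the fuzzy-match half is trivial to sketch. Here’s a minimal version using Python’s standard-library difflib; the 0.8 threshold and the program list are assumptions for illustration:

```python
from difflib import SequenceMatcher

KNOWN_PROGRAMS = ["NBC Nightly News", "ABC World News Tonight"]

def suspicious_matches(new_title, known=KNOWN_PROGRAMS, threshold=0.8):
    """Flag known programs whose names closely resemble a new title."""
    return [
        program for program in known
        if SequenceMatcher(None, new_title.lower(), program.lower()).ratio()
        >= threshold
    ]

print(suspicious_matches("NBC Nitely News"))
# ['NBC Nightly News'] -- close enough to route to manual review

print(suspicious_matches("Lester Holt's Hour of Power"))
# [] -- a wholesale rename sails right past the string match
```

The second call is the arms race in miniature: the countermeasure works only until the network picks a name that no longer resembles the old one.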

I’m becoming increasingly convinced that adjusting products based purely on analytics data is a futile exercise, one that encourages some plainly perverse behaviours:

NBC in 2015 persuaded almost a dozen of its local TV station affiliates to rerun “Nightly News” after 2 a.m. At the time, NBC said, it was focused “on ways to reach our audience when and how they want to be reached.”

A rival network thought otherwise and alerted NBC advertisers to the practice. After learning of the stunt, many advertisers cried foul. They told NBC whoever was watching the newscast at that hour wasn’t the kind of consumer they wanted to reach. NBC said it quickly discontinued the practice.

You’ve seen similar practices on the web, only far more extreme. Remember the era of paginated articles, where you’d get halfway through a story and have to click through to read the second half? Those were all the rage for two reasons: more ad impressions and more page views. The latter is a pretty typical SEO trick, since Google reportedly boosts rankings for websites that rack up more page views in a single session. It also creates a terrible experience for readers. And every time Google adjusts the way it ranks websites, a bunch of people scramble to figure out how to game the data for better rankings.
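
The pagination economics fall out of simple arithmetic. A sketch, with made-up reader counts, ad slots, and drop-off rate:

```python
# One article published whole vs. split into five pages. Half of the
# remaining readers are assumed to give up at each click-through.
READERS = 10_000
ADS_PER_PAGE = 3
PAGES = 5
DROPOFF = 0.5

paginated_views = sum(READERS * DROPOFF**i for i in range(PAGES))

print(f"single page: {READERS} page views, "
      f"{READERS * ADS_PER_PAGE} ad impressions")
print(f"paginated:   {paginated_views:.0f} page views, "
      f"{paginated_views * ADS_PER_PAGE:.0f} ad impressions")
# Even losing half the audience at every click, pagination nearly
# doubles page views and ad impressions.
```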

None of this actually helps people, though. Collecting a bunch of user data and blindly following it doesn’t always make a better product (sometimes it produces a worse one), and it’s a privacy nightmare.