Brooks Barnes and Michael J. de la Merced, New York Times:
Comcast, the cable giant and owner of NBCUniversal, is in preliminary talks to buy entertainment assets owned by 21st Century Fox, including a vast overseas television distribution business, the Fox movie studio, the FX cable network and a group of regional sports channels.
Under the deal being discussed, the Murdoch family, which controls 21st Century Fox, would retain the Fox News cable network, certain sports holdings, a chain of local television stations and the Fox broadcast network.
Disney is also rumoured to be interested in these Fox assets, as is Sony. All of these companies are gigantic media conglomerates, Comcast being the largest in the United States, Disney being the second largest, and 21st Century Fox third.
One thing that’s absolutely critical to understand when considering questions about media ownership and net neutrality is that few major media companies remain in a single line of business. Increasingly, these conglomerates are becoming vertically integrated with unprecedented reach: they finance movies and television, distribute and market their programming, some provide the cable and internet services that transmit video to viewers’ computers and televisions, and many own or have major stakes in streaming platforms as well. So as the FCC contemplates dismantling net neutrality regulations, it is helping create a situation in which Comcast could conceivably own and prioritize its media assets from production to your couch, while restricting competition. Imagine if heyday-era General Motors owned everything from steel mills to parts of the Interstate system, but instead of transportation, it’s information and entertainment.
I maintain that Comcast should never have been allowed to buy NBCUniversal. That kind of cross-market dominance is toxic for competition. A similar mistake should be avoided by blocking their purchase of 21st Century Fox’s entertainment businesses as well.
I did get the translation feature to work, by the way, and it’s just as confusing as everything else about the Pixel Buds. You’d think that you could just tap the right earbud and ask Google to translate what you’re hearing, but it’s more complicated than that. You do have to tap the earbud and ask Google to translate, but then you have to open up the Google Translate app and hold your phone in front of your foreign language-speaking friend. And, of course, your phone must be a Google Pixel or Pixel 2.
The dream is to be able to have a relatively normal conversation with someone whose language you don’t speak, right? That’s clearly not what you get here. That’s a shame, because it’s something Google ought to be able to do very well — or, at least, that’s the promise of a company that mines the world’s data, isn’t it?
In June, Apple announced that it was challenging Amazon’s sleeper hit Amazon Echo with its own voice assistant-enabled speaker, called HomePod, and said the product would be released in December 2017. Today, the company released a statement that the speaker will be delayed until 2018: “We can’t wait for people to experience HomePod, Apple’s breakthrough wireless speaker for the home, but we need a little more time before it’s ready for our customers. We’ll start shipping in the US, UK, and Australia in early 2018.”
I’ve been trying to figure out why the HomePod was announced at WWDC in June at all instead of, say, during Apple’s more product-focused September keynote. My best guess is that it was a way to complete the story of SiriKit in a broader context and encourage adoption.
No word on the iMac Pro, by the way, which is still scheduled to begin shipping in December.
David Shepardson, Reuters:

The head of the Federal Communications Commission is set to unveil plans next week for a final vote to reverse a landmark 2015 net neutrality order barring the blocking or slowing of web content, two people briefed on the plans said.
In May, the FCC voted 2-1 to advance Republican FCC Chairman Ajit Pai’s plan to withdraw the former Obama administration’s order reclassifying internet service providers as if they were utilities. Pai now plans to hold a final vote on the proposal at the FCC’s Dec. 14 meeting, the people said, and roll out details of the plans next week.
The FCC is currently in Republican hands; today, they voted to lift regulations that prevent broadcasters and newspapers from common ownership in the same market. According to Shepardson, the FCC also plans to vote in December to lift rules preventing any single media company from owning television stations reaching 39% of households. The cumulative effect of this push to lift sensible regulations will likely be catastrophic for independent media and diverse viewpoints. It fundamentally rots the very idea of a free and independent press, and is ruinous for a healthy democracy.
It’s worth pointing out that rescinding net neutrality regulations is not what Americans want. Jon Brodkin, Ars Technica:
The FCC voted in May to take public comment on a preliminary proposal to overturn the 2015 net neutrality order. With the public comment period now over, Pai is free to push through a final vote.
The public comments were dominated by spam and form letters, but a study funded by ISPs found that 98.5 percent of unique comments were written by people who want the FCC to leave the rules in place.
Statistically, if you’re American, you favour preserving these regulations. Ajit Pai and the other Republican commissioners at the FCC are currently planning to vote against the will of an overwhelming majority of Americans. That’s outrageous.
iOS 11.2 is currently in beta and will be released to all iPhone and iPad users in the coming weeks. One of its key features for iPhone 8, 8 Plus, and X owners is accelerated wireless charging: previously, all wireless charging was limited to 5W, but this update raises that limit to 7.5W. That’s a 50% increase in power on paper, but I had to know what the real-world difference was.
The only place I’m considering using one of these inductive charging pads is on my desk at work, because I still use wired headphones — I haven’t found a pair of wireless headphones that I like. But I’m having a hard time justifying the expense for what is effectively a glorified trickle charger, especially since battery life with my iPhone X has been fantastic.
Update: I’ve heard that 7.5W charging is only supported on certain charging bases; as far as I can figure out, that’s limited right now to the Mophie and Belkin ones that are sold through Apple’s online store. Both of those charging bases carry a note like this:
High-speed wireless charging
Leverages Qi wireless technology to deliver safe, quick-charging speeds with up to 7.5W of power.
As Federico Viticci writes, the Qi standard supports up to 15W, so I’m not sure why the third beta of iOS 11.2 unlocks only up to 7.5W, nor do I understand why only specific base stations will apparently support this faster charging rate.
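For a rough sense of what that wattage difference means in practice, here’s a back-of-the-envelope sketch. The battery capacity is an assumption (the iPhone X’s is reported at roughly 10.35 Wh), and the model ignores charging taper and coil losses, so treat these as idealized lower bounds on charge time:

```python
# Idealized full-charge time at different wireless charging rates.
# BATTERY_WH is an assumption (~2716 mAh at ~3.81 V ≈ 10.35 Wh);
# real charging tapers near full and loses energy in the coils.
BATTERY_WH = 10.35

def hours_to_charge(watts: float) -> float:
    return BATTERY_WH / watts

t5 = hours_to_charge(5.0)    # ≈ 2.07 hours
t75 = hours_to_charge(7.5)   # ≈ 1.38 hours
print(f"5W: {t5:.2f} h, 7.5W: {t75:.2f} h, speedup: {t5 / t75:.2f}x")
```

The speedup is exactly the 1.5× the wattage ratio implies; whether real-world charging ever approaches it is the open question.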
This morning, a few publications ran with a holiday-themed data study about how families that voted for opposite parties spent less time together on Thanksgiving, especially in areas that saw heavy political advertising. It’s an interesting finding about how partisan the country is becoming, and admirably, the study’s authors tried to get data that would be more accurate than self-reporting through surveys. To do this, they tapped a company called SafeGraph that provided them with 17 trillion location markers for 10 million smartphones.
The data wasn’t just staggering in sheer quantity. It also appears to be extremely granular. Researchers “used this data to identify individuals’ home locations, which they defined as the places people were most often located between the hours of 1 and 4 a.m.,” wrote The Washington Post.
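That definition is simple enough to sketch in a few lines — a hypothetical illustration of the heuristic as the Post describes it, with made-up ping data, not the researchers’ actual code:

```python
from collections import Counter
from datetime import datetime

# Sketch of the home-inference heuristic described in the study:
# a device's "home" is the cell it pings from most often between 1 and 4 a.m.
# Pings are (ISO timestamp, rounded lat/lon cell) pairs — hypothetical data.
def infer_home(pings):
    nighttime = [cell for ts, cell in pings
                 if 1 <= datetime.fromisoformat(ts).hour < 4]
    return Counter(nighttime).most_common(1)[0][0] if nighttime else None

pings = [
    ("2017-11-23T02:10:00", (51.05, -114.07)),
    ("2017-11-23T03:40:00", (51.05, -114.07)),
    ("2017-11-23T14:00:00", (51.04, -114.06)),  # daytime ping, ignored
]
print(infer_home(pings))  # → (51.05, -114.07)
```

That something this trivial can deanonymize a “home” from commodity location data is exactly what makes the granularity so unsettling.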
SafeGraph was also able to use their data to state that attendees at Donald Trump’s inauguration had lower household incomes than those attending the Women’s March the following day which, regardless of whether you believe it, is a deeply creepy claim.
I have no idea which apps share my data with SafeGraph because I grant so many apps approval to share collected information with third parties, with no mention of what those third parties may be. I don’t like that I have seemingly no control over this; blanket approval statements are pretty standard in privacy policies on websites and in apps, and they need to be stopped. I did not explicitly give permission for my data to be shared with a creepy location tracking company, and it’s completely unfair to assume that it’s okay.
For what it’s worth, iOS should also request explicit permission to enable ad tracking. It is presently allowed by default — at least in Canada — and users must opt out in Settings.
Stevan Dojcinovic, in an op-ed for the New York Times, reacting to the fallout from Facebook’s announcement last month that they would move unpaid news stories from pages into a separate News Feed in some countries:
It wasn’t just in Serbia that Facebook decided to try this experiment with keeping pages off the News Feed. Other small countries that seldom appear in Western headlines — Guatemala, Slovakia, Bolivia and Cambodia — were also chosen by Facebook for the trial.
Some tech sites have reported that this feature might eventually be rolled out to Facebook users in the rest of the world, too. But of course no one really has any way of knowing what the social media company is up to. And we don’t have any way to hold it accountable, either, aside from calling it out publicly. Maybe that’s why it has chosen to experiment with this new feature in small countries far removed from the concerns of most Americans.
But for us, changes like this can be disastrous. Attracting viewers to a story relies, above all, on making the process as simple as possible. Even one extra click can make a world of difference. This is an existential threat, not only to my organization and others like it but also to the ability of citizens in all of the countries subject to Facebook’s experimentation to discover the truth about their societies and their leaders.
It’s pretty astonishing that an experiment like this would be announced around the same time that Facebook is being questioned about the possible role that misleading targeted ads may have played in the 2016 U.S. Presidential election. There’s no indication yet just how influential these ads were on specific voters or the election itself, but if they had even a slight sway in a developed democracy like that in the U.S., just imagine how influential highly-targeted ads may be in newer and, usually, weaker democracies. Facebook’s careless U.S.-centric attitude is frightening from this non-American’s perspective.
A small quibble with Dojcinovic’s piece: its headline is “Hey, Mark Zuckerberg: My Democracy Isn’t Your Laboratory”, and he refers to “Mark Zuckerberg’s arbitrary experiments”. I think ascribing the actions of a company to its notable figureheads is unproductive as I feel that it reduces a concerning issue of egregious corporate influence and accountability to a personal spat.
I’ve been using my iPhone X for nearly a week now and, while I have some thoughts about it, by no means am I interested in writing a full review. There seem to be more reviews of the iPhone X on the web than actual iPhone X models sold. Instead, here are some general observations about the features and functionality that I think are noteworthy.
The iPhone X is a product that feels like it shouldn’t really exist — at least, not in consumers’ hands. I know that there are millions of them in existence now, but mine feels like an incredibly well-made, one-off prototype, as I’m sure all of them do individually. It’s not just that the display feels futuristic — I’ll get to that in a bit — nor is it the speed of using it, or Face ID, or anything else that you might expect. It is all of those things, combined with how nice this product is.
I’ve written before that the magic of Apple’s products and their suppliers’ efforts is that they are mass-producing niceness at an unprecedented scale. This is something they’ve become better at with every single product they ship, and nothing demonstrates that progress better than the iPhone X.
It’s such a shame, then, that the out-of-warranty repair costs are appropriately high, to the point where not buying AppleCare+ and a case seems downright irresponsible. Using the iPhone X without a case is a supreme experience, but I don’t trust myself enough to do so. And that’s a real pity, because it’s one of those rare mass-produced items that feels truly special.
This is the first iPhone to include an OLED display. It’s made by Samsung and uses a diamond subpixel arrangement, but Apple says that it’s entirely custom-designed. Samsung’s display division is being treated here like their chip foundry was for making Apple’s Ax SoCs.
And it’s one hell of a display. It’s running at a true @3x resolution of 458 pixels per inch. During normal use, I can’t tell much of a difference between it and the 326 pixel-per-inch iPhone 6S that I upgraded from. But when I’m looking at smaller or denser text — in the status bar, for example, or in a long document — this iPhone’s display looks nothing less than perfect.
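That 458 pixels-per-inch figure is easy to sanity-check from the panel’s 1125×2436 resolution; the diagonal here is an assumption of about 5.85 inches, since Apple’s marketing rounds it to 5.8:

```python
import math

# Pixel density from resolution and diagonal size.
# 1125×2436 is the iPhone X's native resolution; the ~5.85-inch
# diagonal is an assumption (Apple markets the display as 5.8 inches).
width_px, height_px = 1125, 2436
diagonal_in = 5.85

diagonal_px = math.hypot(width_px, height_px)  # pixels along the diagonal
ppi = diagonal_px / diagonal_in
print(round(ppi))  # ≈ 458–459, depending on the exact diagonal
```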
One of the reasons this display looks so good is because of Apple’s “True Tone” feature, which matches the white balance of the display to the environment. In a lot of indoor lighting conditions, that’s likely to mean that the display is yellower than you’re probably used to. Unlike Night Shift, though, which I dislike for being too heavy-handed, True Tone is much subtler. Combine all of this — the brightness of the display, its pixel density, its nearly edge-to-edge size, and True Tone — with many of iOS’ near-white interface components and it really is like a live sheet of paper in your hand.
Because it’s an OLED display that has the capability of switching on and off individual pixels, it’s only normal to consider using battery-saving techniques like choosing a black wallpaper or using Smart Invert Colours. I think this is nonsense. You probably will get better battery life by doing both of those things, but I’ve been using my iPhone X exactly the same as I have every previous phone I’ve owned and it gets terrific battery life. Unless you’re absolutely paranoid about your battery, I see no reason in day-to-day use to treat the iPhone X differently than you would any other phone.
I’m a total sucker for smaller devices. I’d love to see what an iPhone SE-sized device with an X-style display would be like.
Face ID is, for my money, one of the best things Apple has done in years. It has worked nearly flawlessly for me, and I say that with no exaggeration or hyperbole. Compared to Touch ID, it almost always requires less effort and is of similar perceptual speed. This is particularly true for login forms on the web: where previously I’d see the Touch ID prompt and have to shuffle my thumb down to the home button, I now just continue staring at the screen and my username and password are just there.
I’m going to great pains to avoid the most obvious and clichéd expression for a feature like this, but it’s apt here: it feels like magic.
The only time Face ID seems to have trouble recognizing me is when I wake up, before I’ve put on my glasses. It could be because my eyes are still squinty at the time and it can’t detect that I’m looking at the screen, or maybe it’s just because I look like a deranged animal first thing in the morning. Note, though, that it has no trouble recognizing me without my glasses at any other time; however, I first set up Face ID while wearing my glasses and that’s almost always how I use it to unlock my phone. That’s how it recognizes me most accurately.
Last week, I wrote that I found that there was virtually no learning curve for me to feel comfortable using the home indicator, and I completely stand by that. If you’ve used an iPad running iOS 11, you’re probably going to feel right at home on an iPhone X. My favourite trick with the home indicator is that you can swipe left and right across it to slide between recently-used apps.
Arguably, the additional space offered by the taller display is not being radically reconsidered, since nearly everything is simply taller than it used to be. But this happens to work well for me because nearly everything I do on my iPhone is made better with a taller screen: reading, scrolling through Twitter or Instagram, or writing something.
The typing experience is, surprisingly, greatly improved through a simple change. The keyboard on an iPhone X is in a very similar place to where it is on a 4.7-inch iPhone, which means that there’s about half an inch of space below it. Apple has chosen to move the keyboard switching button and dictation control into that empty space from beside the spacebar, and this simple change has noticeably improved my typing accuracy.
In a welcome surprise, nearly all of the third-party apps I use on a regular basis were quickly updated to support the iPhone X’s display. The sole holdouts are Weather Line, NY Times, and Spotify.
I have two complaints with how the user interfaces in iOS work on the iPhone X. The first is that the system still seems like it is adapting its conventions to fit bigger displays. Yes, you can usually swipe right from the lefthand edge of the display to go back to a previous screen, but toolbars are still typically placed at the top and bottom of the screen. With a taller display, that means that there can be a little more shuffling of the device in your hand to hit buttons on opposite sides of the screen.
My other complaint is just how out of place Control Centre feels. Notification Centre retains its sheet-like appearance if it’s invoked from the left “ear” of the display, but Control Centre opens as a sort of panelled overlay with the status bar in the middle of the screen when it is invoked from the right “ear”. The lack of consistency between the two Centres doesn’t make sense to me, nor does the awkward splitting of functionality between the two upper corners of the phone. It’s almost as though it was an adjustment made late in the development cycle.
I don’t know what the ideal solution is for the iPhone X. Control Centre on the iPad is a part of the multitasking app switcher, and that seems like a reasonable way to display it on the iPhone, too. I’m curious as to why that wasn’t shipped.
Cameras and Animoji
This is the first dual-camera iPhone I’ve owned so, not only do I get to take advantage of technological progress in hardware, I also get to use features like Portrait Mode on a regular basis. Portrait Mode is very fun, and does a pretty alright job in many environments of separating a subject from its background. Portrait Lighting, new in the iPhone 8 and iPhone X, takes this one step further and tries to replicate different lighting conditions on the subject. I found this to be much less reliable, with the two spotlight-style “stage lighting” modes to be inconsistent in their subject detection abilities.
The two cameras in this phone are both excellent, and the sensor captures remarkable amounts of data, especially if you’re shooting RAW. Noise is well-controlled for such a small sensor and, in some lighting conditions, even has a somewhat filmic quality.
I really like having the secondary lens. Calling it a “telephoto” lens is, I think, a stretch, but its focal length creates some nice framing options. I used it to take a photo of my new shoes without having to get too close to the mirror in a department store.
Animoji are absurdly fun. The face tracking feels perfect — it’s better than motion capture work in some feature films I’ve seen. I’ve used Animoji more often as stickers than as video messages, and it’s almost like being able to create your own emoji that, more or less, reflects your actual face. I only have two reservations about Animoji: they’re only available as an iMessage app, and I worry that it won’t be updated regularly. The latter is something I think Apple needs to get way better at; imagine how cool it would be if new iMessage bubble effects were pushed to devices remotely every week or two, for example. It’s the same thing for Animoji: the available options are cute and wonderful, but when Snapchat and Instagram are pushing new effects constantly, it isn’t viable to have no updates by, say, this time next year.
I mentioned above that I bought AppleCare+ for this iPhone. It’s the first time I’ve ever purchased AppleCare on a phone, and only the second time I’ve purchased it for any Apple product — the first was my MacBook Air because AppleCare also covered the Thunderbolt Display purchased around the same time. This time, it was not a good buying experience.
I started by opening up the Apple Store app, which quoted $249 for AppleCare+ for the iPhone X. I tapped on the “Buy Now” button in the app but received an error:
Some products in your bag require another product to be purchased. The required product was not found so the other products were removed.
As far as I can figure out, this means that I need to buy an iPhone X at the same time, which doesn’t make any sense as the Store page explicitly says that AppleCare+ can be bought within sixty days.
I somehow wound up on the check coverage page where I would actually be able to buy extended coverage. After entering my serial number and fumbling with the CAPTCHA, I clicked the link to buy AppleCare. At that point, I was quoted $299 — $50 more than the store listing. I couldn’t find any explanation for this discrepancy, so I phoned Apple’s customer service line. The representative told me that the $249 price was just an estimate, and the $299 price was the actual quote for my device, which seems absurd — there’s simply no mention that the advertised price is anything other than the absolute price for AppleCare coverage. I went ahead with my purchase, filling in all my information before arriving at a final confirmation page where the price had returned to $249, and that was what I was ultimately charged.
It’s not the $50 that troubles me in this circumstance, but the fact that there was a difference in pricing at all between pages on Apple’s website. I don’t know why I was ever shown a $299 price, nor do I understand why I’m unable to use the Apple Store app to purchase AppleCare+ for my iPhone X using my iPhone X.
The world has lots of very stupid ideas in it. One of them, one of the most harmful, is the prevailing idea of what it means for one thing to be technologically superior to another. Only a culture sunken to a really frightening and apocalyptic level of libertarian stupidity would regard the Keurig machine — a sophisticated, automated robot designed specifically and only to brew a single serving of coffee, rather than a big efficient pot of it; which presents only illusory ease and convenience only to whoever is using it at the moment of his or her use and to no one else, and only via fragile technologized mediations it wears atop its primary function like an anvil, or a bomb collar; which can be rendered literally unusable by the breakdown of needless components completely ancillary to that primary function — as a technological improvement upon the drip coffeemaker, or the French press, or putting some coffee grounds in a fucking saucepan with some water and holding it over a campfire for a little while until the water smells good. It is not technologically superior to any of those! It is vastly technologically inferior to all of them. It is a wasteful piece of trash. It is not a machine engineered to improve anything or to resolve a problem, but only and entirely the pretext for a sales pitch, a means to separate someone from their money.
Two things that Burneko does not cover in his otherwise comprehensive explanation of a Keurig machine’s failings: dosage and price per pound. Let’s start with dosage.
A K-Cup pod contains somewhere between 9 and 13 grams of coffee grounds. The coffee I make is a bit stronger than most people make, but it’s nowhere near knock-your-head-off territory; even so, I use about 20–22 grams of beans per cup in my AeroPress and follow a method similar to Kaye Joy Ong’s. But even if you like your coffee a little closer to average, you have to fall a long way to get to nine measly grams of beans. That and a Keurig’s low brewing temperature go a long way towards explaining why every cup of Keurig coffee I’ve ever had tastes like laundry water.
And then there’s the price of all of this — up to $50 per pound. There is almost nowhere on Earth you can’t get better coffee shipped to your door for less than $50 per pound. The Keurig is an utterly absurd way to brew expensive instant coffee not very well.
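The per-pound arithmetic is straightforward — the pod price and fill weight below are assumptions drawn from the ranges above, not measured figures:

```python
# Effective price per pound of coffee bought as K-Cup pods.
# Assumed inputs: pod price in dollars and grams of grounds per pod,
# taken from the ranges discussed above, not from any specific product.
GRAMS_PER_POUND = 453.592

def price_per_pound(pod_price: float, grams_per_pod: float) -> float:
    pods_per_pound = GRAMS_PER_POUND / grams_per_pod
    return pod_price * pods_per_pound

print(f"${price_per_pound(1.00, 9):.2f}")   # ≈ $50/lb at the expensive end
print(f"${price_per_pound(0.60, 13):.2f}")  # cheaper pods, fuller fill
```

Even the cheap end of that range buys a lot of very good whole-bean coffee.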
Update: It turns out that some fans of Sean Hannity are destroying their Keurig machines in a bizarre protest that they think offends liberals. This post has absolutely nothing to do with that. For extra credit, reflect on how absurd this update truly is.
Pictures and text often pair nicely together. You have an article about a thing, and the picture illustrates that thing, which in many cases helps you understand the thing better. But on the web, this logic no longer holds, because at some point it was decided that all texts demand a picture. It may be of a tangentially related celeb. It may be a stock photo of a person making a face. It may be a Sony logo, which is just the word SONY. I have been thinking about this for a long time and I think it is stupid. I understand that images —> clicks is industry gospel, but it seems like many publishers have forgotten their sense of pride. If a picture is worth a thousand words, it’s hard for me to imagine there’ll be much value in the text of an article illustrated by a generic stock image.
The Outline is, of course, also a contributor to this trend. A photo of Mark Zuckerberg leads this story about Facebook’s dumb-as-bricks idea to combat revenge porn — which, incidentally, is almost exactly one of O’Haver’s examples. A great article about Twitter’s inconsistent character limit for those using accessibility features is illustrated, for some reason, by an old-timey photo of a man using a Monotype keyboard.
At some point in the past several years, the millions of different possibilities of turning individual pixels into a website coalesced around a singularly recognizable and repeatable form: logo and menu, massive image, and page text distractingly split across columns or separated by even more images, subscription forms, or prompts to read more articles. The web has rapidly become a wholly unpleasant place to read. It isn’t the fault of any singular website, but a sort of collective failing to prioritize readers.
I don’t know about you, but I’ve become numb to the web’s noise. I know that I need to wait for every article I read to load fully before I click anywhere, lest anything move around as ads are pulled in through very slow scripts from ten different networks. I know that I need to wait a few seconds to cancel the autoplaying video at the top of the page, and a few more seconds to close the request for me to enter my email and receive spam. And I know that I’ll need to scroll down past that gigantic header image to read anything, especially on my phone, where that image probably cost me more to download than anything else on the page.
These photos add nothing but hundreds of kilobytes to the story. They can easily be replaced with pictures of William Howard Taft with little consequence. It’s just another reason why full-text RSS feeds continue to be one of the best ways to read a website’s articles.
Earlier this week, I noted on Twitter that I thought that one of Apple’s biggest misses when they released the iPhone 4 was not including a version of Photo Booth. Photo Booth was a huge deal for the Mac when it was included with new Macs that had the built-in iSight camera. Imagine if Apple had released a version of it for the iPhone at any point in the past six years and updated its built-in filters weekly. I think it would have been extremely popular.
Well, they’ve kind of done that with the second version of Clips, their quick little video editing app. I wasn’t enthralled with it when it was first released and, as far as I could tell, neither were most people.
But this new version is exciting. Apple has completely redesigned the app so it’s way easier to use, and they’ve added a new Scenes feature to allow you to virtually change your environment. Fans of Photo Booth might remember Backdrops; Scenes is like that, only far more reliable — I bet it uses ARKit — and with way cooler effects. You can place yourself into a futuristic metropolis, outer space, or even into Star Wars locations.
Clips 2.0 is still too complicated to feel as lightweight and fun as Photo Booth; Snapchat and Instagram — and, to an extent, Animoji — have that market cornered. I’d like to see Clips receive more frequent updates, but there’s something good here that’s absolutely worth checking out if you haven’t tried Clips recently.
You’ve read Steven Levy’s tour of Apple Park, and you’ve read Christina Passariello’s for the Wall Street Journal. But Apple is still putting the finishing touches on the building, so they invited Nick Compton of Wallpaper to take a look as well. There is, of course, fantastic photography by Mark Mahaney in this article, but I think this bit — about the iPhone X — is profound:
The most advanced iteration of the iPhone, the X, launched with great hoopla at the keynote address, is all screen. Except that’s the wrong way to look at it. The point is that, at least in the way we use it and understand it, it is entirely unfixed and fluid.
I wonder, then, if Ive misses the physical click and scroll of the first iPods, that fixed mono-functionality, the obvious working parts, the elegance of the design solution. But I’ve got him all wrong. ‘I’ve always been fascinated by these products that are more general purpose. What I think is remarkable about the iPhone X is that its functionality is so determined by software. And because of the fluid nature of software, this product is going to change and evolve. In 12 months’ time, this object will be able to do things that it can’t now. I think that is extraordinary. I think we will look back on it and see it as a very significant point in terms of the products we have been developing.
‘So while I’m completely seduced by the coherence and simplicity and how easy it is to comprehend something like the first iPod, I am quite honestly more fascinated and intrigued by an object that changes its function profoundly and evolves. That is rare. That didn’t happen 50 years ago.’
The pitch of the first iPhone was that the fixed plastic keyboards of the BlackBerry, et al., were unchangeable buttons that were there whether you needed them or not. All of that was replaced with an onscreen keyboard, when needed, and a singular “home” button. But, when viewed in the light of only displaying what is necessary, it is striking how — in just ten years — the home button has been reduced to the same level as those plastic keyboards: a fixed button that is there no matter whether it is needed. Nearly the entire user-facing surface of the iPhone X is now as flexible as the bezel-surrounded 3.5-inch display of that original iPhone.
For several years now, the trend among geeks has been to abandon the RSS format.
Has it, though? Sparks doesn’t cite anything to back this up. I’ve seen the occasional tech writer indicate that links surfaced through Twitter are, to a certain extent, equalling those found in their RSS subscriptions, and others who see Twitter increasingly replacing their RSS diet. But to call it a “trend” is, I think, an exaggeration.
I love this argument that Sparks makes, though:
That was never me. The reason I’ve stuck with RSS is the way in which I work. Twitter is the social network that I participate in most and yet sometimes days go by where I don’t load the application. I like to work in focused bursts. If I’m deep into writing a book or a legal client project, I basically ignore everything else. I close my mail application, tell my phone service to take my calls, and I definitely don’t open Twitter. When I finish the job, I can then go back to the Internet. I’ll check in on Twitter, but I won’t be able to get my news from it. That only works if you go into Twitter much more frequently than I do. That’s why RSS is such a great solution for me. If a few days go by, I can open RSS and go through my carefully curated list of websites and get caught back up with the world.
I can’t remember who, but someone once gave me the best tip I’ve ever received for using RSS: subscribe to your must-read websites, plus those websites you like that aren’t updated frequently. It prevents your reader from quickly becoming overwhelming.
Truly, though, this isn’t a case for RSS so much as it is a case for a simple, easy-to-use way to receive updates from the websites you trust and like most. You could theoretically replace “RSS” with “JSON Feed” or “Twitter lists” — whatever works best for you. For news junkies like me, though, there will always be a case for dedicated feeds, without the interruption of non-news tweets or Facebook posts. RSS just happens to be one of the simplest implementations of that.
Most people, if they know Lamarr at all, remember her as an exotic beauty who starred in such movies as “Algiers” (1938) with Charles Boyer, and “Come Live with Me” (1941) opposite James Stewart. But behind those lips and those eyes was the brain of an untrained scientist who, after a long day on the MGM lot, would come home and invent things for pleasure. As one of many screen beauties who dated the eccentric aviator Howard Hughes, Lamarr devised rounded (rather than squared-off) wings for a super-fast plane Hughes was designing. Hughes was so impressed that he set Lamarr up with a mini-laboratory in her house.
Today would have been Lamarr’s 103rd birthday. A film about her life and legacy — “Bombshell” — is being screened at the Boston Jewish Film Festival running now, and will be released in select theatres on November 24.
Equifax has quadrupled spending on security, updated its security tools and changed its corporate structure since the breach, Paulino do Rego Barros Jr., the interim chief, said during a hearing by the Senate Commerce Committee.
But Mr. Barros stumbled when asked by Sen. Cory Gardner (R., Colo) whether Equifax was now encrypting the consumer data it stored on its computers — a basic step in hiding sensitive information from hackers, and one the company previously had admitted it didn’t take before the breach.
“I don’t know at this stage,” Mr. Barros said.
Before this catastrophic breach, your passcode-protected iPhone was more hardened against physical data access than every American’s credit information. Now, who knows? It may still be better-protected.
This is irresponsible to the point of negligence. I sincerely hope criminal charges are brought against Equifax for the results of their indifference towards basic security practices; if no criminal charges apply, it ought to trigger a process to ensure that new laws get written to hold companies accountable for inadequate protection of customer data.
In other news, Equifax reported their quarterly earnings today. Stephen Gandel of Bloomberg:
Equifax’s ability to increase its operating earnings during one of the most disastrous quarters, at least operationally and reputationally, in its history, or the history of most companies, really, attests to how entrenched the business is in the financial system. That will most likely add to the frustration of consumers and their advocates.
All that is probably why Equifax’s stock, which plunged initially after the hack, has rebounded some and been fairly steady. Shares closed at just less than $109 on Thursday before the company announced its results. That’s down from the $143 they were trading at before the hack, but up from the $94 they sank to two days after the hack was disclosed. The stock is amazingly down only 8 percent this year. What’s more, it has a price-to-earnings ratio of 18 times next year’s earnings. That’s not a P/E ratio of a company in jeopardy but one that investors think is highly valued and growing. By comparison, Apple Inc. has a similar P/E of 15.
That’s a hell of an “if” to predicate this entire article on. I did not want to have to deal with two Diaz articles today — one is often enough — but, luckily, the Macalope dismantled that “if”.
So, now that all of the air has been taken out of Diaz’s argument, what is his argument?
You’re looking at a UX disaster, the result of eliminating what is probably the simplest, most intuitive form of navigation ever implemented in consumer electronics: the iPhone’s home button. The iPhone X replaces it with the mess above. This is bad news, because this interaction is a fundamental part of the user experience.
The home button was and is, indeed, a brilliant piece of user interface design. But don’t pretend that it’s completely simple and intuitive: pressing the home button is also how you show the multitasking app switcher, access Siri, dismiss Notification Centre and Control Centre, take screenshots, activate accessibility features, invoke Reachability, and more. Oh, and it’s also used to return to the home screen. Lots of functionality has been packed into that little button.
Joanna Stern’s review for the Wall Street Journal – which still concludes that, “Yes, There Are Reasons to Pay Apple $1,000” – documents what this means in detail: “[T]he lack of a home button means your thumb is about to turn into one of those inflatable waving tube-men outside the car dealership […] you must master a list of thumb wiggles, waves and swipes […] the other gestures, however, are buried. Many moves require almost surgical precision.” Heather Kelly, for CNN Money, adds her own experience: “To fill the void left by the Home button, the iPhone X has added new gestures (the different swipes you make with a finger). The process of learning them is a pain, and some of the new options are more work than before.” The Verge declared that “there’s a whole new system of gestures and swipes to learn and master, and many of them will be annoying to remember and difficult to perform with just one hand.”
Diaz doesn’t link to any of these articles, and for good reason: it’s a rubbish argument. Joanna Stern praises the home button swipe in her piece, and the entirety of her criticisms is quoted by Diaz. She doesn’t make a big deal out of it, likely because her review was published just a day after she received her review unit. Heather Kelly was more muted in her first impressions than many reviewers, but she “[doesn’t] doubt anyone’s ability to master a few new finger movements”.
If you want to switch apps, you either swipe along the bottom of the screen or swipe up and hold — you’ll get a little haptic bump and the app switcher will show up. It took a minute to figure out how to do that move consistently. It took me a little longer to figure out how to consistently use Reachability.
I got my iPhone X last night. The idea that there’s some sort of steep learning curve to this thing is, I think, preposterous. Yeah, there are some decade-old habits I have to break, like when I moved an app around on my home screen this morning and tried pressing on a non-existent home button instead of tapping the “done” button in the upper-right. But the home indicator strip feels completely natural. It’s a testament to the speed and responsiveness of the device and its UI that these gestures feel as smooth and predictable as pinch-to-zoom did on the first iPhone.
Do you have to learn some new stuff? Sure. Will it take a little bit to get accustomed to the device? Absolutely. Is it a “nightmare”, as Diaz frames it in this article’s headline? Hardly.
Back to Diaz:
We knew this was coming, but the reviews and the sudden spike in “how to navigate your iPhone X” tutorials puts a new spotlight on the interaction problems that the elimination of the home button created.
No, it puts a spotlight on content farms that really want to cash in on some sweet Google rankings. There’s a brief three-screen guide when you first set up an iPhone X that demonstrates how to use the home indicator. Once you get used to it, it feels completely natural, particularly if you’ve used an iPad running iOS 11.
Diaz spends another few hundred words quoting writers who made their explanations of other iOS gestures overly complicated, quoting Steve Jobs — hey, remember when people who generally liked using Apple products were Steve Jobs “fanboys”? Times sure have changed — and looking through rose-tinted glasses at the history of the iPhone.
I can’t make Diaz change his mind, no matter how ridiculous his arguments. He thinks iOS 11 “sucks” because UI elements in a few apps are misaligned, that the iPhone X is an egregious excess, and that the replacement of the home button with a handful of gestures makes the device a failure. This is the molehill he wants to die on.
My favourite thing about the release of a well-received Apple product is that there’s a great new product on the market — ideally, they’ve set a new benchmark. My second favourite thing is all the piss-poor takes from the usual suspects, like John C. Dvorak writing in PC Magazine:
The first round of iPhone X reviews are out, and a number of them came from a strange place: amateur YouTubers.
As of November 1, when this piece was published, Apple’s new PR strategy had already been picked apart and scrutinized in excellent pieces from Christina Bonnington and Matt Alexander, among many others. It’s already played out. What can Dvorak possibly contribute? Well, after several paragraphs about how YouTube is new and hip with the youth, he arrives at:
Perhaps Cupertino senses that iPhone X may end up like Microsoft Vista: unfairly criticized.
Windows Vista was too long in the making, removed a litany of features, was too slow on most hardware, was a bloated mix of new ideas and legacy code, and didn’t have nearly enough of the innovative features that were announced years before it was launched. There are forged paintings with a greater attention to historical accuracy than Dvorak demonstrates by calling criticisms of Vista “unfair”.
Chief on my list of complaints is the death of what my son calls The Magic Circle.
Get your crystals and divining rods ready.
The Magic Circle has been around since Steve Jobs introduced the original iPod. On the iPhone, it took the form of the home button, but rounded edges and circles are a favorite design element for Apple; from selecting favorite artists and genres inside Apple Music to that massive spaceship campus.
This is just silly. The primary design and user interaction element of the iPhone was its touch screen. Yes, the home button was important, but the screen was clearly more important for the way that the device is actually used. Don’t believe me? Ask yourself whether you’d rather have an iPhone without a home button, or an iPhone without a multitouch display. There’s a good reason why Apple went with the former option.
The iPhone X is full of rounded edges; it just has one fewer circle on its face.
But it does not exist on the iPhone X. Not even a boot-up screen with ever-expanding circles. So if the iPhone X fails, can we blame the missing Magic Circle? Well, maybe not. A more likely culprit will be that $1,000 price tag.
If I wanted to stretch, I’d point out that the Face ID setup screens use circles extensively, as does its animation. But Dvorak changes tack in the second and third sentences here — apparently, circles are no longer all that important to the iPhone’s success or potential failure. It’s the price, dammit. But, while it is certainly higher than many smartphones, Apple doesn’t seem to think that it will be a problem. They’re forecasting an $84–87 billion October–December quarter, compared to $78 billion for the same period in 2016. Financial results aren’t inherently indicative of a product’s quality, but Apple isn’t forecasting a failure. This isn’t Apple’s Vista.
This week, multiple outlets reported on a Facebook pilot scheme that aims to combat revenge porn. In the program, users send a message to themselves containing their nude images; Facebook then creates a fingerprint of each image and uses it to stop others from uploading similar or identical pictures.
The approach has many similarities with how Silicon Valley companies tackle child abuse material, but with a key difference—there is no already-established database of non-consensual pornography.
According to a Facebook spokesperson, Facebook workers will have to review full, uncensored versions of nude images first, volunteered by the user, to determine if malicious posts by other users qualify as revenge porn.
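For the curious, the fingerprinting approach described above generally relies on perceptual hashing: deriving a compact fingerprint that survives small edits, then comparing fingerprints by distance rather than raw bytes. Here’s a minimal sketch of one such technique — a difference hash — as an illustration of the general idea, not whatever Facebook (or Microsoft’s PhotoDNA) actually uses:

```python
def dhash(pixels):
    """Toy difference hash. pixels: 8 rows of 9 grayscale values (0-255).
    Each bit records whether a pixel is brighter than its right neighbour,
    yielding a 64-bit fingerprint that is stable under small brightness
    changes or recompression."""
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def looks_like(a, b, threshold=10):
    """Flag two fingerprints as 'similar or identical'."""
    return hamming(a, b) <= threshold
```

Because the hash encodes relative brightness rather than absolute pixel values, a uniformly brightened copy of an image produces the identical fingerprint, while an unrelated image lands far away in Hamming distance — which is roughly the property a re-upload filter needs.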
Now, you could make a reasonable argument that Facebook should err on the side of assuming that all images similar to known pornographic images should be hidden from public view when they’re reported as revenge porn. I would make that argument, too. But it seems like Facebook has long abdicated the responsibility of monitoring their platform for these abuses, and they’re having a hard time catching up.
Ultimately, it comes down to whether users can trust Facebook, and a recent survey conducted by Reticle Research and the Verge indicates that Americans simply don’t. Oh, and one more thing:
Zuck: They “trust me”
Zuck: Dumb fucks.
That transcript from over ten years ago will never fail to bite Mark Zuckerberg in the ass.
Marissa Mayer, who led Yahoo until she left earlier this year with a $260 million payout after the web giant was bought by Verizon, wasn’t able to tell senators how hackers were able to steal the company’s entire store of three billion user accounts during a breach in 2013.
Richard Smith, meanwhile, who retired earlier this year after the catastrophic data breach at credit agency Equifax, which affected more than 145 million Americans, couldn’t tell senators who was behind the attack.
I understand that these investigations take time, and that the people involved in these kinds of attacks try to cover their tracks as best they can. What I don’t understand is how, even with prior knowledge, both Yahoo[1] and Equifax[2] failed to take appropriate and responsible measures. We’re allowed to click the “Install Later” button beside system updates all we want, with very few consequences; a major corporation handling unfathomable amounts of data cannot take that risk. So why did they?
Yahoo experienced several security breaches prior to the 2013 one that affected three billion accounts, and several after that as well. ↩︎
The glasses company is cleverly using the iPhone’s camera to map people’s faces, and using that data to recommend styles of glasses that will best fit the wearer. It’s a step beyond the digital try-on system the company has previously offered, which would place a virtual pair of glasses on a picture to let you see how it looks.
I’ve always liked the styles Warby Parker has offered and I’ve been very pleased with the glasses I’ve ordered from them. But the purchasing process where I live is nowhere near as great as it is in the United States: their home try-on kit isn’t available here, and the only retail stores in Canada are both in Toronto.
This is an interesting first step, and I can’t wait to see whether Warby Parker can really commit to augmented reality and offer a truly fantastic virtual try-on experience.
[…] After reading Alex Ross’s article about John Eliot Gardiner and Monteverdi, I went to Apple Music to listen to one of his recordings. The problem is that his ensembles are called The English Baroque Soloists and The Monteverdi Choir. So the number of results that come up when searching for “Gardiner Monteverdi” is stultifying. (Yes, Sir John has recorded a lot of albums.)
Sure, there are two Monteverdi albums in that list, but there is a lot more Bach. To make things worse, this search only returns 21 albums, whereas clicking on the name of the artist on one of these album pages – English Baroque Soloists, John Eliot Gardiner, & The Monteverdi Choir – returns nearly 100 albums. But none of these searches return all the recordings that he made with this ensemble.
Apple’s search engines in Music and Photos aren’t terrible, but they need some work to feel capable and powerful. As an example, if you begin searching for, say, Queens of the Stone Age and tap the suggestion Queens of the Stone Age in Artists, there’s only one result — Queens of the Stone Age. But you have to tap that result to get to their artist page, and that feels slow and cumbersome. If there’s only one result and it’s an exact match, it should just go to the artist page.
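That suggested behaviour is trivial to express in code. A sketch, with entirely hypothetical function and parameter names — this is obviously not Apple’s implementation, just the rule I’m describing:

```python
def resolve_search(query, artist_results):
    """If a search returns exactly one artist and it matches the query
    (ignoring case and surrounding whitespace), skip the intermediate
    results list and jump straight to the artist page; otherwise show
    the results list as usual."""
    if len(artist_results) == 1 and artist_results[0].lower() == query.strip().lower():
        return ("artist_page", artist_results[0])
    return ("results", artist_results)
```

The exact-match check matters: a single fuzzy match shouldn’t teleport you somewhere you didn’t ask for, but a single exact match leaves no ambiguity about intent.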
I also find Apple’s search functionality rather limited. In Photos, for instance, you can search by date, location, keyword, person, or even different objects automatically identified in the photos. But you cannot search by camera model or lens. I get that most people probably wouldn’t use this but, as a digital camera’s make and model is part of every file’s metadata, it almost seems like the kind of thing that requires more effort to omit from Photos’ search engine.
Users of the latest iOS 11.2 beta release received a surprise today in their Messages app picker: the long-awaited Apple Pay iMessage app has now arrived.
Only in the United States, at the moment.
Most of the details of this feature were announced at WWDC, but Christoffel shares additional notes, including all the different access points for peer-to-peer Apple Pay:
While opening the iMessage app to initiate all payments and requests may be the idealized workflow, Apple has included several alternative methods for starting a transaction. You can use Siri to send or request money by voice, using simple commands like ‘Send John $10’ or ‘Ask Federico to send me $10.’ Within the Contacts app, there’s now a Pay button alongside other contact options, which takes you into Messages and opens the Apple Pay app. Inside the Messages app, any message you receive that includes a dollar amount will have that amount underlined, indicating it includes a link to quickly open the Apple Pay app and make a payment. Apple is clearly aware that far more often friends and family send standard messages with the requested amounts included. Lastly, the QuickType keyboard can also serve as a shortcut to initiate a payment.
It makes total sense that this is an iMessage app, but the additional access points ought to help users discover the feature — discoverability is, I think, the biggest hurdle Apple faces against a competitor like Venmo.
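The underlined-dollar-amount detection described in the quote could be as simple as a regular expression scan over incoming messages. A toy sketch — the pattern and function name are my own guesses, not Apple’s code:

```python
import re

# Matches amounts like "$10", "$1,234.50": a dollar sign, digits with
# optional thousands separators, and an optional two-digit cents part.
DOLLAR_AMOUNT = re.compile(r"\$\d+(?:,\d{3})*(?:\.\d{2})?")

def find_amounts(message):
    """Return every dollar amount found in an incoming message, so each
    one can be turned into a tappable link that opens the payment app."""
    return DOLLAR_AMOUNT.findall(message)
```

The real implementation presumably uses iOS data detectors and handles other currencies and locales, but the principle — scan the message, link the amounts — is the same.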
Face ID is one of the hallmark features of the iPhone X. Using facial recognition, you can unlock your phone almost as quickly as if you had no device security enabled at all—all you have to do is stare at it. It’s convenient, and potentially more secure than a four- or six-digit passcode. And because your data is stored in the phone’s so-called secure enclave and not in the cloud (as Apple did with Touch ID’s fingerprint data), the impressively detailed digital map Apple makes of your face, and the more than 50 facial expressions it can recognize, are kept safe. For the most part.
At launch, facial recognition data from Face ID will only be used by Apple to unlock your phone—and animate a handful of goofy emoji characters called Animoji. However, Apple plans to allow third-party app developers access to some of the biometric data Face ID collects. And this has some privacy experts concerned, as Reuters reports.
A stunning twist.
Fun fact: that Animoji link goes to another Slate article with the title “Three reasons why Apple’s iPhone X animojis are worrisome.” Those three reasons are: they are so good that users will be encouraged to use them! in public! with audio! and that can be annoying; that they are so good that they will become a selling tool for the iPhone X; and that the author gets confused about the difference between the Face ID feature and iOS’ ARKit APIs. A distinction which, as it turns out, Bonnington buries in her ostensibly panic-inducing article:
Facial recognition is everywhere these days. It’s how Facebook suggests friends you should tag in photos, how Snapchat’s lenses so masterfully morph onto your face, and how Google Photos can so intelligently collect and organize photos of people you photograph often. Apple already uses facial recognition in its Photos app on iOS, too. But until now, these companies have kept their facial recognition data private. Allowing developers to access some of that data — even if it’s only a rough map of your face and facial expressions, not the full dataset it uses for biometric identification — is new, potentially scary territory.
This is a completely confused paragraph. There is a difference between facial feature identification — the kind that’s used by Snapchat for lenses, Facebook for suggesting faces to tag in photos, and variations of which are available in a bunch of GitHub repos — and recognition of specific faces, like Google and Apple use for tagging specific people in photo libraries.
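As a rough illustration of that second category: recognition of specific faces boils down to comparing a face embedding against a set of known identities. The vectors, threshold, and distance metric below are all made up for illustration — real systems use learned embeddings hundreds of dimensions wide:

```python
import math

def euclidean(a, b):
    """Straight-line distance between two embedding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def recognize(embedding, known_faces, threshold=0.6):
    """known_faces: {name: embedding}. Returns the closest known identity
    if it is within the threshold, otherwise None (an unknown face)."""
    best = min(known_faces, key=lambda n: euclidean(embedding, known_faces[n]))
    return best if euclidean(embedding, known_faces[best]) <= threshold else None
```

Feature identification, by contrast, only locates eyes, mouths, and so on within a frame — it never answers “whose face is this?”, which is precisely why conflating the two muddles the privacy argument.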
Apple uses a very sophisticated version of face recognition to make Face ID work, which they’ve detailed in a security white paper. But the version of face tracking that’s available to developers is not to be confused with Face ID; it is closer to an enhanced form of facial feature identification. Even that has Bonnington worried:
To use your facial data, developers must first ask your permission in their apps, and must not sell that information to other parties. Still, while it’s forbidden under Apple developer guidelines, privacy experts worry that developers might sell this data or use it for marketing or advertising purposes. (Imagine, if you will, an ad-supported gaming app that uses your current facial expression on your avatar. How valuable would it be for an advertiser to monitor what facial expressions you make as you watch their commercial in between rounds of gameplay?)
That would, indeed, be pretty valuable and deeply creepy. Privacy experts are right to be worried about the plausibility of a company using any kind of facial identification data for marketing purposes, and that’s why Apple has prohibited it. And, yeah, they’re going to have to be pretty vigilant about that.
But let’s not pretend that this is a brand new hypothetical concern that’s exclusive to the iPhone X. Theoretically, any app the user has granted camera access could also target ads using one of those open source facial identification libraries I wrote about earlier — something which is, of course, also prohibited by Apple.
The thing that confuses me most about this piece is that Bonnington is a damn good writer. On the same day that this poorly-researched article was published, she also wrote a fantastic take on those YouTube hands-on videos of the iPhone X published Monday last week. Can’t win ’em all, I guess.
If you updated your iPhone, iPad, or iPod touch to iOS 11.1 and find that when you type the letter “i” it autocorrects to the letter “A” with a symbol, learn what to do.
Apple suggests creating a text replacement shortcut to swap the letter I for the letter i. Yeah, really. They also say that they’re going to fix this in an update soon.
This is an utterly ridiculous bug to have escaped Apple’s QA checks and beta testing amongst developers and public beta users. I understand that this seems like an overreaction to a relatively minor bug, but I wasn’t kidding when I wrote last month that input devices should always work. That goes for virtual input devices, too.
With the release of High Sierra and iOS 11 in September, Apple introduced a machine learning-based method to restrict the ability of retargeting scripts to track users across the web. Previously, Safari users could try to prevent this by only allowing cookies from websites the user had explicitly visited — this was the default setting in Safari. Unfortunately, mischievous providers of ad retargeting, like Criteo, figured out a workaround:
Here’s what happens: when visiting a site that includes Criteo’s scripts, a bit of browser sniffing happens. If it’s a Safari variant — and only Safari — Criteo rewrites the internal links on the page to redirect through their domain, […]
The user is then sent to their intended destination page, and Criteo’s cookies are allowed to be set. All that’s needed is that split-second redirect for the first link clicked on the site.
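A minimal sketch of the workaround described above — user-agent sniffing plus link rewriting. Every domain and name here is a placeholder, not Criteo’s actual code:

```python
from urllib.parse import quote

TRACKER = "https://tracker.example/redirect?to="

def rewrite_links(links, user_agent):
    """If the browser looks like Safari, rewrite each outbound link to
    bounce through the tracker's domain, which gets a first-party chance
    to set cookies before redirecting to the real destination."""
    # Naive sniff: Safari UA strings contain "Safari" but not "Chrome"
    # (Chrome's UA string also claims to be Safari).
    is_safari = "Safari" in user_agent and "Chrome" not in user_agent
    if not is_safari:
        return links
    return [TRACKER + quote(url, safe="") for url in links]
```

The key detail is that only Safari users get the degraded, redirect-laden experience — everyone else’s links are left alone, which is what makes the behaviour so plainly targeted at evading ITP rather than any legitimate engineering need.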
Safari’s new tracking prevention mechanism is supposed to prevent this sort of creepy — and, arguably, unethical — behaviour. So, has it worked? Well, here’s what Criteo said in their most recent earnings report:
We believe our solution for Safari users currently allows us to mitigate about half of the potential impact from ITP. In the third quarter, ITP had a minimal net negative impact on our Revenue ex-TAC of less than $1 million. Given our expectations of the roll out of Apple’s iOS11 and our coverage of Safari users, we expect ITP to have a net negative impact on our Revenue ex-TAC in the fourth quarter of between 8% and 10% relative to our base case projections for the quarter. We will continue to improve and deploy our solution for Safari users over the coming quarters.
There’s clearly some effect on the ability of Criteo’s shitty script to work, but they estimate their workaround still mitigates about half of ITP’s impact. Perhaps this is just petty of me, but I wish ITP reduced Criteo’s script to 0% efficacy. The lengths to which Criteo has gone — and will go, according to the last sentence of that quote — in order to track users are an indication that they aren’t following the spirit of users’ wishes.
I’m using Criteo as an example here, but AdRoll employs a similar technique. I think that both of these companies behave disreputably, and I hope Intelligent Tracking Prevention continues to improve so it can better protect Safari users.