These problems include static or crackling sounds that increase in loud environments, and issues with active noise cancellation.
Apple said AirPods Pro made after October 2020 don’t have the problems.
As usual, Apple says that the affected units are only a “small percentage” of all AirPods Pro sets sold. However, I know people who are on their second, third, and even fourth set of replacements and who are experiencing the same issue. This service program is being introduced one year to the day after the release of AirPods Pro, and covers all affected units manufactured before this month — which means nearly all units sold are eligible if they have this issue. Kudos to Apple for replacing them for free and, hopefully, this issue will finally be put to bed. However, since these headphones are sealed objects, this problem will cause an awful lot of environmental waste.
The Canadian real estate company behind some of the country’s most popular shopping centres says it is suspending the use of cameras embedded in its mall directories while provincial and federal privacy commissioners investigate their usage.
Cadillac Fairview says it has been using facial recognition software in its mall directories since June to track shoppers’ ages and genders without telling them.
According to the report, the technology Cadillac Fairview used — known as “anonymous video analytics” or AVA — took temporary digital images of the faces of individuals within the field of view of the camera in the directory.
The report said the company also kept about 16 hours of video recordings, including some audio, which it had captured during a testing phase at two malls.
Cadillac Fairview said it used AVA technology to assess foot traffic and track shoppers’ ages and genders — but not to identify individuals.
But the commissioners said that wasn’t good enough and did not meet the standard for meaningful consent.
In context-free terms, that oversimplified disclaimer amounts to an advisory, sure. In the real world, I think we need to establish a clear boundary between surveillance for crime deterrence and image collection for analysis. Intentions matter just as much from a privacy perspective, especially since retailers’ incentives seem to be aligned with collecting far more data if it is to be used for analysis — if the practices of the behavioural advertising industry are anything to go by.
Yesterday, Tim Bradshaw and Patrick McGee of the Financial Times reported that Apple is ostensibly building a rival to Google’s search engine. You can find a syndicated copy of the article at Ars Technica. It left me scratching my head because it undermines its premise on two fronts: it seems to claim that Apple is surely building a true rival to Google’s search engine, and that Apple does not already have a search engine. The first claim does not seem to be substantiated, and the second seems to be contradicted by the article’s own reporting.
Let’s start with the headline:
Apple Develops Alternative to Google Search
“Develops” is a curious and ambiguous choice of word. It leaves the impression that Apple is either currently working on a true Google Search competitor, or that it has already built one. I am not sure which is the case; let’s find out. Here’s the lede:
Apple is stepping up efforts to develop its own search technology as US antitrust authorities threaten multibillion-dollar payments that Google makes to secure prime placement of its engine on the iPhone.
That indicates, to me, that this search engine is something new or more directly opposing Google’s efforts. But it is followed by this paragraph:
In a little-noticed change to the latest version of the iPhone operating system, iOS 14, Apple has begun to show its own search results and link directly to websites when users type queries from its home screen.
This seems to refer to Siri web suggestions that used to only display within the Safari address bar but are now in Spotlight. As far as I can tell, these are exactly the same suggestions but surfaced in a different place.
There are also keyword search suggestions in Spotlight. But tapping on any of those will boot you into the search engine of your choice — whichever you set in Safari preferences.
Both certainly point to Apple shipping a search engine today. It may not be a website with a list of links based on a query, but Google’s search engine is increasingly unlike that, too. So I am left with the impression that this is a service that currently exists, but then the article posits that it is merely a warm-up act:
That web search capability marks an important advance in Apple’s in-house development and could form the foundation of a fuller attack on Google, according to several people in the industry.
Here is where things become more speculative. Bradshaw and McGee make no reference to having any sources at Apple, only quotes from a handful of people in adjacent businesses. Maybe they have background information from people who are familiar with Apple’s efforts, but nothing is cited in this article. The claim that Apple is, perhaps, working on a direct competitor to Google’s web search engine appears to be nothing more than speculation about what Apple could do from people who believe that it is something Apple is doing. That position seems to be predicated on regulatory pressures and recent hires:
Two and a half years ago, Apple poached Google’s head of search, John Giannandrea. The hire was ostensibly to boost its artificial intelligence capabilities and its Siri virtual assistant, but also brought eight years of experience running the world’s most popular search engine.
“They [Apple] have a credible team that I think has the experience and the depth, if they wanted to, to build a more general search engine,” said Bill Coughran, Google’s former engineering chief, who is now a partner at Silicon Valley investor Sequoia Capital.
Apple’s interest in a search engine seems to be a regular rumour, but now that its contract with Google is attracting attention in the United States and United Kingdom, perhaps there is more substance this time around than in previous years. That raises more questions for me from an antitrust perspective: for example, would regulators who questioned the prominence of Siri on Apple’s devices find it equally dubious for the company to have its own search engine presumably set as the default?
Whatever the case, I am not sure this Financial Times piece sheds light on Apple’s path forward. The only substantive fact in this article is that Apple has expanded Safari’s Siri suggestions to Spotlight. Everything else appears to be speculative.
As with many congressional hearings, the point of this one wasn’t really to get answers, but sound bites. No one was readier to add to their sizzle reel than Ted Cruz, the Texas Republican who has done as much as anyone to promote anti-conservative bias as a political issue worthy of debate in Washington. Cruz, appearing remotely, lit into Dorsey for what he considers Twitter’s “egregious” conduct. “Mr. Dorsey,” he snarled, “who the hell elected you and put you in charge of what the media are allowed to report and what the American people are allowed to hear, and why do you persist in behaving as a Democratic Super PAC silencing views to the contrary of your political beliefs?”
The most notable thing about Cruz’s broadside was not its vituperative tone but the fact that it was directed at Dorsey and not the other two CEOs called to testify, Mark Zuckerberg and Sundar Pichai. Indeed, over the course of the hearing, Dorsey fielded more questions from Republicans than those two combined, according to a New York Times tally. And yet Facebook and Google are far more embedded in American life, and play a far greater gatekeeping role, than Twitter could ever dream of. Around 70 percent of American adults use Facebook and YouTube regularly, and Google accounts for some 90 percent of the general search market. Given their dramatically larger user bases, Facebook and Google are far more significant drivers of traffic to media sites. Almost all of my WIRED stories get most of their traffic from one of the two — most often Google, whose monopoly on search makes it the first place readers go to look up a given topic. Banning my stuff from Twitter would be rough, but banning it from Google would be close to wiping it out of existence.
There is a good-faith discussion to be had about Section 230 of the Communications Decency Act, but this was not it. It did not come close. For a brief moment, there was some discussion of Section 230 itself; however, Edelman’s description above nails the tone and substance of today’s hearing: nasty and almost none. Republicans were furious that Twitter made it slightly more difficult to find a discredited tabloid story and, I guess, they believe they have the power to intervene. Or maybe they are just generally angry about technology and wanted an outlet.
Again, all of these complaints were about Twitter, which is a fraction of the size and influence of Facebook, Google, Rupert Murdoch’s publishing and broadcast empire, or internet service providers. The latter have been using a lack of competition or regulation to exploit those working from home, but their CEOs aren’t testifying before Congress.
The award for honesty today went to Brian Schatz, who correctly stated that the hearing was a “sham” and “nonsense”, and refused to participate. It was otherwise a massive waste of everyone’s time, and Americans should demand better.
Talal Haj Bakry and Tommy Mysk, who you may remember from their work on pasteboard snooping, have a new report describing various ways generated link previews in chat apps can compromise privacy and security:
Link previews in chat apps can cause serious privacy problems if not done properly. We found several cases of apps with vulnerabilities such as: leaking IP addresses, exposing links sent in end-to-end encrypted chats, and unnecessarily downloading gigabytes of data quietly in the background.
Apparently, Facebook Messenger and Instagram will download link previews server-side no matter the file size — and, get this, Facebook says that this feature is working as intended. So, if a person wanted to be kind of an asshole to someone, they could just send a direct link to a massive image or video file over Instagram, and this is apparently something Facebook has no intention of fixing.
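For context, the standard defence against exactly this abuse is to check the server-reported size of a linked file before fetching any of it. Here is a minimal sketch in Python of what that guard could look like — the function name and the one-megabyte cap are my own illustrative choices, not anything described in the report:

```python
def should_download_preview(headers, max_bytes=1_000_000):
    """Decide whether a linked file is safe to fetch for a preview.

    `headers` is a dict of response headers from a preliminary HEAD
    request. Refuses when the server does not report a Content-Length,
    or when the reported size exceeds the cap.
    """
    length = headers.get("Content-Length")
    if length is None:
        # Unknown size: refuse rather than risk downloading gigabytes.
        return False
    try:
        return int(length) <= max_bytes
    except ValueError:
        # Malformed header: refuse.
        return False


# A preview generator would issue a HEAD request first, then only GET
# the body when this returns True.
print(should_download_preview({"Content-Length": "250000"}))      # small image
print(should_download_preview({"Content-Length": "5000000000"}))  # 5 GB video
print(should_download_preview({}))                                # no size reported
```

A careful client would also enforce the cap while streaming the actual download, since a hostile server can lie about Content-Length; the header check alone is only a first line of defence.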
Also, there appear to be a handful of services redacted from this report. Stay tuned, I guess.
I was playing around with a few test photos shot on my new iPhone in Lightroom, and this recently introduced feature stood out to me: precise colour adjustments for shadows, midtones, and highlights. It is a replacement for the split toning tool and I think its interface is wonderful. It is easier for you to just try it than for me to explain it — I believe it is part of the free tool set in Lightroom for iOS.
Unlike Stephen Hackett, linked here, I do not plan on writing a full iPhone 12 Pro review. But, as I am upgrading from an iPhone X, you’ll forgive me for sharing a few first impressions since there are many new things.
Like Hackett, I also chose the “silver” model. A few people have suggested that its white back and bright metal frame look somewhat “cheap” compared to other colour options. I get where that idea comes from, but I disagree: the satin back glass looks and, more importantly, feels premium, while the pure stainless edges make it look like a wristwatch. Also, outside of the U.S., there isn’t a plastic mmWave window interrupting the metal.
That said, I bet the metal band would look fantastic if it were bead-blasted, somewhat like the iPhone 4. I hope someone with money to burn will give that a shot.
The size and weight differences are minor on paper compared to the iPhone X, but they are noticeable in the hand. I vastly prefer the flat metal sides in principle but, combined with the wider and taller body, I find one-handed use even harder than the already-difficult X. Perhaps it is just muscle memory but I know that I would prefer the size of the Mini model hands down. I am, unfortunately, a sucker for many lens options.
According to my kitchen scale, the 12 Pro is only fourteen grams heavier, but it is weightier in the hand. All of these factors make me grip this phone just a little tighter than the X. Again, perhaps it is unfamiliarity combined with New Product Dropophobia, but I feel more secure with smaller and lighter devices.
This was the first time I was able to use the device-to-device upgrade method and it seemed to go fine, though both phones became very warm for the two-plus hours it took to move my packrat gigabytes. But I am not entirely sure what it migrated. Apple’s documentation offers few clues other than “all your data”, but that didn’t appear to be the case. Not all of my music was moved over and some of my playlist artwork was replaced by the cover artwork of Ariana Grande’s “Dangerous Woman”. My photo thumbnails appeared to be intact, but I see that my phone is currently one percent of the way towards reconciling iCloud Photos. Most app settings seem to have migrated, but the apps themselves needed to be downloaded — and, because I use several apps through TestFlight, that app needed to be installed before any of those, so my home screen layout wasn’t exactly true to form. Best I can tell, it copied (most) app settings, all system settings, and my wallpaper. I’m not sure why that took over two hours, but it did not seem much faster than the last time I restored from an iCloud backup.
I have not had an opportunity to test much of the new stuff. Coming from an iPhone X, the camera stuff is obviously the biggest leap: Night Mode, the ultra-wide camera, SF Camera typesetting, and better quality all around. I can already tell that the ultra-wide camera is going to be an oft-used addition to my creative palette. However, I was a bit saddened that RAW capture is still not enabled for the ultra-wide camera. When Apple previewed ProRAW, it said that the format would be available on all cameras, so I had baselessly hoped that plain RAW capture might be opened up to the ultra-wide. That does not appear to be the case. I am looking forward to ProRAW, as most of my creative iPhone photography workflow is RAW-based.
The iPhone 12 Pro has double the RAM of my iPhone X, so a pleasing new feature is that using the Camera app — or even Halide in RAW capture mode — no longer kicks every other app out of memory.
I tried a couple of augmented reality things that take advantage of the LiDAR sensor. They seemed to track far better and with more accuracy than the pan-and-scan image mapping method. The phone still gets very warm to the touch, so I am sure this is the reason LiDAR is not used for focusing or Portrait mode in anything except low light conditions.
My provider does not offer 5G.¹ I did not buy a case yet, nor any of the MagSafe accessories, because I already have a Lightning cable on my nightstand and it is fine. I know which way the winds are blowing, but I am not looking forward to the day when MagSafe is a must but not included in the box and I have to sync a hundred-odd gigabytes of music from iTunes over WiFi. I am an edge case but, like many edge cases, it is important to me that I continue to do things in this outdated way.
It is a good phone. I am glad I upgraded, especially since my partner’s 6S expired recently without reasonable possibility of repair. But if I could create my dream iPhone, it would be the size and weight of the Mini with the camera system and RAM of the Pro — or, probably, the Pro Max. The battery life would probably be crap. This is one reason why I am not in charge of such matters.
¹ Over LTE, in the most densely-populated area of Calgary — a little greater population density than the Castro in San Francisco, or the sixteenth arrondissement of Paris — my iPhone X peaked at 88 Mbps, while my iPhone 12 Pro peaked at 156 Mbps.
You would think that, six years after its release, Apple would have patched up that whole “Songs of Innocence” debacle. But the tool that was created specifically for removing it from users’ accounts does not exist any more, so it is still, apparently, an issue for some people. Ludicrous.
Of Apple and U2, you have got to wonder which party most regrets the way this album was forced into the public consciousness.
I don’t know if you noticed, but the visual design language Apple introduced with iOS 7 has now been around on the iPhone for longer than its Mac OS X-like predecessor was. MacOS has twice been redesigned from top to bottom in the post-iOS 7 era, and Apple has introduced a few more operating systems in that timeframe. I am not sure there will be another wholesale redo of the iOS look-and-feel for years to come. But that does not mean there have not been subtle adjustments in that time. I want to focus on a tiny change in how Apple treats the edges of the screen because I think it is indicative of a slight evolution in how it treats the interaction of hardware and software on the iPhone.
iOS 7 was introduced in 2013. At that time, the Android market was making the shift from LCD displays to AMOLED. In fact, if you look at 2013 smartphone roundups, you’ll find an almost even split between models with LCDs and those with some flavour of OLED. But those early OLED displays weren’t great for wide viewing angles or colour accuracy, so every model in Apple’s iPhone and iPad lineup had an LCD display. Those differences materialized in radically different approaches to visual interface design.
In the preceding years, Google gradually shifted Android to darker design elements, culminating in the 2011 release of Android 4.0 with its “Holo” theme. While there were still some lighter elements, the vast majority of interface elements were rendered in white or electric blue on a black field. Since OLED displays control brightness at the pixel level, this created a high-contrast look while preserving battery life. This was important as Android device makers shipped phones with wildly different hardware specifications, often using more power-hungry processors, so Google had to create a design language that would work with configurations it hadn’t even dreamed about.
Apple, meanwhile, has always exercised more control over its hardware and software package. Because every iPhone had a calibrated LCD display with an always-on backlight, Apple’s design language went in the complete opposite direction of Google’s. iOS 7 made liberal use of bright white rectangles and a palette of accent colours. When iOS 7 was released, I wrote that “if Apple used AMOLED displays […] the interface would be much darker”. Indeed, when Apple released the Apple Watch, its first OLED display product, it featured a visual interface language that is (and remains) dark.
After Apple began shipping iPhones with OLED displays, it introduced a systemwide dark mode in iOS. Dark mode is not limited to OLED display devices; non-OLED iPhones and iPads can toggle dark mode, but it doesn’t look as good. You will note that the Apple Watch does not have a light mode and, even today, there are only a handful of watch faces and app screens that allow the edges of the screen to be clearly defined.
The Apple Watch also marked a turning point in hardware design. The screen’s curved cover glass or sapphire blended seamlessly into the metal body. And, because of the true black of an OLED panel, it became difficult to tell where the edges of the screen lay and the black bezel began, creating a sort of infinity pool effect. Apple’s design guidelines explicitly said to treat them as a unified whole:
Apple Watch was designed to blur the boundaries between device and software. For example, wearers use Force Touch and the Digital Crown to interact seamlessly with onscreen content. Your app should enhance the wearer’s perception that hardware and software are indistinguishable.
When Ian Parker of the New Yorker interviewed Jony Ive around the time of the introduction of the Apple Watch, Ive expanded upon this philosophy:
The Apple Watch is designed to remain dark until a wearer raises his or her arm. In the prototypes worn around the Cupertino campus at the end of last year, this feature was still glitchy. For Marc Newson, it took three attempts — an escalation of acting styles, from naturalism to melodrama — before his screen came to life. Under normal circumstances, the screen will then show one of nine watch faces, each customizable. One will show the time alongside a brightly lit flower, butterfly, or jellyfish; these will be in motion, against a black background. This imagery had dominated the launch, and Ive now explained his enthusiasm for it. He picked up his iPhone 6 and pressed the home button. “The whole of the display comes on,” he said. “That, to me, feels very, very old.” (The iPhone 6 reached stores two weeks later.) He went on to explain that an Apple Watch uses a new display technology whose blacks are blacker than those in an iPhone’s L.E.D. display. This makes it easier to mask the point where, beneath a glass surface, a display ends and its frame begins. An Apple Watch jellyfish swims in deep space, and becomes, Ive said, as much an attribute of the watch as an image. On a current iPhone screen, a jellyfish would be pinned against dark gray, and framed in black, and, Ive said, have “much less magic.”
Alan Dye later described to me the “pivotal moment” when he and Ive decided “to avoid the edge of the screen as much as possible.” This was part of an overarching ambition to blur boundaries between software and hardware. (It’s no coincidence, Dye noted, that the “rounded squareness” of the watch’s custom typeface mirrors the watch’s body.) The studio stopped short of banishing screen edges altogether, Dye said, “when we discovered we loved looking at photos on the watch, and you can’t not show the edge of a photo.” He laughed. “Don’t get me wrong, we tried! I could list a number of terrible ideas.” They attempted to blur edges, and squeeze images into circles. There was “a lot of vignetting”—the darkening of a photograph’s corners. “In the end, it was maybe putting ourselves first,” he said.
This philosophy has continued through to present-day WatchOS 7. Consider the app launcher, too: as you drag your finger around, the icon bubbles toward the edges of the device shrink in a way that almost makes it seem like you are rolling your finger around a sphere instead of moving dots laterally.
The same design language was present on the iPhone 6, albeit to a lesser degree of curviness. It may not have been equipped with an OLED panel, but Apple kept developing screen coatings that made blacks blacker and increased contrast. More noticeably, the edge of the glass curved to blend into the rounded edges of the body, establishing the hardware design language for iPhones up until this year.
The original iPhone had a chrome bezel to hide the unresolved connection between the display glass and the aluminum and plastic chassis. Ever since the iPhone 4 and 5, in particular, the physical seams have slowly disappeared. On my iPhone X, there is only a thin black plastic gasket between the face and the stainless steel body, and it is part of a constant curve. My thumb feels only the slightest groove when moving from glass to metal and back again.
iOS’ design language evolved alongside the hardware. The iPhone 6 was introduced in 2014, a year after iOS 7 debuted, but that language felt much more at home on its bigger and curved displays. Notably, the iOS 7 design language incorporated edge-to-edge elements throughout — in notifications, table views, and so on — that seemed to bleed into the bodywork, as did the new gesture for swiping from the left edge of the display to go back.
But iOS has slowly pulled back on the design elements that tried to create an illusion of a continuous product with little separation of hardware and software. Several years ago, notifications, widgets, and Siri results were redesigned as glass bubbles rather than full-width blocks. iOS 13 marked the return of the grouped table view. Even iOS 14’s “full width” widgets are narrower — their size matches the margins of home screen icons, but it also fits a pattern of increasing space around onscreen elements.
Perhaps that is also related to its hardware design language. The bezels of iPhones keep shrinking, so there is a smaller hardware border around the display. This is compensated for in software by increasing the amount of space between onscreen elements and the edge of the display. Otherwise, software elements would appear too wide and overwhelm the physical size of the product — particularly with the larger Max models. It strikes me that these changes mark an evolution of that thinking: the hardware is, effectively, seamless, but it is not edge-less. From a visual design perspective, at least on the iPhone, Apple seems more comfortable delineating the edge of the software and the hardware instead of letting them bleed together to the same extent.
Apple often emphasizes that it develops hardware and software in sync, with the goal of building more unified products. I think we see that in these subtler interface details.
Tesla sent out the first “Full Self-Driving” beta software update to a select group of customers this week, CEO Elon Musk tweeted Tuesday. On an earnings call Wednesday, Musk said more Tesla owners would get the update as the weeks progress, with the goal of a “wide release” by the end of the year.
Only those customers in Tesla’s Early Access Program will receive the software update, which will enable drivers to access Autopilot’s partially automated driver-assist system on city streets. The early access program is used as a testing platform to help iron out software bugs.
I saw this article when it was posted to the Verge’s homepage a couple of days ago. It was in the small bottom row of that large grid they have at the top of the page, and I remember thinking “that is pretty subtle placement for a story about how autonomous transportation is now real”. Well, this being Tesla, it isn’t that at all.
Yes, it does plenty of things and is extremely technically impressive, but it’s still a Level 2 system, a very advanced driver-assist system, not by any means fully autonomous. There’s a lot of confusion about this, confusion that Tesla themselves are causing, and it could be dangerous.
It doesn’t fundamentally matter if it can change lanes, go around objects, follow forks in the road, or whatever. Those are all extremely impressive technical achievements, but the system requires the driver to remain vigilant and ready to take over at any second should the system become confused or fail.
That’s by definition not “full self-driving,” and Tesla’s continued use of that descriptor is dangerous.
Maybe this makes me curmudgeonly, but I feel like it is not fair to the general public for Tesla to beta test on public roads software with a name so misleading it might cause some drivers to believe their cars now drive themselves.
Last summer, [YouTube] started taking active and aggressive countermeasures to block IP-addresses that are frequently used by stream-ripping services to download content. Our source describes these blocks as ‘purges’.
In addition to these purges, Google also removes URLs from search results when they are reported by copyright holders, as we alluded to earlier. This is a frustrating experience for bigger site operators, who have to switch to fresh URLs frequently.
In a world where Google did not own YouTube — and would, therefore, not have legal accountability for the misuse of licensed materials — would it be so keen to comply with copyright-based requests to remove stream rippers from search listings? I wonder.
In related news, the RIAA today demanded that GitHub remove the repository and forks of youtube-dl, a popular command line tool for downloading videos from YouTube and other sources.
Apple today silently removed its “Apple TV Remote” app from the App Store, which lets users control the Apple TV from an iPhone or iPad simulating a real Remote. The app is no longer available for download from the App Store and Apple has likely discontinued it, which means that it will no longer get any updates.
That doesn’t come as a surprise since Apple has added the Remote feature built into the Control Center in iOS 12, so Apple TV users can have access to all the controls on Siri Remote without having to download any app.
This is kind of a bummer because the Apple TV Remote app has actual buttons for previous and next. The Control Centre feature is a more faithful onscreen replication of the Siri Remote, which does not have those buttons.
I often use my Apple TV to play music, through my receiver, with the television switched off. There is no way to navigate songs unless the Now Playing controls are in the foreground. And, given that my Apple TV prompts me to configure the volume control every time I wake it even though the volume control is already configured, the playback controls are never actually in the foreground and it is impossible to skip a song with the Control Centre remote feature.
The moral of this story is that buttons are often good.
Update: Via Jason Robinson, you can control an Apple TV through an iPhone’s AirPlay controls even if you are not AirPlaying from that iPhone, and it includes typical playback controls.
It has been over six months since the last live music performance with singing in Calgary. I understand the necessity of restrictions and, given that I have kept my job and have not lost a loved one to covid-19, I do not wish to complain.
But if there is one single thing I cannot wait for the return of, it is live music. I want to stand right up against the barrier next to the stage getting crushed on all sides by other people and the mosh pit behind. I want my favourite musicians to dive off the stage and onto the adoring crowd singing along with them. I miss music made visceral.
Matthew Perpetua has put together a playlist of twenty hours of live music from hundreds of different artists, available on Apple Music and Spotify. Whatever you’re into, I am sure you will find something in this list that you will love. But it starts with MC5’s “Kick Out the Jams” and I see no reason to skip that.
To replicate the experience of every live show I’ve ever attended, spill some cheap beer on your bathroom floor, put this playlist on pretty loud, turn off the lights, and violently shove your chest into the towel rack.
Josh Dzieza of the Verge has an update on Foxconn’s fake LCD factory in Wisconsin:
In many ways, the Foxconn debacle in Wisconsin is the physical manifestation of the alternate reality that has defined the Trump administration. Trump promised to bring back manufacturing, found a billionaire eager to play along, and now for three years the people of Wisconsin have been told to expect an LCD factory that plainly is not there. Into the gap between appearance and reality fell people’s jobs, homes, and livelihoods.
The buildings Foxconn has erected are largely empty. The sphere has no clear purpose. The innovation centers are still vacant. The heart of the project, the million-square-foot “Fab,” is just a shell. In what an employee says was a final cost-cutting measure, only the portion that was to host the Trump visit was ever finished. Recent documents show the “Fab,” once intended for use as manufacturing, has been reclassified as a massive storage facility.
WEDC, as part of its audit of the company’s 2019 subsidy application, had Foxconn survey its employees about what they were working on. Not a single respondent mentioned LCDs because no one is working on LCDs, and they never were.
Every time I think about this story, I remember the episode of “Reply All” from a couple of years ago about the town relationships fractured by the fight over this factory. It is a profoundly selfish and cynical landmark, and it is heartbreaking to think about all of the people who were forced out of their homes to make way for it.
That podcast episode spends a fair bit of time covering Dave DeGroot, village president of Mount Pleasant, as he secretly fights to land the Foxconn factory. Dzieza’s article doesn’t mention DeGroot, but I found an article from last month in the Racine Journal Times by Eric Johnson claiming that local officials were “pleased” with Foxconn’s efforts:
“It continues to roll along at its own pace,” Mount Pleasant Village President David DeGroot said of the Foxconn development. “It’s been an interesting ride. Their focus changed somewhat along the way, but I’m thrilled and filled with anticipation at the breadth and depth of the product that they’re going to be producing in Mount Pleasant. We’ve come a long ways from the initial focus on just large screen video monitors. They’ve got their fingers in every market segment that’s out there and we’ll see the fruits of that as time goes on.”
“It (Foxconn) will continue to have a ripple effect, not only in Mount Pleasant, but really throughout southeastern Wisconsin,” [DeGroot] said. “We are going to become known as a technology hub.”
This article mentions that part of the factory was apparently retooled to make masks and other pandemic-related goods, something which Dzieza’s piece does not mention. But while DeGroot is putting an optimistic spin on the situation his village is now in, it is difficult to reconcile that pride with Foxconn falling roughly 90% short of its capital investment targets.
Small gripe town: WordPress is replacing Adobe TypeKit fonts with Google Fonts. Plus:
1. This e-mail’s subject was “Improvements to WordPress Custom Fonts”
2. It doesn’t at all explain WHY
3. Please don’t tell a designer “these fonts are very similar to the ones you have”
This seems like a quiet way to announce a change that may have a big impact on some websites, but it appears to have been a pretty quiet feature in the first place. Apparently, it has been available for about ten years on Premium plans and up, but I went back through the last few years of WordPress.com’s pricing page and it doesn’t once mention Typekit or Adobe Fonts support. If you search the support site for references to either, you’ll find only two cached snippets confirming that Typekit fonts were available on Premium plans. Now, the only way to specify non-Google custom fonts is to use a much pricier Business plan.
Eric Peacock, in reply to Sasser’s tweet, linked to a piece by Mike Rankin in Creative Pro about an adjacent issue of type families in Adobe Fonts becoming, euphemistically, “retired”:
On June 15, 2020 a number of fonts will be retired from Adobe’s Creative Cloud. In total, about 50 families/700 fonts from the foundries Font Bureau and Carter & Cone will no longer be available to sync. You can find more details in this post on the Adobe Support Community site.
Adobe Fonts can be a fantastic resource for use with the company’s Creative Cloud apps, as it means designers do not have to justify hundreds or thousands of dollars in typography-related billing, which can be especially painful in smaller organizations. It can often be difficult to explain to clients why a particular typeface is warranted instead of a lower-cost or free option, and Adobe Fonts offers a reasonable solution.
According to Adobe, fonts that it has pulled from its catalogue will continue to function for published web projects. That is pretty terrific. But organizations should be careful that their identities do not depend on Adobe’s ability to negotiate licenses with type houses.
A couple weeks ago, a customer reported that my fonts didn’t work in Microsoft Office. I looked up his order. He had bought the fonts in 2015. I asked: have they not worked for five years? Oh no, he said, they were working fine until September 28.
So what happened on September 28?
[…] I theorized that there was probably a buggy Microsoft Office update that went out to his computer that day. (True.) And that rolling back to the previous version would cure the problem. (Also true.)
The cloud-connected computing era is a real mixed bag of compromises. We can now use software for a predictable monthly cost and cancel at any time, and we get regular and frequent updates. Much of this software is either browser-based or cross-platform, so the effects of lock-in are reduced. We have access to massive libraries of fonts, music, movies, and television shows entirely on demand. It costs us nothing extra to experiment with so much choice.
It also means that our computers must be either constantly or regularly connected to the internet. There are often non-optional software updates that we must install before being able to do anything else. Those updates often introduce new bugs that may or may not be corrected in a later update, and frequently contain redesigns or different layouts that require us to re-learn something. If we stop paying for software, we lose access to it and, sometimes, files created in its proprietary formats. Our access to a specific piece of media is dependent on whatever licensing agreement a multinational company has been able to strike with another. Despite having near limitless choice, we still watch the same shows.
Benjamin Mullin and Joe Flint, Wall Street Journal:
Quibi Holdings LLC is shutting itself down, according to people familiar with the matter, a crash landing for a once-highflying entertainment startup that raised $1.75 billion in capital.
One of Quibi’s big selling points was its library of shows that could only be watched on users’ mobile phones. That feature proved to be a liability when the pandemic struck, sending viewers into their homes to watch shows on TV. Quibi eventually allowed subscribers to watch its shows on their TVs.
Today is the end of that sentence. Literally: today is the date when Quibi added television apps.
Quibi was founded in August 2018 and launched on April 6 of this year. Not even two hundred days later, the Journal is reporting that it is shutting down. Even if you count from the company’s founding, that has got to be some sort of record for burning through nearly two billion dollars and having nothing to show for it.
Quibi’s quick-bite originals are now streaming on three connected-TV device platforms — Apple TV, Amazon’s Fire TV and Google TV/Android TV. But there might be nothing that can save Jeffrey Katzenberg’s struggling mobile subscription-video startup at this point.
Boy, that sure sounds ominous. At least you can now watch Quibi’s short shows on your television.
On a personal note, this is the first thing I have written about Quibi since it launched in April, so I guess I had better start a “Quibi” tag to keep track of all my posts on the company.
Writing the last post left me feeling like I needed a lighter counterpoint. So, here it is: the latest betas of iOS 14.2 and iPadOS 14.2 have a few new wallpapers. Some are mountain landscapes; the others are illustrations. All of them come in day and night variations for light and dark modes, respectively.
The reason I bring this up — other than because some of these new images are very nice — is that it is a reminder that Apple has not added new Live and Dynamic wallpapers in years. Maybe the former has something to do with the removal of 3D Touch, which would be fair, but I am surprised by how long the Dynamic wallpapers have sat unchanged. They are basically the same as the ones included in iOS 7, and they are all pretty dull.
I know a lot of people use their own photo as a wallpaper, but I see many people using one of Apple’s. Updating its collections of motion wallpaper seems like such an easy way to give users a new-feeling device and inject a bit of whimsy into the system.
One of the overarching themes about the many terrible things the current U.S. administration is responsible for is how trust in government has fractured. You can say that to some degree about any democracy at just about any time, but few have created and then exploited such deep cracks, particularly along ideological lines. Not only does this administration have no interest in reaching across the aisle, it actively promotes seeing policy as competition and Americans who are not deferential to it as enemies.
That is what makes the antitrust complaint against Google, filed today by the U.S. Department of Justice and eleven state Attorneys General, so uncomfortable. The current U.S. Attorney General is William Barr, who has a long history of covering up possible government crimes, is partly responsible for the tear gassing of Americans for a photo op, and has demonstrated total loyalty to the current President instead of the office or his country. Before being sworn in as Attorney General in February 2019, Barr was a longtime in-house counsel for Verizon and served on the board of directors of Time Warner. Meanwhile, all of the state Attorneys General who are backing Barr’s suit against Google are Republicans.
This is important because the DOJ has brought only one other monopolization case in the last decade, spanning the Obama and Trump administrations. This is true despite high-profile mergers and acquisitions happening at a blistering pace, particularly in the telecom space. The bar must be pretty high for what the DOJ considers a violation of antitrust law worth litigating.
You can read the suit (PDF) if you’d like — it’s not very long and it contains some startling claims. In short, it argues that Google maintains dominance in search partly as the result of allegedly illegal agreements with browser and device makers, and that it abuses its monopoly in its advertising business. Perhaps the most shocking claim in the suit is this:
Apple has not developed and does not offer its own general search engine. Under the current agreement between Apple and Google, which has a multi-year term, Apple must make Google’s search engine the default for Safari, and use Google for Siri and Spotlight in response to general search queries. In exchange for this privileged access to Apple’s massive consumer base, Google pays Apple billions of dollars in advertising revenue each year, with public estimates ranging around $8–12 billion. The revenues Google shares with Apple make up approximately 15–20 percent of Apple’s worldwide net income.
Apple’s RSA incentivizes Apple to push more and more search traffic to Google and accommodate Google’s strategy of denying scale to rivals. For example, in 2018, Apple’s and Google’s CEOs met to discuss how the companies could work together to drive search revenue growth. After the 2018 meeting, a senior Apple employee wrote to a Google counterpart: “Our vision is that we work as if we are one company.”
The current version of the Google–Apple agreement substantially forecloses Google’s search rivals from an important distribution channel for a significant, multi-year term. This agreement covers roughly 36 percent of all general search queries in the United States, including mobile devices and computers. Google estimates that, in 2019, almost 50 percent of its search traffic originated on Apple devices.
There seem to be some well-founded complaints in this case, though Google sees it merely as consumers picking its stuff over the competition. But even its strongest accusations are deeply undercut by the weaponizing of antitrust law for political reasons by Barr and his Republican Party colleagues.
So despite what folks like Josh Hawley and Ted Cruz would have you believe, there’s no evidence that monopoly power has ever been a genuine concern for the modern Trump GOP (simply look at its treatment of telecom, airlines, banks, and countless other heavily consolidated and monopolized sectors that routinely churn out a steady stream of consumer and competitor nightmares). And yet folks who’ve built entire careers on the backs of not giving a flying shit about corporate power, consolidation, and monopolization will now get to spend two weeks before an election pretending otherwise.
I understand those of you tut-tutting me for diving into politics in this post, and I know you are frustrated by the lack of escape from the endless election-related articles. I am sorry about that.
But you should be aware that this case is not being brought out of genuine concern for the effect of Google’s market power on the general public and small businesses. I cannot assess the viability of these claims because I am not a lawyer; I have seen some coverage that suggests this is a hack job, and some that sees it as a thoughtful suit. Maybe this really is the best the Department of Justice can do and it will prevail in court. But this administration’s exploitation of cracks in public trust makes it impossible to see the suit absent the subtext that it is a political cudgel.
The embargo lifted today on reviews of the iPhone 12 and iPhone 12 Pro, and the consensus seems pretty clear: the hype over 5G is unwarranted so far, the industrial design is pretty much the best, the modestly improved cameras are great, the slightly bigger screen sizes are offset by smaller bezels and a thinner chassis, and the “regular” iPhone 12 has so much in common with its same-size Pro sibling that many people will not find the upgrade worth the extra money.
That last point is worth emphasizing. Like the release of the iPhone 8 and X, and the XR and XS, the new iPhone lineup has two different availability dates. The Mini and the Max will be released in a couple of weeks; the iPhones available this Friday are the middle models of identical size and, therefore, are the most easily muddled. If you want a smaller screen, you buy the Mini. If you want the biggest screen and, on paper at least, the best camera, you buy the Max. But it is harder, I think, to choose between the identically-sized and similarly-specced middle models. So it is notable that Apple seeded reviewers with both of them and, in many cases, that lent itself to direct head-to-head comparisons.
I read only a few reviews today but I figured something out about myself: the telephoto camera alone makes the cost of the Pro worth it for me, but that also kind of makes me a sucker. The non-Pro iPhone 12 seems to be just as capable, just as fast, and comes in bolder colours.
Most people won’t be on superfast 5G, and will find the battery life on these phones to be solid. They lasted a full day of fairly heavy use—though fell a bit shorter than the iPhone 11, which consistently leaves me with at least 15% before bed time.
This complaint about slightly shorter battery life for both models seems to be consistent among the reviews as well. It reminded me of comments that “‘Daring Fireball’ blog creator” John Gruber made on CNBC before Apple’s announcement event:
I think one of the biggest problems people have [with their existing phones] is battery life. I think it always has been and will be for the foreseeable future. […] I think, if you said “this phone gets faster cellular networking and this other phone gets twice the battery life” everyone would jump on the one with battery life.
And then, I think, another factor is photographic quality. Everybody wants their pictures and videos to look better. Those are, to me, the two simple and obvious things.
I am not arguing that none of the improvements in this iPhone lineup are worth it, and I understand there are limitations of battery chemistry and physical space. Compromises will be made. But, yeah, if this year’s new iPhones could eke out a couple of hours more battery life instead of adding 5G, I would be happy to make that trade-off.
Update: The Tom’s Guide battery life test found that the iPhone 12 Pro got longer battery life over 4G than the 11 Pro, but the 12 got shorter battery life than the 11. The latter makes sense, as it is a smaller-bodied phone that, presumably, has a smaller battery. The former does not match the results of many other testers, but it is encouraging.