Robb Knight:

Mastodon 4.3 released today with a bunch of features but the one most people, including me, are excited about is author tags – this isn’t the name of them but they also don’t seem to have a proper name as far as I can tell. Anyway, you need to do two things to get the “More from X” section you can see in the screenshot above. The first is to add the fediverse:creator tag to your site in your head, which I previously wrote about here.

Knight published this at the beginning of the month when Mastodon 4.3 was released, but the instance where I run my personal Mastodon account was only updated recently. I like this addition. It removes the need for centralized verification — which can be confusing — and allows any publisher to confirm the legitimacy of individual authors and their work at once.
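The tag itself is a single line in a page’s head element. Here is a minimal sketch, in TypeScript, of generating and injecting the tag Mastodon 4.3 looks for; the handle is a placeholder, not anyone’s real account:

```typescript
// Minimal sketch: build and inject the fediverse:creator meta tag.
// "@alice@example.social" is a placeholder handle, not a real account.
function fediverseCreatorTag(handle: string): string {
  return `<meta name="fediverse:creator" content="${handle}">`;
}

document.head.insertAdjacentHTML("beforeend", fediverseCreatorTag("@alice@example.social"));
```

In practice, you would put the tag directly into your site’s template rather than injecting it at runtime; the snippet only shows the expected shape of the markup.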

Joe Rosensteel, Six Colors:

The photographs you take are not courtroom evidence. They’re not historical documents. Well, they could be, but mostly they’re images to remember a moment or share that moment with other people. If someone rear-ended your car and you’re taking photos for the insurance company, then that is not the time to use Clean Up to get rid of people in the background, of course. Use common sense.

There are clearly ways image editing tools can overreach. But Clean Up is one of the cases where comparing its effects to those of Photoshop is valid. It is, in fact, the absence of any retouching tools in Apple’s iOS Photos app which has been conspicuous. What separates Apple’s tool from those available for years in third-party editing apps, though, is its simplicity — you really do only need to circle an area to remove a distracting element, and it often works pretty well.

Regardless of whether Apple’s A.I. efforts are less advanced than those of its peers or this is a deliberate decision, I hope we continue to see similar restraint. Image Playground is not tasteful to my eyes, but at least none of it looks photorealistic.

Jaron Schneider, PetaPixel:

Some might believe that Apple isn’t invested in the future of the [Vision] platform either given the niche appeal or the high price, but after speaking with Della Huff (a member of the Product Marketing team at Apple, who oversees all things Camera app and Photos app) and Billy Sorrentino (a member of the Apple Design Team who works across the company’s entire product line), I left feeling that Apple has every intention of pushing forward in this space.

The two explain that Apple is very much invested in Vision Pro and visionOS because it views the experience they provide as integral to the future of photography. Coming from the company that makes the most popular camera on the planet, that opinion carries significant weight.

I do not mean to be cynical but, well, does it carry significant weight? Of course Apple believes the Vision Pro is the best way to experience photos and videos. The company spent years developing technologies to make the system feel immersive and compelling so, at a minimum, it truly believes the effort was worth it. Also, it is not unreasonable to expect the company to justify the effort, especially after stories about executive retirements and production cutbacks. No wonder Apple is making sure people are aware it is still committed to the space.

But that is not the whole point of this article. The Apple employees interviewed argue — as before — that photos should be representative of an actual event, and that viewing them as an immersive three-dimensional reconstruction is, psychologically, much closer to how our memory works. I would love for some bored neuroscientist to fact-check that claim because — and keep in mind I went to art school, so, pinch of salt — it seems to me to conflict with the known fragility of human memory. My suspicion is that this is one reason we are drawn to fuzzier representations of reality: the unique colour rendering of film, or the imprecision of a needle reading a vinyl record.

I am not being facetious when I write that I am very curious about how well this actually works compared to standard photos or videos or, indeed, actual memories.

Ahmad Alfy:

My first encounter with text fragments was through links generated by Google Search results. Initially, I assumed it was a Chrome-specific feature and not part of a broader web standard. However, I soon realized that this functionality was actually built upon the open web, available to any browser that chooses to implement it.

As someone who writes essays containing citations, I think this is one of the nicest additions to the web, and I wish it were easier to use in Safari. What I, like Alfy, want to be able to do is highlight a specific phrase and copy a direct link to it.

Also, something I often forget is that you can link directly to specific pages of a PDF file by appending #page= and then the page number.

Update: Turns out Rogue Amoeba has a bookmarklet for putting a link to the selected text in your clipboard. Very nice. (Thanks to Nick Vance.)
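For the curious, here is a rough sketch of what a bookmarklet like that might do, using the text fragment syntax. This is my own illustration, not Rogue Amoeba’s code:

```typescript
// Rough sketch of a text fragment link builder (an illustration, not Rogue Amoeba's bookmarklet).
// It takes the current selection and copies a URL ending in #:~:text=... to the clipboard.
function copyTextFragmentLink(): void {
  const selection = window.getSelection()?.toString().trim();
  if (!selection) {
    return;
  }
  const base = location.href.split("#")[0];
  const url = `${base}#:~:text=${encodeURIComponent(selection)}`;
  void navigator.clipboard.writeText(url);
}
```

Any browser that implements text fragments will scroll to and highlight the quoted phrase when such a link is opened.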

Hank Green, who lives in Montana, spotted something weird with his mail-in ballot: in all categories, the Democratic candidate was listed last. That seemed odd. So he looked it up:

Since every district — and there’s a ton of them — is getting a different ballot anyway, the candidates on each ballot — it turns out — are in a different order.

To eliminate bias, the candidates are initially ordered in alphabetical order. But then, it gets shifted by one for every ballot in the sequence.

Clever.
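In code, the rotation Green describes (my reconstruction, not Montana’s actual procedure) is a simple shift of an alphabetized list:

```typescript
// Reconstruction of the rotation Green describes (not Montana's actual code).
// Candidates start in alphabetical order; each successive ballot style rotates the list by one.
function ballotOrder(candidates: string[], ballotIndex: number): string[] {
  const sorted = [...candidates].sort();
  const shift = ballotIndex % sorted.length;
  return [...sorted.slice(shift), ...sorted.slice(0, shift)];
}

// ballotOrder(["Anders", "Baker", "Chen"], 0) -> ["Anders", "Baker", "Chen"]
// ballotOrder(["Anders", "Baker", "Chen"], 1) -> ["Baker", "Chen", "Anders"]
// ballotOrder(["Anders", "Baker", "Chen"], 2) -> ["Chen", "Anders", "Baker"]
```

Across the run of ballot styles, every candidate gets a turn at the top of the list, which is the point.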

Green quotes the saying “everything is a conspiracy theory if you don’t know anything” but, as he is wont to point out, that is negative and unkind. A better version, he says, is “everything is a conspiracy theory when you don’t trust anything”.

I like that.

Mike Masnick, Techdirt:

I’d add a caveat to that as well, though. You have to not trust anything and also not have the intellectual curiosity to find out what’s true. Hank is the kind of person who does have that intellectual curiosity. Even though he was initially concerned, before he spouted off, he did the research and found out that his concerns were unfounded.

I think Masnick’s addition is fair, but also a little redundant. Someone who lacks trust to the degree of believing fantastical tales about the world is also someone who, upon looking things up, will disregard what they are reading. Being intellectually curious requires trust: in others, to provide accurate information; and in oneself, to admit a lack of knowledge and be able to assess new information.

Brian Krebs:

In an interview, Atlas said a private investigator they hired was offered a free trial of Babel Street, which the investigator was able to use to determine the home address and daily movements of mobile devices belonging to multiple New Jersey police officers whose families have already faced significant harassment and death threats.

[…]

Atlas says the Babel Street trial period allowed its investigator to find information about visitors to high-risk targets such as mosques, synagogues, courtrooms and abortion clinics. In one video, an Atlas investigator showed how they isolated mobile devices seen in a New Jersey courtroom parking lot that was reserved for jurors, and then tracked one likely juror’s phone to their home address over several days.

Krebs describes a staggering series of demonstrations by the investigator for Atlas, the plaintiff in a suit against Babel Street: precise location tracking of known devices, or dragnet-style tracking of a cluster of devices, basically anywhere. If you or I collected device locations and shared them with others, it would rightly be seen as creepy — at the very least. Yet these intrusive behaviours have been normalized. They are not normal. What these companies are doing ought to be criminal.

It is not just Babel Street. Other names have popped up over the years, including Venntel and Fog Data Science. Jack Poulson, who writes All-Source Intelligence, has an update on the former:

According to a public summary of a contract signed in early August, the U.S. Federal Trade Commission has opened an inquiry into the commercial cellphone location-tracking data broker Venntel and its parent company Gravy Analytics. […]

Gravy Analytics’ data, via Venntel, is apparently one of the sources for Babel Street’s tracking capabilities.

You might remember Babel Street; I have linked to several stories about the company. This reporting was most often done by Byron Tau, then at the Wall Street Journal, and Joseph Cox, then at Vice. Tau wrote a whole book about the commercial surveillance apparatus. Both reporters were also invited to the same demo Krebs saw; Tau’s story, at Notus, is login-walled:

The demonstration offers a rare look into how easily identifiable people are in these location-based data sets, which brokers claim are “anonymized.”

Such claims do not hold up to scrutiny. The tools in the hands of capable researchers, including law enforcement, can be used to identify specific individuals in many cases. Babel’s tool is explicitly marketed to intelligence analysts and law enforcement officers as a commercially available phone-tracking capability — a way to do a kind of surveillance that once required a search warrant inside the U.S. or was conducted by spy agencies when done outside the U.S.

Cox now writes at 404 Media:

Atlas also searched a school in Philadelphia, which returned nearly 7,000 devices. Due to the large number of phones, it is unlikely that these only include adult teachers, meaning that Babel Street may be holding onto data belonging to children too.

All these stories are worth your time. Even if you are already aware of this industry. Even if you remember that vivid New York Times exploration of an entirely different set of data brokers published six years ago. Even if you think Apple is right to allow users to restrict access to personal data.

This industry is still massive and thriving. It is still embedded in applications on many of our phones, by way of third-party SDKs for analytics, advertising, location services, and more. And it is deranged that the one government that can actually do something about this — the United States — is doing so one company and one case at a time. Every country should be making it illegal to do what Babel Street is capable of. But perhaps it is too rich a source.

Juli Clover, MacRumors:

Disney is no longer allowing its customers to sign up for and purchase subscriptions to Hulu or Disney+ through Apple’s App Store, cutting out any subscription fees that Disney would have needed to pay to Apple for using in-app purchase.

As of writing, a day after Disney made this change, Disney Plus is still listed as a member on Apple’s Video Partner Program page. I wrote about that program four years ago, in the context of Apple seemingly retconning it into being a longstanding and “established” option available to developers of media applications.

Joe Rosensteel:

More importantly, Disney is increasingly concerned with flexible tiers and bundles so that they can charge more. Especially when Disney launches their ESPN service later, which is almost guaranteed to be incredibly expensive. Disney will try to offset that with bundles. I’m sure Disney might even want to toy around with locking people into yearly subscriptions paid on a monthly basis, à la cable TV.

Despite Apple being Disney’s BFF, Disney needs to have infrastructure to handle all these bundles and tiers, which will be very expensive, so why involve Apple acting as a glorified payment processor?

It is hard to feel anything at all, really, about the business decisions of one massive conglomerate compared to another. But Apple’s subscription management is — in a vacuum and distinct from anything else — one of the nicest around, and it ultimately hurts users that it is so unattractive to developers who have other options.

On a related note, the U.S. Federal Trade Commission just announced a final set of rules to make cancelling a subscription as easy as starting it. Michael Tsai:

While it was good that in some cases customers could get easier cancellation by paying for an additional layer such as the App Store, I think it makes sense to just make these bad practices illegal.

I am in full agreement. The FTC’s policies are a good idea and should be copied by every national or supranational consumer protection body.

Hayden Anhedönia, who you might know as Ethel Cain:

I just feel as though there’s a lack of sincerity in the world these days. I speak from personal experience as an artist putting things out into the world, yes, but also as a human being interacting with other human beings on the regular, and I have had my sentiments echoed by many other friends of mine over the past year or so, both artists and non-artists alike. Most of this will be framed through the consumption of art, because that’s my own personal passion in this life of mine, but also the way we interface with each other and process the world around us. […]

I do not know that the internet is becoming less sincere with time, but it sure feels like there is often an unwillingness to engage directly with a topic. How often do you see questions in local subreddits answered with jokes, or stories of horrific events replied to with no emotional intelligence? This has always been an issue in different pockets of the web — and in the real world — but it seems to get worse the larger and looser-knit the group becomes, like when a recommendations-based social media platform yanks something out of context and thrusts it before a whole different audience.

Jokes are fine. But not every reply chain or comment thread needs to become a place to try new bits for a never-to-be-performed standup routine.

danah boyd:

Since the “social media is bad for teens” myth will not die, I keep having intense conversations with colleagues, journalists, and friends over what the research says and what it doesn’t. (Alice Marwick et al. put together a great little primer in light of the legislative moves.) Along the way, I’ve also started to recognize how slipperiness between two terms creates confusion — and political openings — and so I wanted to call them out in case this is helpful for others thinking about these issues.

In short, “Does social media harm teenagers?” is not the same question as “Can social media be risky for teenagers?”

This is pretty clearly a response to arguments pushed by people like Dr. Jonathan Haidt. One thing he often laments is the decline in kids walking to school and, he says, playing outside with relatively little supervision. This is something he also griped about in his previous book “The Coddling of the American Mind”, co-written with Greg Lukianoff. If you start poking around a little, the factors parents cite for their reluctance to allow kids to get to school independently are safety risks: drivers, vehicles, roads, and strangers. You see it in articles from Australia, Canada, Ireland, the United Kingdom, and the United States. These are undoubtedly risks but, as Haidt himself points out in supplemental material for “Coddling”, efforts should be made to “prepare the child for the road, not the road for the child”.

Then again, why not both? Kids can be educated on how to use new technologies responsibly and platforms can be pressured to reduce abuses and hostile behaviour. Legislators should be passing privacy-protecting laws. But, as boyd writes, “I don’t think that these harms are unique to children”. If we design roads which are safer for children, they will probably also be safer for everyone — but that does not eliminate risk. A similar effect can be true of technology, too. (I just finished “Killed by a Traffic Engineer”. I found the writing often insufferable, but it is still worth reading.)

I do not have a stake in this game beyond basic humanity and a desire for people to be healthy. I have no expertise in this area. I find it plausible it is difficult to disentangle the influence of social media from other uses of a smartphone and from the broader world. I am not entirely convinced social media platforms have little responsibility for how youth experience their online environment, but I am even less convinced Haidt’s restrictive approach makes sense.

See Also: On the same day boyd’s essay was published, Dr. Candice Odgers and Haidt debated this topic live.

X on Wednesday announced a new set of terms, something which is normally a boring and staid affair. But these are a doozy:

Here’s a high-level recap of the primary changes that go into effect on November 15, 2024. You may see an in-app notice about these updates as well.

  • Governing law and forum changes: For users residing outside of the European Union, EFTA States, and the United Kingdom, we’ve updated the governing law and forum for lawsuits to Texas as specified in our terms. […]

Specifically, X says “disputes […] will be brought exclusively in the U.S. District Court for the Northern District of Texas or state courts located in Tarrant County, Texas, United States”. X’s legal address is on a plot of land shared with SpaceX and the Boring Company near Bastrop, which is in the Western District. This particular venue is notable as the federal judge handling current X litigation in the Northern District owns Tesla stock and has not recused himself in X’s suit against Media Matters, despite stepping aside on a similar case because of a much smaller investment in Unilever. The judge, Reed O’Connor, is a real piece of work from the Federalist Society who issues reliably conservative decisions and does not want that power undermined.

An investment in Tesla does not necessarily mean a conflict of interest with X, an ostensibly unrelated company — except it kind of does, right? This is the kind of thing the European Commission is trying to figure out: are all of these different businesses actually related because they share the same uniquely outspoken and influential figurehead? Musk occupies such a central role in all of these businesses that it is hard to disentangle him from their place in our society. O’Connor is not the only judge in the district, but it is notable the company is directing legal action to that venue.

But X is only too happy to sue you in any court of its choosing.

Another of the X terms updates:

  • AI and machine learning clarifications: We’ve added language to our Privacy Policy to clarify how we may use the information you share to train artificial intelligence models, generative or otherwise.

This is rude. It is a “clarifi[cation]” described in vague terms, and what it means is that users will no longer be able to opt out of their data being used to train Grok or any other artificial intelligence product. This appears to also include images and video, posts in private accounts and, if I am reading this right, direct messages.

Notably, Grok is developed by xAI, which is a completely separate company from X. See above for how Musk’s companies all seem to bleed together.

  • Updates to reflect how our products and services work: We’ve incorporated updates to better reflect how our existing and upcoming products, features, and services work.

I do not know what this means. There are few product-specific changes between the old and new agreements. There are lots — lots — of new ways X wants to say it is not responsible for anything at all. There is a whole chunk which effectively replicates the protections of Section 230 of the CDA, you now need written permission from X to transfer your account to someone else, and X now spells out its estimated damages from automated traffic: $15,000 USD per million posts every 24 hours.

Oh, yeah, and X is making blocking work worse:

If your posts are set to public, accounts you have blocked will be able to view them, but they will not be able to engage (like, reply, repost, etc.).

The block button is one of the most effective ways to improve one’s social media experience. From removing people from your orbit whom you never want to hear from, even for mundane reasons, to limiting someone’s ability to stalk or harass, its expected behaviour is vital. This sucks. I bet the main reason this change was made is that Musk is blocked by a lot of people.

All of these changes seem designed to get rid of any remaining user who is not a true believer. Which brings us to today.

Sarah Perez, TechCrunch:

Social networking startup Bluesky, which just reported a gain of half a million users over the past day, has now soared into the top five apps on the U.S. App Store and has become the No. 2 app in the Social Networking category, up from No. 181 a week ago, according to data from app intelligence firm Appfigures. The growth is entirely organic, we understand, as Appfigures confirmed the company is not running any App Store Search Ads.

As of writing, Bluesky is the fifth most popular free app in the Canadian iOS App Store, and the second most popular free app in the Social Networking category. Threads is the second most popular free app, and the most popular in the Social Networking category.

X is number 74 on the top free apps list. It remains classified as “News” in the App Store because it, like Twitter, has always compared poorly against other social media apps.

Gian Volpicelli and Samuel Stolton, Bloomberg:

Under the EU’s Digital Services Act, the bloc can slap online platforms with fines of as much as 6% of their yearly global revenue for failing to tackle illegal content and disinformation or follow transparency rules. Regulators are considering whether sales from SpaceX, Neuralink, xAI and the Boring Company, in addition to revenue generated from the social network, should be included to determine potential fines against X, people familiar with the matter said, asking not to be identified because the information isn’t public.

These are all businesses privately owned by Elon Musk; Tesla, as a publicly traded company, is reportedly not being factored into the calculation. According to a Bloomberg source, the Commission is effectively trying to decide whether it should penalize the owner of the business rather than the business itself.

Matt Levine, in Bloomberg’s Money Stuff newsletter:

See, you’re not really supposed to do that: X is its own company, with its own corporate structure and owners; 6% of X’s revenue is 6% of X’s revenue, not 6% of the revenue of Musk’s other companies. But if everyone thinks of the Musk Mars Conglomerate as a single company, then there’s a risk that it will be treated that way.

I can see how the penalty formula should not be stymied by carefully structured corporations. There should be a way to fine businesses breaking the law, even if their ownership is obfuscated.

But that is not what is happening here. As reported, this seems like an overreach to me. Even though Musk himself disregards barriers between his companies, as Levine also documents, a penalty for the allegedly illegal behaviour of X should probably be levied only against X.

Dominic Wellington responded thoughtfully to speculation, including my own, that a device management key for suppressing screen recording alerts in MacOS Sequoia was added in part because of employee monitoring software:

[…] I know perfectly well that these sorts of tools exist and are deployed by companies, but I suspect they are more prevalent in the sorts of lower-paid jobs that don’t rate fancy expensive Macs. This is why I don’t think employee surveillance (or test proctoring, which is Nick Heer’s other example) can be sufficient explanation for Apple walking back the frequency of this notification. Meanwhile, Zoom et al are near-universal on corporate Macs, and are going to be correspondingly closer to top of mind for administrators of Mac fleets.

This is a fair and considered response, and I think Wellington is right. Even though screen recording capabilities are widespread in employee surveillance products, I do not know that they are very popular. I oversold the likelihood of this being a reflection of that software.

Joe Rossignol, MacRumors:

Apple sells two external displays, including the Pro Display XDR and the Studio Display, but neither has received hardware upgrades in years. In fact, the Pro Display XDR is nearly five years old, having been released all the way back in December 2019.

Via Michael Tsai:

This is not surprising, since Apple has historically taken a long time to update its displays. I don’t think the panels necessarily need to be updated. But it’s disappointing because the Studio Display has well documented camera problems and power issues. I had high hopes that, coming from Apple, it would be reliable as a USB hub, but I end up directly connecting as many storage devices as possible to the meager ports on my MacBook Pro.

Displays are a product category conducive to infrequent updates. The plentiful problems I have been reading about with the Studio Display, in particular, worry me. Most sound like software problems, but that is little consolation. Apple’s software quality has been insufficiently great for years and, so, it does not surprise me that a display running iOS is not as reliable as a display that does not use an entire mobile operating system.

Charlie Warzel, the Atlantic:

Even in a decade marred by online grifters, shameless politicians, and an alternative right-wing-media complex pushing anti-science fringe theories, the events of the past few weeks stand out for their depravity and nihilism. As two catastrophic storms upended American cities, a patchwork network of influencers and fake-news peddlers have done their best to sow distrust, stoke resentment, and interfere with relief efforts. But this is more than just a misinformation crisis. To watch as real information is overwhelmed by crank theories and public servants battle death threats is to confront two alarming facts: first, that a durable ecosystem exists to ensconce citizens in an alternate reality, and second, that the people consuming and amplifying those lies are not helpless dupes but willing participants.

On one of the bonus episodes of “If Books Could Kill”, the hosts discuss Harry Frankfurt’s “On Bullshit” which, after they re-read it, disappointed them. They thought the idea was interesting, but were frustrated by the lack of examples and, when they tried to come up with their own, found it difficult to identify ones that were only bullshit and not lies.

I feel as though they missed the most obvious family of examples: all conspiracy theories necessarily become bullshit, if they did not already begin that way. Consider how the theories cited by Warzel begin with a nugget of truth, from which a theory is extrapolated to serve a narrative role — against (typically) Democratic Party politicians, against Jewish people, against scientific understanding, in favour of a grand unifying order that purportedly explains everything. The absence of evidence for a conspiracy theory is, itself, evidence to believers. All of this is steeped in bullshit. Believers in these things do not care to find understanding in known facts; rather, they perceive the world through this lens and bullshit until it all fits.

This story by Warzel documents that trajectory with perfect pitch. It is now politically incorrect in many circles to have beliefs that align with those of experts in their fields. Regardless of what is being discussed, the only safe speech is aggrieved bullshit. In a disaster, however, such speech can be dangerous if people believe it.

Apple in the release notes for MacOS 15.1 beta:

Applications using our deprecated content capture technologies now have enhanced user awareness policies. Users will see fewer dialogs if they regularly use apps in which they have already acknowledged and accepted the risks.

John Gruber:

Why in the world didn’t Apple take regular use of a screen-recording app into account all along?

Benjamin Brooks:

I think this is the question you ask when you have not used a Corporate Mac in the last 4-5 years. For those who are, you know that companies install applications which take screenshots and screen recordings of certain or all activities being done on the Mac. You know, for security.

When users began noticing the screen recording permissions prompt over the summer, I remember lots of people speculating Apple added it because of possible spyware or domestic violence behaviour. That is a plausible explanation.

But Brooks’ keen observation is something I, in hindsight, should have also considered, and I am kicking myself for forgetting about the possibility. I now remember linking to things like employee surveillance software and online test proctoring — applications which effectively monitor users’ screens by force, something one agrees to only because the alternative is changing jobs or not completing an exam. I believe this is supported by — and casts a new light upon — a device management key available to system administrators for suppressing those permissions prompts.

I am not much of a true crime podcast listener, but the first three episodes of “Kill List” — Overcast link — have transfixed me.

Jamie Bartlett:

Besa Mafia was a dark net site offering hitmen for hire. It worked something like this: a user could connect to the site using the Tor browser and request a hit. They’d send over some bitcoin (prices started from $5,000 USD for ‘death by shotgun’). Then they’d upload the name, address, photographs, of who they wanted killed. Plus any extra requests: make it look like a bungled robbery; need it done next week, etc. The website owner, a mysterious Romanian called ‘Yura’ would then connect them with a specialist hitman to carry out the commission.

[…]

In the end, Carl investigated one hundred and seventy five kill requests. Each one a wannabe murderer. Each one a potential victim — who Carl often phones and break the crazy news. “The hardest calls I’ve ever made” Carl tells me. “How do you explain that someone wants you dead?!” (Carl would be indirect, gentle. He tried to make sure the victim felt in control. But often they hung up. “They didn’t believe me. They thought I was a scammer”).

I am not sure I agree with Bartlett’s conclusion — “more and more complex crimes will be solved by podcast journalists” is only true to the extent any crime is “solved” by any journalist — but it does appear this particular podcast has had quite the impact already. What a fascinating and dark story this is.

A bit of background, for those not steeped in the world of WordPress development: there exists a plugin called Advanced Custom Fields (ACF) which allows developers to create near-endless customization options for end clients in the standard page and post editor. It is hard to explain in a single paragraph — the WordPress.com guide is a good overview — but its utility is so singular as to be an essential component for many WordPress developers.

ACF was created by Elliot Condon who, in 2021, sold it to Delicious Brains. At this point, it was used on millions of websites, a few of which I built. I consider it near-irreplaceable for some specific and tricky development tasks. A year later, the entire Delicious Brains plugin catalogue was sold to WPEngine.

Matt Mullenweg:

On behalf of the WordPress security team, I am announcing that we are invoking point 18 of the plugin directory guidelines and are forking Advanced Custom Fields (ACF) into a new plugin, Secure Custom Fields. SCF has been updated to remove commercial upsells and fix a security problem.

[…]

Similar situations have happened before, but not at this scale. This is a rare and unusual situation brought on by WP Engine’s legal attacks, we do not anticipate this happening for other plugins.

This is an awfully casual way of announcing WordPress is hijacking one of the most popular third-party plugins in the directory. Mullenweg cites policy for doing so — WordPress can “make changes to a plugin, without developer consent, in the interest of public safety” — but the latter paragraph I quoted above makes clear the actual motive here. The “security problem” triggering this extraordinary action is a real but modest change to expand a patch from a previous update. But WordPress has removed the ability for WPEngine to make money off its own plugin — and if users have automatic plugin updates turned on, their ACF installation will be overwritten with WordPress’ unauthorized copy.

Iain Poulson, of ACF:

The change to our published distribution, and under our ‘slug’ which uniquely identifies the ACF plugin and code that our users trust in the WordPress.org plugin repository, is inconsistent with open source values and principles. The change made by Mullenweg is maliciously being used to update millions of existing installations of ACF with code that is unapproved and untrusted by the Advanced Custom Fields team.

It is nearly impossible to get me to feel sympathetic for anything touched by private equity, but Mullenweg has done just that. He really is burning all goodwill for reasons I cannot quite understand. I do understand the message he is sending, though: Mullenweg is prepared to use the web’s most popular CMS and any third-party contributions as his personal weapon. Your carefully developed plugin is not safe in the WordPress ecosystem if you dare cross him or Automattic.

I have been trying to stay informed of the hostile relationship between WordPress, Automattic, and Matt Mullenweg, and third-party hosting company WPEngine. Aram Zucker-Scharff put together a helpful and massive set of links to news coverage. Michael Tsai has a good collection of links, too, and Emma Roth and Samantha Cole have published notable articles.

From a distance, it looks like an expensive pissing match between a bunch of increasingly unlikable parties, and I would very much appreciate it if it never affects my self-hosted version of WordPress. Maybe it is a little confusing that WPEngine is not affiliated with WordPress, but I only learned this week that WordPress.org is personally owned by Mullenweg and is not actually affiliated with Automattic or WordPress.com. From Mullenweg’s perspective, this confusion is beneficial, but the confusion with WPEngine is not. From my perspective, I would not like to be confused.

Also, if Mullenweg is mad about WPEngine — and Silver Lake, its private equity owner — benefitting from the open source nature of WordPress without what he feels is adequate compensation, I am not sure he has a leg to stand on. It does not sound like WPEngine is doing anything illegal. It is perhaps rude or immoral to build a private business named after and on the back of an open source project without significantly contributing, but surely that is the risk of developing software with that license. I am probably missing something here.

Well, add XOXO to the list of conferences I was never able to attend. The final edition occurred this year and it looked pretty special.

Happily, if you — as I — were unable to attend in person, Andy Baio has begun uploading videos of this year’s talks. I have watched those from Cabel Sasser, Dan Olson, Molly White, and Sarah Jeong. These are all worth your time — and so are, I am sure, the ones I have not yet seen.

Update: Be sure to watch Sasser’s talk before exploring an amazing archive he is assembling. Seriously — watch first, then click.