A unique consequence of writing about the biggest computer companies, which are all based in the United States, from most any other country is a lurking sense of invasion. I do not mean this in an anti-American sense; it is perhaps inherent to any large organization emanating from the world’s most powerful economy. But there is always a sense that the hardware, software, and services we use are designed by Americans, often for Americans. You can see this in a feature set inevitably richer in the U.S. than elsewhere, language offerings that prioritize U.S. English, pricing often pegged to the U.S. dollar, and — perhaps more subtly — in the values by which these products are created and administered.
These are values that I, as someone who resides in a country broadly similar to the U.S., often believe are positive forces. A right to free expression is among the values these companies have historically espoused for the use of their products. But over the past fifteen years of widespread use, platforms like Facebook, Instagram, Twitter, and YouTube have established rules of increasing specificity and caution to restrict what they consider permissible. That, in a nutshell, is the premise of Jillian C. York’s 2021 book, Silicon Values.
Though it was published last year, I only read it recently. I am glad I did, especially with several new stories questioning the impact of a popular tech company an ocean away. TikTok’s rapid rise after decades of industry dominance by American giants is causing a re-evaluation of an America-first perspective. Om Malik put it well:
For as long as I can remember, American technology habits did shape the world. Today, the biggest user base doesn’t live in the US. Billion-plus Indians do things differently. Ditto for China. Russia. Africa. These are giant markets, capable of dooming any technology that attempts a one-size-fits-all approach.
The path taken by York in Silicon Values gets right up to the first line of this quote from Malik. In the closing chapter, York (228) writes:
I used to believe that platforms should not moderate speech; that they should take a hands-off approach, with very few exceptions. That was naïve. I still believe that Silicon Valley shouldn’t be the arbiter of what we can say, but the simple fact is that we have entrusted these corporations to do just that, and as such, they must use wisely the responsibility that they have been given.
I am not sure this is exactly correct. We often do not trust the judgements of moderation teams, as evidenced by frequent complaints about what is permissible and, more often, what gets flagged, demonetized, or removed. As I was writing this article, reporters noted that Twitter took moderation action against doctors and scientists posting factual, non-controversial information about COVID-19. That erroneous flagging was reversed, but it is another in a series of stories about questionable decisions made by big platforms.
In fact, much of Silicon Values is about the tension between the power of these giants to shape the permissible bounds of public conversations and their disquieting influence. At the beginning of the book, York points to a 1946 U.S. Supreme Court decision, Marsh v. Alabama, which held that a private entity can become sufficiently large and public that it must be subject to the same Constitutional constraints as a government entity. Though York says this ruling has “not as of this writing been applied to the quasi-public spaces of the internet” (14), I found a case which attempted to use Marsh to push back against a moderation decision. In an appellate decision in Prager University v. Google, Judge M. Margaret McKeown wrote (PDF) that “PragerU’s reliance on Marsh is not persuasive”. More importantly, McKeown reflected on the tension between influence and expectations:
Both sides say that the sky will fall if we do not adopt their position. PragerU prophesizes living under the tyranny of big-tech, possessing the power to censor any speech it does not like. YouTube and several amicus curiae, on the other hand, foretell the undoing of the Internet if online speech is regulated. While these arguments have interesting and important roles to play in policy discussions concerning the future of the Internet, they do not figure into our straightforward application of the First Amendment.
All of the subjects concerned being American, it makes sense to judge these actions on American legal principles. But even if YouTube were treated as an extension of government due to its size and required to retain every non-criminal video uploaded to its service, it would be making just as much of a political statement elsewhere, if not more of one. In France and Germany, it — like any other company — must comply with laws that require the removal of hate speech, laws which would be unconstitutional in the U.S. York (19) contrasts that eager compliance with Facebook’s notorious inaction to rein in hate speech that contributed to the genocide of Rohingya people in Myanmar. Even if this is a difference of legal policy — that France and Germany have such laws but Myanmar does not — it is clearly unethical for Facebook to have so inadequately moderated this use of its platform.
The concept of an online world no longer influenced largely by U.S. soft power brings us back to the tension with TikTok and its Chinese ownership. It understandably makes some people nervous that the most popular social media platform for many Americans has the backing of an authoritarian regime. Some worry about the possibility of external government influence on public policy and discourse, though one study I found reflects a clear difference in moderation principles between TikTok and its China-specific counterpart, Douyin. Some are concerned about the mass collection of private data. I get it.
But from my Canadian perspective, it feels like most of the world is caught up in an argument between a superpower and a near-superpower, with continued dominance by the U.S. preferable only by comparison and familiarity. Several European countries have banned Google Analytics because it is impossible for their citizens to be protected against surveillance by American intelligence agencies. The U.S. may have legal processes to restrict ad hoc access by its spies, but those are something of a formality. Those processes are conducted in secret and with poor public oversight. What is known is that the secret court overseeing them rarely rejects surveillance warrants, and that private companies must quietly comply with document requests with little opportunity for rebuttal or transparency. Sometimes, these processes are circumvented entirely. The data broker business permits surveillance for anyone willing to pay — including U.S. authorities.
The privacy angle holds little more weight. While it is more concerning for surveillance data to end up in the hands of an authoritarian government than with advertising and marketing firms, it is unclear that any single app disproportionately contributes to this sea of data. Banning TikTok would not meaningfully reduce visibility into individual behaviours.
Even concerns about how much a recommendation algorithm may sway voter intent smell funny. Like Facebook before it, TikTok has downplayed the seriousness of its platform by framing it as an entertainment venue. As with other platforms, disinformation on TikTok spreads and multiplies. These factors may have an effect on how people vote. But the sudden alarm over as-yet-unproved allegations of algorithmic meddling in TikTok to boost Chinese interests is laughable to those of us who have been at the mercy of American-created algorithms despite living elsewhere. American state actors have also taken advantage of the popularity of social networks in ways not dissimilar from those of their political adversaries.
However, it would be wrong to conclude that both countries are basically the same. They obviously differ in their means of governance and the freedoms afforded to people. The problem is that I should not be able to find so many similarities in the use of technology as a form of soft power, and certainly not for spying, between a democratic nation and an authoritarian one. The mount from which Silicon Values are being shouted looks awfully short from this perspective.
You do not need me to tell you that decades of undermining democracy within our countries have caused a rise in autocratic leanings, even in countries assumed stable. The degradation of faith in democratic institutions is part of a downward spiral caused by internal undermining and a failure to uphold democratic values. Again, there are clear differences and I do not pretend otherwise. You will not be thrown in jail for disagreeing with the President or Prime Minister, and please spare me the cynical and ridiculous “yet!” responses.
I wish there were a clear set of instructions about where to go from here. Silicon Values is, understandably, not a book about solutions; it is an exploration of often conflicting problems. York delivers compelling defences of free expression on the web, maddening cases where newsworthy posts were removed, and the inequity of platform moderation rules. It is not a secret, nor a compelling narrative, that rules are applied inconsistently, and that famous and rich people are treated with more lenience than the rest of us. But what York notes is how aligned platforms are with the biases of upper-class white Americans; not coincidentally, the boards and executive teams of these companies are dominated by people matching that description.
The question of how to apply more local customs and behaviours to a global platform is, I believe, the defining challenge of the next decade in tech. One thing seems clear to me: the world’s democracies need to do better. It should not be so easy to point to similarities in egregious behaviour; corruption of legal processes should not be so common. I worry that regulators in China and the U.S. will spend so much time negotiating which of them gets to treat the internet as their domain that the rest of us get steamrolled by policies that maximize their self-preferencing.
This is especially true as waves of stories have been published recently alleging that TikTok and its adjacent companies have suspicious ties to arms of an autocratic state. Lots of TikTok employees apparently used to work for China’s state media outlets. In another app from ByteDance, TikTok’s owner, pro-China stories were regularly promoted while critical news was minimized. ByteDance sure seems to be working more closely with government officials than the operators of other social media platforms do. That is probably not great; we all should be able to publish negative opinions about lawmakers and big businesses without fear of reprisal.
There is a laundry list of reasons why we must invest more in our democratic institutions. One of them is, I believe, to ensure a clear set of values projected into the world. One way to achieve that is to prefer protocols over platforms. It is impossible for Facebook or Twitter or YouTube to be moderated to the full expectations of their users, and the growth of platforms like Rumble is a natural offshoot of that. But platforms like Rumble which trumpet their free speech bona fides are missing the point: moderation is good, normal, and reinforces free speech principles. It is right for platform owners to decide the range of permissible posts. What is worrying is their size and scope. Facebook moderates the discussions of billions — with a b and an s — of people worldwide. In some places, this can permit greater expression, but it is also an impossible task to monitor well.
The ambition of Silicon Valley’s biggest businesses has not gone unnoticed outside of the U.S. and, from my perspective, feels out of place. Yes, the country’s light-touch approach to regulation and generous support of its tech industry have brought the world many of its most popular products and services. But it should not be assumed that we must rely on these companies built in the context of middle- and upper-class America. That is not an anti-American statement; nothing in this piece should be construed as anti-American. Far from it. But I am dismayed after my reading of Silicon Values. What I would like is an internet where platforms are not so giant, common moderation actions are not viewed as weapons, and more power is in more relevant hands.