How the new UK tech law hurts Wikipedia

Ellery Roberts Biddle


It has been an incredibly difficult three weeks in the world, and the internet shows it. In the last couple of newsletters, I’ve noted just how hard it is to find reliable information on the social web right now, where everything seems to revolve around attention, revenue and shock value, and verified facts are few and far between. So this week, I’m turning my attention to a totally different part of the internet: Wikipedia. 

It’s been on my mind lately because of the proposed new online safety law in the U.K. that would set strict age requirements for young people online and require websites to scan for and somehow remove all content that could be harmful to kids before it appears online. In a recent blog post for the Wikimedia Foundation — the non-profit that supports Wikipedia — Vice President for Global Advocacy Rebecca MacKinnon wrote that by requiring sites to scan literally everything before it gets posted, the bill could upend the virtual encyclopedia’s bottom-up approach to content creation. As she put it, the law could destroy Wikipedia’s system “for maintaining encyclopedic integrity.”

You may be wondering precisely what “encyclopedic integrity” means at Wikipedia, where the article on the Marvel Comics character Spider-Man cites almost twice as many sources as the article for the Republic of Chad, a country of an estimated 18.5 million people. I get it. Wikipedia, by its own admission, has had problems with an overrepresentation of the interests of nerdy white male American 20-somethings who have too much time on their hands. But these people also really care about what they post online, and they have created an effective cooperative system for collecting, verifying and building knowledge. The system is totally dependent on the goodwill of thousands of contributors, and it is wholly decentralized — there are Wikipedia communities across the globe that share some basic principles, but decide together how they’ll handle contributions that could violate the law, offend readers or anything in between. In sharp contrast to corporate social media spaces, where attention is the driver of all things, this is a totally different way to “scale up” — more like scaling out — and it has led to a dramatically different kind of information resource.

I recently spoke with two Wikipedia volunteers in Wales, who are seriously worried about the effects that the U.K. bill might have on Wikipedia’s Welsh-language site, which is the only Wikipedia community that exists almost entirely within the jurisdiction of the U.K. Robin Owain and Jason Evans explained to me just how essential Wikipedia has become for Welsh speakers — with 90 million views in the last 12 months, Welsh Wikipedia is the largest and most popular Welsh-language website on the internet. Young people are a big part of this, and the secondary school system in Wales works actively with the community to engage high school students in building up material on the site. 

For Owain and Evans, this is fundamental to their purpose. “We want young people to feel as though the internet’s something that you can interact with,” Evans said. But the U.K.’s new online safety law could take that away. The two surmise that once the bill is enacted, it will be nearly impossible to allow people under 18 to contribute to the site. It could, as Evans put it, “really reinforce the idea that the internet is just a place to get information, that it’s not something you can be a part of.” 

They also worry that the bill’s requirements regarding content could leave contributors fearful of violating the law. “If there’s anything contentious, anything that has adult themes or strong language, no matter how true something might be, or how factual, there will be a concern that if it’s left on Wiki, there’s a risk that young people will see it and we’ll fall foul of the bill,” said Evans. “That in itself does create an atmosphere where you are essentially censoring Wikipedia, and that goes against everything Wikipedia is about.”

It also stings, the two noted, since the U.K. bill was written with the biggest of Big Tech companies in mind. For some reason, its authors couldn’t be persuaded to make a carve-out for projects like Wikipedia. But Owain has some hope that Welsh people and the Welsh government — a Labour Party-dominated body that does ultimately answer to the British parliament — just might have something to say about it.

“I should think the whole of Wales would stand up as one and say, ‘Oh! We will access Wikipedia!’ and the Welsh government will support it,” Owain said, raising a fist in the air. I hope he’s right.

Pro-Palestinian messages are getting shadowbanned and horribly mistranslated on social media. Over the past two weeks, multiple journalists, artists, Instagram influencers and even New York Times reporter Azmat Khan reported that their posts containing words like “Palestine” and “Gaza” simply weren’t reaching followers. To make matters worse, a handful of Instagram users found that the platform was spontaneously inserting the word “terrorist” into its machine translations of the word “Palestinian” from Arabic to English. This reminds me of 2021, when the Al-Aqsa Mosque in Jerusalem was mistakenly labeled as a “dangerous organization” by the same platform. The takeaway here is that Meta, Facebook and Instagram’s parent company, has told its computers to use things like the U.S. government’s list of designated terror groups in order to identify content that could spark violence. This might sound reasonable on the surface, but when you throw in a little artificial intelligence and some plain old human bias, it can get ugly.

Meta has a long history of mistreating speech about Palestine, and while the company is always quick to blame the tech (it’s a “glitch,” the execs say), the evidence suggests that it is not that simple. Between the U.S. government’s list of designated terror groups, Meta’s own list of “dangerous individuals and organizations,” the EU’s Digital Services Act, soft pressure from the U.S. and Israel alike, and a set of community standards that seems to get more complicated by the day, it seems like the decks are stacked against Palestinians who are just trying to say what they feel right now. I will keep my eyes peeled for further “glitches” in the weeks ahead.

Venezuela saw a smattering of web outages over the weekend, during the political opposition’s presidential primary election, the first to be held since 2012. This was no ordinary vote — public trust in the country’s electoral system is extraordinarily low, due to a history of election fraud allegations and the ruling United Socialist Party’s routine efforts to block bids by its opponents. Opposition organizers created an independent entity, the National Primary Commission, to oversee the election and set up polling places in churches and at people’s homes, rather than using publicly managed buildings like schools and community centers. Over the weekend, the network monitoring group NetBlocks documented huge drops in connectivity in Caracas, and Venezuela Sin Filtro, a censorship monitoring group, reported that websites that listed polling places were inaccessible on most telecom networks. The group also presented evidence that the systems used to count the votes — an estimated 1.5 million people cast their ballots, both inside and outside the country — were hit with cyberattacks. Out of a crowded field, María Corina Machado, a conservative former lawmaker, had won more than 90% of the votes counted by mid-week.

Apple has a problem with Jon Stewart. Last week, the cherished TV comic abruptly canceled the third season of “The Problem with Jon Stewart,” his show on the streaming service Apple TV+, after the company reportedly pushed back on the script for an episode in which he planned to discuss AI and China. We don’t hear much about Apple in stories about content control and Big Tech, but between the App Store, Apple TV+ and Apple Podcasts, the company has a huge amount of discretion over what kinds of media and apps its users can most easily access. And when it comes to China — home to the Foxconn factory where half of the world’s iPhones are manufactured — the company has often been quick to bow to censorship demands. There’s been no further information about what exactly Stewart had planned to talk about, but it’s easy to imagine that it might have had Apple’s overlords worried about offending their Chinese business partners.


  • My friend Oiwan Lam, an intrepid Hong Konger who has kept her ear to the ground and her finger on the pulse of the Chinese internet through all the political ups and downs of the past decade, translated a fascinating exclusive interview by a YouTuber known as Teacher Li with a censorship worker from mainland China. Give it a read.
  • In a new essay for Time magazine, Heidy Khlaaf, who specializes in AI safety in high-stakes situations, says we should regulate AI in the same way we do nuclear weapons.
  • The fraud trial of Sam Bankman-Fried, founder of the cryptocurrency exchange FTX, is now well underway in New York. This piece in The Ringer puts you right in the courtroom.