Facebook’s fatal failures in Ethiopia, terrorism on YouTube and the not-so-new problem of toxic Twitter

Ellery Roberts Biddle


Meareg Amare, an Ethiopian chemistry professor, was assassinated last year after people on Facebook called for his killing. War has raged in Africa’s second-most populous country for more than two years, taking the lives of over half a million people, and Facebook has been a key organizing space for opposing factions and paramilitary groups in the conflict. This fact is central to a landmark legal action against Meta, Facebook’s parent company, that was filed before Kenya’s High Court last week. A group of tech-savvy legal advocates from Kenya and the U.K. are representing the plaintiffs, who include Amare’s family, and will argue that the company systematically underserves users in the Global South by allowing hate speech and threats of violence to spread unchecked. The petition calls on Meta to start “demoting violent incitement” regardless of geography — and notes how quickly the company has responded when chaos ensues at home, as it did after the U.S. Capitol riots in January 2021. It also proposes that Meta pay steep fines whenever it fails to remove posts that could bring harm to people, a measure that has succeeded in some jurisdictions — see Germany’s NetzDG, a.k.a. “Lex Facebook.” This and similar legal actions (I’m thinking specifically of lawsuits concerning Facebook’s role in the genocide of Rohingya Muslims in Myanmar) may mark a new chapter in public efforts to hold Big Tech companies accountable for the real-world impacts of their products.

On a related note, here’s a tough question: Should YouTube be held responsible for recommending videos that “radicalize” terrorists? The U.S. Supreme Court is expected to hear oral arguments in Gonzalez v. Google early in 2023, alongside Twitter v. Taamneh, Twitter’s appeal of a case with similar contours. Reynaldo Gonzalez filed suit against Google (the owner of YouTube) after his daughter was killed in a 2015 terror attack in Paris for which ISIS claimed responsibility. Gonzalez says his daughter’s killers were radicalized on YouTube and that the platform’s recommendation algorithm, which constantly suggests new videos for users to watch, played a key role in how this happened.

“Sorry, but that’s not our fault,” says Google. So far, the $1.16 trillion company has successfully argued that Section 230 of the Communications Decency Act — also known as the “26 words that created the internet” — protects it against liability for whatever videos people see when they’re on the site. But what if YouTube’s algorithm suggests that you watch a certain video, and then you decide to kill other people as a result? When the law was inked back in 1996, I’m pretty sure no one was thinking about this possibility. So the court has agreed to hear the case, in a moment when many politicians in the U.S. (and at least one Supreme Court justice, Clarence Thomas) have their sights set on tightening the scope of the law. A smattering of amicus briefs from tech law scholars, engineers and tech industry experts have been filed in recent weeks, promising a stimulating debate around whatever the court decides. We’ll have more on this in the new year.


There seem to be new catastrophes at Twitter every few minutes — hate speech on steroids, media censorship, security vulnerabilities and regulatory doom. Elon now says he wants someone else to take over, but this probably won’t cause any of the big problems at hand to evaporate. After several prominent U.S. journalists had their accounts suspended last week, presumably over their coverage of ElonJet, a few colleagues told me they wanted to quit the platform altogether. With so many people being censored or going silent, that seems fair enough.

But the problems that this special set of mainstream, white liberal Americans are suddenly grappling with are not entirely novel or Muskian in origin. Writing for Thomson Reuters’ Context last week, Pakistani human rights lawyer Nighat Dad, who also sits on Meta’s Oversight Board, remarked on how the dissolution of Twitter’s safety and integrity teams has rattled Westerners. “But in a cruel twist of fate,” she wrote, “it has brought them in step with where the rest of the world has already been—abandoned and ignored by social media platforms.”

Indeed, some very influential voices from communities that have been marginalized on the platform are, so far, sticking it out. University of Kentucky digital studies professor Kishonna Gray, who has long known just how toxic Twitter can be for women and people of color, is one of them. Gray and Chris Gilliard, another preeminent Black scholar of tech, wrote this week for WIRED about the work that Black women have done to build community and social movements on the platform, despite the hate and harassment that came with it. “Twitter changed leadership from one mercurial billionaire to another, and in that regard it affirms that the site was never ‘ours,’ and cements how dangerous it is to think of these systems as the ‘public square’ no matter how many times people refer to it as such.”

This is a good reminder. In the mid-2010s, Twitter, like Meta, Google and other Silicon Valley heavy-hitters, built a set of rules and processes that sometimes made it feel as if we users had some rights or were somehow entitled to due process. Of course, it would be great if the companies really wanted to peg their policies to international human rights standards and accept accountability whenever they failed to live up to those standards. And of course it’s worth pressuring them to do so. But most of the time — absent serious legal challenges like the ones I mentioned above — they don’t have a good enough reason to do this. As Gray and Gilliard say, these spaces are not a public square. We know that the “agreement” we accept when we sign up can change at any time, on the whim of whoever is in charge.


As the year comes to a close, I’m planning to dial it down, curl up on the couch and enjoy some long reads by my dear colleagues here at Coda. Here are the highlights from our new series on the power and politics of nostalgia.

  • Disinformation may have novel forms in the digital age, but it’s nothing new. In her debut piece for Coda, Fiona Kelliher offers a nuanced portrait of Anlong Veng, a district in northern Cambodia where leaders of the Khmer Rouge are still revered as heroes, and many believe that Vietnam — not the Khmer Rouge — was behind the genocide of roughly a quarter of the population. 
  • I’ve always thought of California as being a world unto itself, where Silicon Valley — the ground zero for so much of what I think about each day — happens to exist alongside things that spark wonder, like the Marin headlands, and bewilderment, like the wildfires that engulf thousands of acres of land each year. This all comes together in Erica Hellerstein’s new essay, a deep reflection on grief and the climate in her cherished and forever-changing home state.
  • “India offered me my external identity, Britain my interior one and Kuwait was the metaphorical suburban bedroom in which I played out my fantasies.” Very few people can honestly say this about themselves, but my colleague Shougat Dasgupta is one of them. He takes us from Bombay to Kuwait to Baghdad in this rich and playful meditation on identity, globalization and a search for home.