When Mark Zuckerberg announced Meta would terminate its major DEI initiatives – from hiring practices to supplier diversity – just days after abandoning fact-checking, he wasn’t just bowing to the “changing legal landscape” his memo cited. He was declaring victory in a much bigger power grab.
For years, Silicon Valley’s tech moguls have systematically engineered a world where truth is optional, equity and justice are expendable, and facts are toxic waste. By dismantling both fact-checking operations and DEI programs – programs that were already facing cost-cutting measures in 2023 – Meta stands to save millions. The move shows that the only responsibility Zuckerberg appears to take seriously is the bottom line. The surprising part isn’t that Meta has stopped pretending to care about anything but its power and profit – it’s that we were ever naive enough to believe it did.
The consequences of this decision will play out globally, and few understand those consequences better than Maria Ressa. The founder of Rappler was among the first to document how social media platforms enabled the rise of authoritarianism in her native Philippines, where Facebook became so dominant that it “rewired our people’s brains.” Ressa, a 2021 Nobel Peace Prize laureate, says “propaganda is like cocaine – you take it once or twice, you’re okay. But if you take it all the time, you become an addict. And we are all addicts today.” When she spoke these words at Coda’s Zeg Festival last June, they felt like a warning. Now they read like a prophecy fulfilled.
Her warning wasn’t just about addiction to propaganda – it was about the deliberate architecture of our digital world. “These tech companies are engineering a world without facts,” she says, “and that’s a world that’s right for a dictator.”
The Engineering of Chaos
Tech pioneer Judy Estrin frames the problem in stark infrastructural terms: “Digital platforms mix ‘digital water’ and ‘sewage’ in the same pipes, polluting our information systems and undermining the foundations of our culture, our public health, our economy and our democracy.”
This pollution isn’t accidental – it’s a feature, not a bug. Meta’s announcement, coinciding with Elon Musk’s open championing of far-right movements in Europe, reveals a profound transformation in Silicon Valley. Tech moguls who once felt pressured to champion openness and truth are now racing to shed any pretense of responsibility.
It isn’t just about catering to Donald Trump and the sentiments of his followers. The shift is about how tech companies view their stakeholders. Where platforms once felt compelled to respond to pressure from employees, users, and advertisers concerned about digital pollution, they’ve now consolidated power solely around profits. The workforce that once served as a guardrail for online behavior has been neutralized – a trend Elon Musk pioneered when he bought Twitter. And Meta’s move to end its ‘Diverse Slate Approach’ to hiring and representation goals, while adding Trump allies like the Ultimate Fighting Championship supremo Dana White to its board, shows exactly where power now lies.
The Infrastructure of Authoritarianism
For years, we’ve analyzed electoral manipulation, documented democratic backsliding, and tracked the rise of strongmen while treating platforms as mere conduits rather than active architects of our political reality. The entire debate around content moderation appears in retrospect to have been a carefully crafted distraction – a game of Whack-a-Mole that kept us focused on individual pieces of content rather than the systemic nature of the problem. As one former Meta employee said, “It’s like putting a beach shack in the way of a massive tsunami and expecting it to be a barrier.”
“Facebook’s ‘fact checking’ initiative was at heart always a PR exercise,” argues Emily Bell, whose research at the Tow Center at Columbia University focuses on the intersection of platforms, media and information integrity. “Nothing has changed about the platform’s mission: to make money from the exploitation of IP and data created for free.”
By abandoning civic responsibility, while disingenuously claiming to be acting in the interests of free speech, Zuckerberg and Musk aren’t so much transforming their platforms as finally being honest about what these platforms have always been: engines of engagement designed to maximize profit and power, regardless of societal cost. The real shift isn’t in their behavior – it’s in our belated recognition that no meaningful conversation about democracy can exclude the role of the broligarchy in shaping our information ecosystem.
The Future of Truth
Tech platforms have wielded the First Amendment much like the gun lobby has wielded the Second: turning constitutional protections into a weapon against regulation and accountability. Just as gun manufacturers claim they bear no responsibility for how their products are used, platform owners insist they’re merely providing neutral spaces for free expression – all while their algorithms amplify lies and fuel society’s most self-destructive impulses.
And we are all complicit. The endless scrolling of TikTok and Instagram, the ease of WhatsApp communications, the ability to instantly connect with friends and family across the globe – these aren’t just corporate products, they’re now fundamental to our daily lives. But in our embrace of this convenience, we’ve sleepwalked into a future where the rejection of facts isn’t just the domain of authoritarian governments in Moscow or Beijing, but of giant tech companies in Silicon Valley.
Many respected journalists and human rights defenders lent their credibility to Meta’s Oversight Board – a body that could review a handful of content decisions but had little effect on the platform’s fundamental design or business model. “The Oversight Board is absolutely the wrong problem [to address],” Ressa says. “They tried to call it the Supreme Court for content. Content is not the problem. The distribution and the rate of distribution is the problem. The design of the platform, none of which they have any power over. But yet they were able to get very credible people.”
The Path Forward
The solutions we’ve pursued – from fact-checking initiatives to content moderation boards – have been mere band-aids applied to a deep systemic wound. As platforms poured millions into lobbying and institutional capture (Meta spent $7.6 million on lobbying the U.S. government in just the first quarter of 2024), we settled for superficial fixes that left their core business model untouched. As long as news organizations treat platforms as mere distribution channels rather than existential threats to information integrity, we will remain trapped in a cycle of ineffective half-measures.
Before journalists point fingers solely at tech platforms, we should also look in the mirror – especially those of us who’ve made careers out of dealing in facts and telling truths. Journalists, researchers, scientists, educators – we’re all part of this story. While tech platforms may at some point be regulated (though good luck with that during a Trump administration), we need to get real about our own role in this mess.
While we must figure out how to work toward systemic change, there is still power in how we choose to engage with these platforms. Every scroll, every share, every moment of attention we give is a choice. By being more conscious about where we get our information, how we verify it, and most importantly, how we pass it on, we can start reclaiming some control over our information environment. Small individual actions – from supporting independent journalism to thinking twice before spreading unverified content – add up to collective resistance against a system designed to exploit our worst impulses.
Those of us in journalism and media must also ask ourselves: Have we been complicit in providing cover for systems we knew were fundamentally broken? Have we prioritized our convenience and digital reach over the integrity of information? Most importantly, are we ready to acknowledge that our industry’s survival, and arguably that of democracy overall, depends on confronting these platforms’ role in undermining the very foundations of factual discourse?
The answers to these questions will determine whether we can rebuild an information ecosystem that serves society rather than corrodes it. But the very first step towards a society in which facts matter and truth has value is admitting that the destruction of truth wasn’t an accident – it was by design.
Disclosure: Maria Ressa serves on Coda’s Board of Directors.