Popular Russian fact checkers “don’t do politics” but push Kremlin propaganda

Ivan Makridin


As a young Russian freelance journalist who has gone into exile, I have recently become alarmed by an insidious form of Kremlin-backed disinformation. Among my friends still in Russia — skeptical, reasonable people who have plenty of doubts about an unjust war of attrition being fought in their name in Ukraine — several have forwarded posts from “War on Fakes,” a “fact checking” site that claims to be restoring balance to a western media narrative that seeks to demonize Russia and Russians. 

These friends of mine often reach for a verbal construction like: “Yes, I’m against this war, but.” And then they point me to a War on Fakes fact check.

War on Fakes appeared on February 24, the day Russia invaded Ukraine, as if it had been primed to serve as a combatant in the inevitable information war. Its Telegram channel has steadily gained hundreds of thousands of followers (over 700,000 at last count), and the site’s anonymous fact checkers have published dozens of “verifications” of the actions of Russian troops in Ukraine.

Independent Russian and international media have connected War on Fakes to the Kremlin, mainly because the channel’s fact checks mostly echo the narratives and preoccupations of official Russian propaganda. A lengthy recent investigation published by Poynter shows that many, if not most, War on Fakes fact checks are themselves full of fabrications and errors.

The administrators of War on Fakes declare that they “don’t do politics” and that their “mission is to make sure there are only objective publications in the information space.” Yet nearly every fact check on the site is intended to portray Russia as a victim of disinformation.

If War on Fakes — its fact checks published in English, French, Spanish, German, Chinese and Arabic — has its eyes on an international audience, arguably its most telling effect is on doubting citizens inside Russia: people, like my friends, who may criticize the authorities’ actions but are not dissidents per se. They can’t (or don’t want to) emigrate and, after almost six months of living with the war, are trying to get on with their lives and grasping for ways to justify that choice.

Ilya Ber, a journalist and the creator in 2020 of an independent Russian fact checking site called Provereno.Media, has been the subject of investigations by the Russian authorities for frequently contradicting the official narrative on the war in Ukraine. He told me that War on Fakes is intended to “not give truthful versions of events but to blur the signal, to give the impression that everyone, not just Russia, is lying.”  

War on Fakes occasionally runs international fact checks. For example, it claimed that according to “Ukrainian sources, Syria was subjected to massive shelling by Turkish artillery.” This is “not true,” the unnamed fact checkers wrote. “Turkey has launched an anti-terrorist military operation against the Kurds in northern Syria. President Erdogan announced this back in June.”

Underneath the fact check, a commenter wrote: “The Kurds are the Ukrainians of the Arab world.” Another added: “The Kurds are basically a US-funded ISIS.” 

It’s an indication of the ambition of War on Fakes: that it becomes a platform for the expression of an entire alternative, anti-West worldview.


“There is no room for vengeance,” said Kenya’s president-elect William Ruto as he declared victory in the Kenyan election. It had been a fractious contest, its bitterness exacerbated by conspiracies, rumors and misinformation on social media. Ruto is the current deputy president, though the president himself, Uhuru Kenyatta, endorsed his deputy’s opponent, Raila Odinga. The opinion polls suggested that Odinga was the more likely winner. In the event, Ruto’s margin of victory was so wafer thin that Odinga immediately rejected the results. It will now fall to Kenya’s supreme court to declare a winner. 

In a Mozilla Foundation report published a couple of months before the elections, researcher Odanga Madung noted that TikTok, in particular, is a “forum for fast and far-spreading political disinformation.” The title of the report accused TikTok of being a “mercenary” that “gaslights political tensions in Kenya.” This is pertinent when you consider that TikTok has pointedly announced the steps it is taking to battle misinformation on the platform ahead of the U.S. midterm elections.

Given the stakes and the tension in Kenya, it was predictable that the count would be marred by misinformation, by premature victory declarations and false tallies. Four of the seven election commissioners refused to endorse the result because the processes were “opaque,” though, as Ruto supporters point out, these commissioners all enjoy the backing of outgoing president Kenyatta. With Brazil’s elections approaching in October, it appears likely that the malign influence of social media on elections around the world will continue to grow.

If social media is having a damaging effect on global democracy, the man who wanted to buy Twitter doesn’t appear too bothered. Elon Musk has been enjoying an extended love-in with China. After tweeting his pleasure on Sunday at the production of one million Teslas at his Shanghai factory, a third of the company’s total output, he was fulsomely praised by Qin Gang, China’s ambassador to the United States.

With considerable competition from Chinese electric vehicle manufacturers, the performance of Musk’s Shanghai factory is vital to his bottom line, especially given his bellyaching about costs at Tesla’s factories in Germany and Texas. These business interests surely explain Musk’s eagerness to become, as widely reported, the first foreign contributor to a magazine, launched earlier this year, that is controlled by the Cyberspace Administration of China. In the article, Musk writes, according to a translation provided by the Chinese journalist Yang Liu on his Substack: “I want to do everything we can to maximize the use of technology to help achieve a better future for humanity.”

This better future, at least as far as Musk’s communications with China are concerned, is defined by the creation of humanoid robots and eventually “a self-sustaining city on Mars.” It is a terrifying, technocratic vision of our future, with, predictably, no room for the vibrant and expressive town square Musk has long claimed to be his vision for social media — a town square from which China officially bars its citizens.

If, as Musk writes, humanoid bots are the future, we’ll have to hope they’re not as dumb as Meta’s BlenderBot 3, released earlier this month to almost instant and widespread ridicule. “We understand that not everyone who uses chatbots has good intentions,” said Meta, “so we also developed new learning algorithms to distinguish between helpful responses and harmful examples.” Apparently, these bots improve with prolonged exposure to conversation. Perhaps this means that in future iterations the bot will learn to avoid arguing that anti-Semitic tropes, such as Jewish people controlling the economy, are “not implausible.” It might also learn not to suggest that the 2020 election was stolen from Donald Trump. But at least it recognizes that its quality of life got better once it deleted its Facebook account.

This newsletter is curated by Coda’s senior editor Shougat Dasgupta. Isobel Cockerell and Frankie Vetch contributed to this edition.