Xi leaves no doubt that he is on Team Putin

Natalia Antelava


It’s taken a year, but Beijing has finally picked a side in Russia’s war against Ukraine. China’s 12-point Ukraine peace plan does not use the word “war” and does not call on Moscow to withdraw its troops from Ukraine, but it does condemn what it calls “unilateral sanctions” against Russia. Senior U.S. officials say they are confident that Beijing is considering providing lethal equipment to Moscow, including drones and ammunition.

Beijing is already providing Moscow with crucial propaganda support. Chinese media fuels Putin’s narrative that Ukraine started the war. “Putin states Russia is Invincible” was one of the best-performing hashtags across Chinese social media after the state media outlet Global Times launched it following Vladimir Putin’s speech last week. Beijing’s involvement in the war, whatever shape it takes, could seriously alter the course of the conflict. Which is why Ukraine, clearly aware of the delicate balancing act it needs to perform, is downplaying China’s apparent siding with Moscow and U.S. allegations that Beijing is preparing to send weapons. Ukrainian President Volodymyr Zelenskyy says he plans to meet with Xi Jinping to discuss Beijing’s peace proposal. But Chinese officials have yet to make any public comments about a meeting with Zelenskyy. They have instead invited Putin’s closest ally, Belarusian President Alexander Lukashenko, to visit Beijing.

Xi Jinping is rolling out the red carpet for Lukashenko. His four-day state visit, starting today, is a big deal and another sign of China’s growing closeness to Russia. Belarus is a party to the war, having allowed Russia to use its territory to launch the initial invasion of Ukraine. Recently leaked documents show that Vladimir Putin aspires to absorb Belarus into a “Union State” by 2030. Details of Lukashenko’s meetings in Beijing and what’s on the agenda remain vague, but the timing of his visit, coming just as China unveils its “peace plan” and continues to amplify Russian propaganda, shows that Beijing is now firmly on Putin’s side.

“NATO countries are no longer our opponents, they are our enemies,” Kremlin spokesman Dmitry Peskov told Russian state television this week, reflecting on Vladimir Putin’s speech and his decision to suspend Russia’s participation in the last remaining nuclear arms treaty between the United States and Russia, a move that effectively ended arms control as we know it. Russia’s global disinformation machine, in the meantime, seems focused on unfounded claims of chemical attacks by Ukrainian forces.

As Russian artillery pounds eastern Ukraine, the Kremlin is rehashing the old, discredited accusation that Ukrainian forces are preparing to use chemical weapons. Throughout the past year, Russia has repeatedly warned that Ukraine might use non-conventional weapons, including biological weapons or even a radioactive dirty bomb. The claims, labeled a “false flag” operation by Western intelligence, have never materialized. Still, the “news” that Americans have shipped chemical weapons to Ukraine, which allegedly arrived in Kramatorsk on February 10, is all over Russian state media and spreading like wildfire across social media.

WHY DISINFORMATION IS ABOUT ‘SHITTY ENGINEERING’

So much of the disinformation that we track in this newsletter is only potent because of the unprecedented reach that it now enjoys. People, especially those in power, have always lied in pursuit of their agendas. What has changed is that the lies now spread so quickly and so far. 

For instance, the question of whether or not Ukraine is using chemical weapons should never be subject to debate on Twitter. It should be relatively straightforward to establish this truth as a starting point of any discussion. But our inability to agree on even the basic facts is rendering us useless when it comes to producing solutions to crises — be they global warming, the war in Ukraine or vaccine disinformation. 

Is this trend reversible? I spent last week with thousands of people pondering this very question in Paris, at a conference titled “Internet for Trust.” I often find myself at gatherings designed to brainstorm ways out of disinformation rabbit holes, but the Paris conference was something else: 4,000 participants from all over the world gathered at UNESCO’s brutalist headquarters. The goal of the conference was to feed into the global guidelines that UNESCO is putting together to help governments regulate online platforms and the disinformation that flourishes on them.

During breaks between sessions, I overheard many delegates questioning the entire exercise. After all, the guidelines the conference was feeding into were not exactly enforceable. “What good does it do for me if Facebook can carry on spreading lies in my country?” a government regulator from an African country exclaimed after a session during which a Meta representative used free speech as an argument against regulating Big Tech.

Much of the discussion felt like going in circles: How do we regulate platforms if regulators don’t understand the tech? How do we do it without affecting free speech? Is content moderation a solution? But how many content moderators does it take to moderate billions of internet users? I have heard all of these questions posed again and again, without much by way of answers. 

It only ever takes one person to say that the emperor has no clothes, and in this case it was Christopher Wylie, a data scientist, Cambridge Analytica whistleblower and author of the fantastic book “Mindf*ck: Cambridge Analytica and the Plot to Break America,” who stirred up a debate during a panel on content curation and moderation that I chaired.

“The fact that you need content moderation shows that you are dealing with design failure,” he told the panel, which included representatives from TikTok, the Meta Oversight Board, the African Union and a prominent U.S. human rights lawyer.

Wylie said the key piece missing from all the debates around regulating platforms to counter disinformation is the fact that at the heart of the crisis is “shitty engineering.” 

“I find a lot of these conferences frustrating,” he said, “because so often we talk about things like transparency, content moderation, the role of journalists and it’s all very important but we don’t address the fundamental problem that a lot of these issues flow from the shitty engineering.” 

“If you look at how we regulate every other technological sector,” Wylie argued, “whether it’s aerospace, pharmaceutical or automotive, we regulate through the prism of safety and harm prevention first. That’s how we regulate technology. But when it comes to social media platforms, everyone seems to have moved away from the fundamental line of inquiry: Why are you designing platforms like this? Why is it that there are more safety regulations for the toaster in your kitchen than for platforms that touch the lives of billions of people?”

If, like me, you are obsessed with the subject, I really recommend watching the entire session with Chris Wylie at the UNESCO conference in Paris — you can find it here, starting at around the nine-minute mark of the February 23 recording. And if you want to dig further still, check out Chapter 3, which he authored, of this 2020 report from the Forum on Information and Democracy.

Why are we going in circles when it comes to countering disinformation on tech platforms? Why aren’t arguments like those made by Wylie cutting through the noise?

One reason: money. Big Tech has invested heavily in lobbying efforts while also announcing major initiatives to support newsrooms, especially local newsrooms in the United States. Four years after Meta (then Facebook) announced its three-year, $300 million commitment to global “news programs, partnerships, and content,” the Tow Center’s Gabby Miller looks at where the money actually went. Spoiler alert: “tracing this funding” was “surprisingly difficult.”