Biolabs, QAnon, and Putin: visualizing digital authoritarianism’s next move
Disinformation researcher Marc Owen Jones knows his way around a rabbit hole. He spends his time investigating fake accounts, exposing disinformation networks, and wading through the murky waters of authoritarian influence campaigns. He creates visualizations of these digital worlds to help his followers understand how many people are involved, how they’re connected, and who the biggest players are.
In the last week, he’s been in some truly bizarre corners of the internet. He’s unearthed a network of QAnon influencers who believe Trump’s pronunciation of “China” is really a secret message about Ukrainian involvement in the origins of Covid-19. He’s drilled down into a conspiracy theory claiming there are U.S.-run bioweapon labs in Ukraine. He’s exposed fake Twitter accounts, like one purportedly owned by a Brisbane realtor that’s likely been hacked and transformed into a “Crypto QAnon Fascist” account.
In a conversation edited for length and clarity, Owen Jones talks about how exposing disinformation plays into the hands of bad actors — but someone’s got to do it.
You spend your life studying disinformation patterns and going down some pretty bizarre internet wormholes. How did you get into this world?
I’ve always been very fascinated by the notion of deception — of what is true and what is not. Growing up, my mum had a mental illness that was not diagnosed, but there were delusions involved. And sometimes she’d say things that I knew were demonstrably not true. I think just growing up around that, I became very astute at determining what was far-fetched and what was plausible. I think that subliminally played into it.
When it comes to the Ukraine invasion, what narratives are you interested in at the moment?
I’m kind of obsessed with the biolab one right now.
Us too. We’ve been covering the biolab story for years — because the Lugar Lab, an American-run research lab in Georgia, has been the subject of sustained Kremlin-backed disinformation campaigns for a long time, claiming it’s really waging germ warfare. How has the biolab conspiracy theory come into play during the Ukraine invasion?
An independent Bulgarian journalist — who has actually made a documentary about the Georgian lab — tweeted that the U.S. Embassy in Kyiv had deleted documents about a biological program it ran. In fact, those documents are still available, so the claim wasn't true; I think there were just some broken links. But it was enough to get over 12,000 retweets. And it was amplified by so many people — crucially, by Chinese diplomats.
Chinese state media also picked it up, asking, “What is the U.S. government trying to hide?” And now QAnon is basically saying that Ukraine created Covid. Every time I talk about QAnon I feel like I’m losing my mind.
If one person tweets a crazy conspiracy theory about a biolab — no one cares, right? You’ll ignore it. But if you have one thousand, two thousand people tweeting, and then state officials, you have to address it as a media organization. And then by doing that you’re redirecting resources to discussing something, debunking it, and so on. And suddenly you’re not talking about the invasion of Ukraine or civilian suffering any more. That’s part of the success of any propaganda agent, just getting journalists and researchers to redirect their resources.
Do you think we’d be better off if we just didn’t address these conspiracies at all?
The thing is, we don’t have a choice but to direct resources to it. Because we have to intervene in the information space. Otherwise it would just be a deluge. It’s not like we have control of the information, or can tell the state or bad actors not to use propaganda. Unfortunately, the system that exists is one that facilitates deception. So we do have to step in.
You’ve got a book coming out in the U.K. on April 28 called Digital Authoritarianism in the Middle East, due to be released in the U.S. this summer. At Coda we report on “currents,” and one of these encompasses authoritarian tech. We have our own idea of what that means — but I’m interested in hearing what it means to you.
I see it exactly as the authoritarian use of tech. If you look at digital authoritarianism generally, it includes surveillance and censorship. But it also includes social manipulation, harassment, and targeted persecution.
I think when people think of authoritarianism, it’s very state-centric — they think of governments using tech. But the authoritarian use of tech actually involves multiple actors, both inside authoritarian regimes and outside them. And the West, or the Global North — whatever you want to call it — engages in these authoritarian practices too. Pegasus is a good example. Or Lynton Crosby, in London, using Facebook campaigns to whitewash Mohammed bin Salman. You have these reputable companies in Mayfair, with shiny offices, doing these nefarious kinds of activities.
Tell me a bit more about how disinformation plays a part in authoritarian tech.
Without the information space, digital authoritarianism is kind of meaningless because I think authoritarianism requires not just coercion, it requires attempts to persuade. You could call it cognitive hacking.
I think deception is often a better term than disinformation. Because deception incorporates the means of distributing that information as well as the content. You could, for example, have one thousand bot accounts spreading something truthful, but because they’re bots trying to give the illusion that they’re real, it’s still deceptive. So I think deception is a better, broader term.
How would you describe the traction that the propaganda offensive against Ukraine has gained?
I’d say for the first week or two, the dominant narrative was very clear. It was “Russia has invaded Ukraine. This is egregious. This is horrible.” It felt like everyone was on a similar page.
Then we started to see the #IStandWithPutin hashtag creeping out — heavily pushed by inauthentic accounts. After that, narratives emerged that sought to divide support for Ukraine. There was a concerted attempt to redirect attention away from Russia and Ukraine by painting the conflict as Russia versus NATO, Russia versus the West.
We were seeing people not necessarily being unsympathetic to Ukraine, but redirecting their attention, focusing on Western hypocrisy as opposed to a Russian invasion.
We’re now seeing the right, and the QAnon accounts, embracing a Trumpist legacy of suspicion of Ukraine. And that taps into what they’re interested in — the idea that the U.S. government is complicit in some shady goings-on in Ukraine.
And then we’re seeing relative successes like the biolabs narrative. And gradually, the waters are getting muddied.