Hearing voices, seeing things: How deepfakes are derailing the electoral process

Nishita Jha

 

WHOSE VOICE IS IT ANYWAY?

Election-related disinformation is at an all-time high as multiple countries across the world head to the polls, and AI-manipulated deepfakes are the trend of the season.

The United States kicked off the trend in January, when some U.S. voters received automated calls featuring an AI-generated voice clone of President Joe Biden discouraging them from going to the polls. Now, once again, an alarm has been raised over the potential threat of bad actors creating deepfakes of Biden’s voice. The concerns are serious enough that the Justice Department is recommending that an audio recording of the president’s interview with special counsel Robert Hur not be released to the public. “If we were to go with this strategy, then it is going to be hard to release any type of content out there, even if it is original,” Alon Yamin, co-founder of Copyleaks, an AI-content detection service, told the AP.

In India, Prime Minister Narendra Modi is set to return for a third term, albeit with a significantly smaller share of votes, after an election that was rife with AI-manipulated deepfakes of Bollywood stars and even opposition leaders campaigning for him. Face-swapped politicians trash-talking their own parties were common on social media platforms and in forwarded WhatsApp messages. But Rest of World reports that the damage was far less egregious than one might expect, closer to trolling than outright deception.

In South Africa, where former President Jacob Zuma staged a dramatic comeback with his newly formed uMkhonto weSizwe (MK) party in the election on May 29, conspiracy theories about vote-rigging plagued the electoral process. Our partner Daily Maverick wrote an insightful piece about this “big lie” and how it was pushed by the MK party, covering its parallels with the campaign strategies of former U.S. President Donald Trump and former Brazilian President Jair Bolsonaro. Zuma’s daughter appears to be a superspreader of disinformation, according to Wired. Unsurprisingly, given the parallels with Trump, one of her tweets included an AI-manipulated video of Trump urging South Africans to vote for Zuma.

Research from the Center for Countering Digital Hate suggests that the flood of AI-manipulated disinformation has only just begun. CCDH tested some of the most popular AI audio tools on the market and found the platforms rarely prevented the cloning of world leaders’ voices, even when researchers fed them speeches with incendiary and downright false information.


WATCHING THE WATCHDOG


Journalists have long been ringing alarm bells over the ways AI has affected, and will continue to affect, the media industry. With local news outlets closing down in the U.S., news apps that publish licensed content with the help of AI tools have filled the void, but at what cost? A Reuters report looks at one such app, its links to Chinese state media and the impact of AI-authored fake news.

The New York Times’ report on John Mark Dougan, a former Florida deputy sheriff who sought political asylum in Russia, paints an astonishing picture of a DIY media empire built in Dougan’s bunker. Using AI tools like ChatGPT and DALL-E 3, Dougan created more than 160 fake websites that mimic news outlets in the U.S., U.K. and France. The sites are populated with thousands of articles, some factual, others fabricated. Combined with news of the Russian disinformation network Pravda’s expansion in the EU, it appears that at least one kind of journalism is thriving, and that is a terrifying prospect.

THE NEW INFLUENCERS OF RESISTANCE 

TikTok is a battleground for influence in election seasons across the world. On the one hand, it is plagued with disinformation, manipulated videos and abusive content. On the other, the app has emerged as a source of hope for educators, activists, business owners and marginalized groups.

In the U.S., TikTok’s Chinese owner ByteDance filed a lawsuit against the U.S. government to block a law, signed by President Joe Biden, that could ban the app. But in Ukraine, the Chinese-owned social media company, along with platforms like X and Telegram, has been invited to open fully staffed offices in the hope of countering Russian-led disinformation.

Andriy Kovalenko, who heads Ukraine’s National Security and Defense Council’s department on Russian disinformation, told Bloomberg that the scale of Russia’s disinformation was difficult to counter because there were comparatively few Ukrainian bloggers on TikTok, and they received far less engagement than accounts that spread disinformation. Kovalenko himself routinely posts updates about the war in Ukraine on his TikTok.

Julia Tymoshenko is among the many Ukrainian TikTokers who joined the platform during the pandemic for entertainment and later pivoted to making content about Russia’s invasion of Ukraine. For her, posting on TikTok about the reality of living under siege was a way to counter the absence of Ukrainian voices in mainstream media.

“There were quite a few analytical articles from big media about whether Russia was going to invade, a lot of speculation, none of which was coming from young people in Ukraine who actually have stakes in the situation,” Tymoshenko said, speaking from Kyiv. “Often it was just Western expats talking, not even realizing the larger context. My goal was to fill that gap with my content.”

Tymoshenko, who currently works as the social media and communications manager for the Ukrainian brand Saint Javelin, said she now finds TikTok emotionally draining because of the level of disinformation and abuse.

“I created a few videos about Russia and Ukraine and the first few comments are OK, but when the algorithm kicks in and I get more views, I immediately start getting some accounts with African names using Russian words to say something like ‘Ukraine should not exist’ or to call me a Nazi,” she said. “Obviously I have no way to prove if these are Russian bots, and I learned not to pay attention, but it was crazy at the beginning.”

Tymoshenko is still posting on TikTok because she believes it’s important to share the impact of life under occupation. She has found inspiration and solidarity on the app as well, but thinks it is unlikely that individual bloggers and influencers can combat the scale of disinformation online.

“I do believe that it’s good for democratic governments around the world, including Ukraine, to be more systematic in approaching Russian disinformation and also counteract it with the same tools and means that Russia uses. For decades, nobody was paying attention; authoritarian regimes like Russia, China and Iran created such a polluted information space that people don’t really believe anything anymore, which is why it is easy to manipulate them.”

WHAT WE ARE READING: 

  • Why is preserving historical records crucial for the health of democracies? Why do autocrats always target libraries, schools and archives? This piece in The New Yorker is a timely warning as well as a reminder: “After the Berlin Wall fell, agents of East Germany’s secret police frantically tore apart their records. Archivists have spent the past thirty years trying to restore them.” 
  • If you’re craving a spine-chilling read, don’t bother looking for fiction. This report from The Washington Post on a Kremlin-backed media outlet has it all: spies, fake news, scandals and cash. 
  • There’s a bromance brewing between Elon Musk and Donald Trump. The Wall Street Journal has the details on the possibility of Musk taking on an advisory role for Trump and the Republican Party.