On a sunny morning in Taipei last August, I joined a few dozen other people at the headquarters of the Kuma Academy for an introductory course in civil defense. We broke into groups to introduce ourselves. As our group leader presented us to the room, she mistakenly called me a “war correspondent.”


“No, no, that’s not right,” I interjected. “I’m here because I precisely don’t want to become a war correspondent in the future.” 

The Kuma Academy, established in September 2022, trains citizens in the basic skills they might need to survive and help their compatriots in the event of an attack. Civil defense has been on many people’s minds in Taiwan since Russia’s full-scale invasion of Ukraine in 2022. “If China Attacks,” a book covering potential scenarios for a Chinese invasion — co-written by Kuma Academy co-founder Puma Shen — has become a bestseller. 

Many of the attendees at the academy seemed like regular office workers or homemakers. The youngest person I talked to was a high school student. A great deal of the curriculum is practical — basic medical training, contingency planning for an invasion, even what kind of material to hide behind to protect yourself from gunfire. But much of the training is less about skills and more about shoring up the sense of agency that ordinary people feel: making them understand that they have the power to resist.

In the face of Chinese propaganda and disinformation, that could be as important as weapons drills and first aid. Taiwan holds elections this month, pitting the pro-autonomy Democratic Progressive Party (DPP) against the more Beijing-friendly Kuomintang (KMT). The outcome of the vote has huge consequences for relations across the Taiwan Strait and for the future of an autonomous Taiwan, whose independence Beijing has vehemently opposed — and threatened to violently reverse — since the island first began to govern itself in 1949. Successfully interfering in the democratic process using what the Taiwanese government calls “cognitive warfare” could be a way for Beijing to achieve its goals in Taiwan without firing a shot.

Despite — or because of — the stakes, Taiwan’s response to the challenge of Chinese election interference isn’t siloed in government ministries or the military. Just as civil resistance has to be embedded in society, the responsibility of defending the information space has been entrusted to an informal network of civil society organizations, think tanks, civilian hackerspaces and fact-checkers. 

“We’re often asked by international media if Taiwan has an umbrella organization for addressing disinformation-related issues. Or if there is a government institution coordinating these kinds of responses,” said Chihhao Yu, one of the co-founders of the Information Environment Research Center (IORG), a think tank in Taiwan that researches cognitive warfare. “But first, there’s no such thing. Second, I don’t think there should be such an institution — that would be a single point of failure.”

A girl learns how to do CPR during an event held by Taiwanese civil defense organization Kuma Academy, in New Taipei City on November 18, 2023, to raise awareness of natural disaster and war preparedness. I-Hwa Cheng/AFP via Getty Images.

Disinformation from China is hardly new in Taiwan. During the Cold War, before the term “disinformation” entered the common lexicon, the Chinese Communist Party injected propaganda into the public sphere, trying to instill the idea that reunification was inevitable and that resistance was futile. That propaganda spread through many channels, including newspapers, magazines and radio. But, as in the rest of the world, social media has made it easier to reach a wide audience and to spread falsehoods more rapidly and with greater deniability. Disinformation now circulates on international platforms including Facebook, Instagram, X and the South Korean-owned messaging app Line, which is popular in Taiwan, as well as on local forums such as PTT and Dcard.

Disinformation from China used to be easy to spot. Its creators would use terms that weren’t part of the local Taiwanese lexicon or write with simplified Chinese characters, the standard script in mainland China — Taiwan uses a traditional set of characters instead. However, this is changing, as information operations become more sophisticated and better at adapting language for the target audience. “Grammar, terms, and words are more and more similar to that of Taiwan in Chinese disinformation,” said Billion Lee, co-founder of the fact-checking organization Cofacts.

With the election approaching, the Chinese government has increased its efforts to localize its propaganda, recruiting social media influencers to spread its messaging and allegedly buying influence at the grassroots level by subsidizing trips to China for local Taiwanese politicians and their constituents. Over 400 such trips took place in November, and nearly 30% of Taipei’s borough chiefs — the lowest level of elected officials — have participated in them.

The formats used to spread propaganda and disinformation have evolved as well. Cofacts started out in 2016 by building a fact-checking chatbot on Line, focusing on text-based falsehoods. Now it has to work across multiple platforms and formats, including TikTok videos, Instagram stories, YouTube shorts and podcasts.

The aim of this election disinformation is often fairly obvious — boosting Beijing’s preferred candidates and discrediting those it considers hostile. 

In late November, 40 people were detained by Taiwanese authorities on voting interference charges. A separate investigation found a web of accounts across Facebook, YouTube and TikTok that worked to prop up support for the pro-China KMT. The so-called “Agitate Taiwan” network also attacked third-party candidate Ko Wen-je, whose party favors closer relations with China, but whose candidacy may divide the vote in a way that leads to a victory for the historically independence-leaning DPP. 

Other themes, Lee said, include trying to undermine the DPP leadership and casting them as inept by insinuating, falsely, that they failed to secure vaccines during the Covid-19 pandemic, and alleging that the DPP only pushed for the development of Taiwan’s domestically produced vaccine, Medigen, because it had made illicit investments in the company. Messaging also often targets Taipei’s relationship with the U.S., suggesting that America would abandon Taiwan in the event of a war.

These overtly political messages intersect with other influence operations and more traditional espionage. In November, 10 Taiwanese military personnel were arrested after allegedly making online videos pledging to surrender in the event of a Chinese invasion. One of those charged, a lieutenant colonel, was allegedly offered $15 million by China to fly a Chinook helicopter across the median line of the Taiwan Strait to a waiting Chinese aircraft carrier. Such defections and public promises not to resist, weaponized and spread on social media, are clearly aimed at undermining public morale in Taiwan. 

Those efforts can be oddly targeted. In May, Cynthia Yang, the deputy secretary-general of a nonprofit in Taiwan, received a series of calls from people with mainland Chinese accents after she ordered a copy of “If China Attacks” from the Taiwanese bookseller Eslite. The callers claimed to be from customer service, but they questioned Yang about her “ideologically problematic” purchase. It seemed to be an effort at psychological intimidation. After the incident was reported by Taiwanese media, the book’s co-author Puma Shen quipped on social media that his next book would be titled “If China Calls.”

Fighting back against this full-spectrum influence campaign is hard. Chinese disinformation tactics have fed into a broader polarization in Taiwan that is fragmenting the internet. “Everyone uses a different internet these days,” Lee said. There is growing recognition that people inhabit echo chambers made up of like-minded peers, and that these are difficult to break out of.

This means that the organizations — mainly civil society groups — arrayed against a superpower keen on undermining Taiwan’s democratic processes face a complex task. These groups are often small and scrappy, run by volunteers or just a handful of staff. They’re in an arms race that they can’t win — or at least, that they can’t win alone.

To compete, they’re collaborating. “Even if we don’t know each other, we can work together without directly cooperating,” said Yu from the Information Environment Research Center. “To use Cofacts as an example, we don’t directly coordinate with Cofacts. But because Cofacts has an open database with an open license, we can use their datasets of rumors and community fact-checking to conduct research, and we continue to do so.”

Cofacts has emerged as an important piece of infrastructure for Taiwan’s fact-checking ecosystem. The organization has used its Line bot to build an enormous database of disinformation spotted in the wild, which it makes available to other groups via an application programming interface. Crucially, the bot lets users report disinformation that circulates not on open social media, such as Facebook or X, but in closed messaging apps such as Line or Facebook Messenger.

Systematically collecting that data allows other organizations to conduct more sophisticated analysis, spot patterns and respond strategically, rather than chasing down every lie and fact-checking it.
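For technically minded readers, that open-data arrangement is concrete enough to sketch in code. The snippet below is a minimal illustration rather than the actual Cofacts integration: the endpoint URL, the ListArticles query and the field names are assumptions for the sake of the example, and anyone building on the database should check them against Cofacts’ own API documentation.

```python
# Minimal sketch: pulling recently reported rumors from an open
# fact-checking database such as Cofacts'. The endpoint, query name
# and fields below are illustrative assumptions, not a verified API.
import requests

API_URL = "https://api.cofacts.tw/graphql"  # assumed GraphQL endpoint

QUERY = """
{
  ListArticles(orderBy: [{createdAt: DESC}], first: 5) {
    edges {
      node {
        id
        text
        replyCount
      }
    }
  }
}
"""

def fetch_recent_rumors():
    # POST the GraphQL query and return the most recently reported messages.
    resp = requests.post(API_URL, json={"query": QUERY}, timeout=30)
    resp.raise_for_status()
    edges = resp.json()["data"]["ListArticles"]["edges"]
    return [edge["node"] for edge in edges]

if __name__ == "__main__":
    for rumor in fetch_recent_rumors():
        print(rumor["id"], (rumor["text"] or "")[:80])
```

Because the data carries an open license, a downstream group like IORG could run this kind of query on a schedule and analyze the results without ever coordinating directly with Cofacts, which is the kind of working together without directly cooperating that Yu describes.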

This collaborative approach can be traced back to g0v, the influential civic hacker community, from which a number of innovative initiatives have emerged in the past decade — from digitizing historical documents significant to contemporary Taiwanese politics to gamifying the identification of satellite images to find illegal factories on farmland. 

The g0v community runs decentralized hackathons to develop project ideas, held in classrooms and offices and bringing together anywhere from a few dozen to a few hundred people. Not all ideas make it to fruition, but some of the projects that come out of g0v — including those that tackle disinformation — begin as nothing more than a small breakout group huddled in the corner of a hackathon.

It is these small civil society groups that Taiwan relies on to stay ahead of Chinese innovations in disinformation, with the hope that by being nimble and adaptable, they can hold back the tide. Bigger threats are coming. The rise of generative artificial intelligence, which can quickly create text, images, videos and more at scale, could allow China to increase the volume of propaganda it produces and make it seem more authentic by accurately using Taiwanese idioms and references. Certainly, there is no shortage of material produced on Taiwan’s free and open internet for generative AI to learn from.

Still, the solution may lie precisely in the decentralized and networked nature of these efforts to combat Chinese disinformation campaigns. After all, a setup in which a number of different solutions emerge at once, often organically and spontaneously, has no single point of failure, to borrow Yu’s words.

“We wanted to connect people who wrote code and people concerned with society to work together,” Lee said, when asked why she and her collaborators started Cofacts. Perhaps it is a faith in society’s ability to know for itself what is best that keeps such groups going. And that may be the best weapon against authoritarianism — the belief that the connections between people can be enough to deal with a much larger enemy. The fight is on.

CORRECTION [01/12/2024 09:52AM EST]: The original version of this story stated that 40 people were detained by Taiwanese authorities on voting interference charges in connection to the Agitate Taiwan network. The detentions were not directly related to the network.