The study of psychology fascinates me. I enjoy reading and watching almost everything related to psychology, from landmark discoveries to goofy tests revealing which cake flavor I am — hot fudge double chocolate brownie, by the way.
This week, I came across something altogether more serious: new research about misinformation and how we make moral judgments.
A new study published in the journal Psychological Science found that repeated exposure to fake news can make us feel it is less unethical to spread it ourselves. And here's the scary part: that holds even when we know it is false.
The study was conducted in a series of experiments over the last two years. Researchers Daniel A. Effron, a London Business School associate professor, and Medha Raj, a PhD student at the University of Southern California, presented 12 actual fake news headlines about American politics to more than 2,500 participants.
The experiments revealed that participants considered it less unethical to share false headlines they had encountered before than to share ones they were seeing for the first time.
Numerous studies have confirmed that previously encountered information feels more familiar, and therefore more accurate; psychologists call this the illusory truth effect. But Effron and Raj say what they have found "represents the first evidence that misinformation receives less moral condemnation when it has been repeated." As a result, people become more prone to promote it themselves, for example on social media platforms.
Why is this important?
“Efforts to curb misinformation typically aim to help people distinguish fact from fiction,” write the authors. “We suspect that these efforts will be insufficient as long as people find it more morally permissible to share previously encountered (vs. new) information they know is false.”
As online platforms like Google, Facebook and Twitter face increased levels of scrutiny, they have introduced some measures to help consumers detect false news or contextualize individual facts found online. Some of these initiatives include Facebook’s fact checkers (although some of them are leaving), Google’s media literacy projects and other tools which assess truthfulness.
But it seems that distinguishing fact from fiction isn’t enough to curb disinformation. We often discuss how false news spreads, but rarely dig into when and why people choose to share it. Raj and Effron’s findings are a first step towards understanding people’s decision to share lies even when they don’t completely believe them.
“Coordinated inauthentic behavior” beyond social media
One organization making the case that investigations into disinformation need to broaden their focus beyond how social media is used for propaganda is the Australian Strategic Policy Institute.
ASPI is an independent, non-partisan think tank based in Canberra, Australia, which attempted to demonstrate our vulnerability to disinformation.
Elise Thomas, a researcher at ASPI’s International Cyber Policy Centre, showcased the online life cycle of a bogus press release about a purported plan by China to assassinate President Trump, his family and some members of Congress. While the story sounds ludicrous—that China is planning to use poison-filled dragonfly drones to assassinate a world leader—Thomas wanted to demonstrate the fallibility of digital advertisers and digital distribution services that disseminate content across platforms.
Thomas observed that this made-up story was spread by one of the distribution services and appeared on dozens of “junk news” sites. More importantly, it also ended up on some legitimate local news outlets, like the Denver News Journal and ABC 8, posing as a real news piece.
“In the context of an organised campaign, sowing disinformation across junk news and second-tier news sites would be an effective first step for laundering false facts and narratives into social media and then mainstream media, without the investment or hassle of setting up a new fake news website,” writes Thomas.
You might think that a simple Google search could clarify such falsehoods. At the very least, it would verify if other newsrooms reported the story. But since multiple newsrooms published the release, for most readers, the credibility of the bogus story increased.
The Coda story I’m reading:
If you’d like to read more about why people spread falsehoods online, read Eduard Saakashvili’s article about a recent book which examines the real-life impact of online behavior.
The book, “Memes to Movements: How the World’s Most Viral Media is Changing Social Protest and Power” is written by An Xiao Mina. “It’s tempting,” she writes, “to think of fake news as a series of falsehoods.” She goes on to argue that fact-checking does nothing to address the fundamental reasons people decide to share lies.
In case you missed it:
- Twitter influencers in Pakistan are using coordinated online campaigns to manipulate public opinion, according to an investigation on how nationalist groups can successfully weaponize Twitter’s “trending” topics section. (Dawn)
- In the runup to the UK elections next week, political campaigners are using “parody” sites to grab voters’ attention and counteract opposition party criticism. (First Draft)
- A new investigation reveals how a pro-government information operation in the Philippines attacked media covering the Southeast Asian Games last month. (Rappler)
The story you just read is a small piece of a complex and ever-changing storyline we are following as part of our coverage. These overarching storylines, whether the disinformation campaigns feeding the war on truth or the new technologies strengthening growing authoritarianism, are the crises that Coda covers relentlessly and with singular focus. We work with dozens of local and international reporters, video journalists, artists and designers to bring you stories you haven’t seen elsewhere, provide you with context missing from the news cycle and illuminate the continuity between the crises we cover. Support Coda now and join the conversation with our team. No amount is too small.