
Twitter is testing new ways to fight misinformation. Is an open-sourced method the answer?

February 26, 2020, 8:11 PM UTC

Since its launch in 2006, Twitter has taken small steps to fight false information, from uprooting bot accounts to banning political ads. In 2020, the microblogging service is making moves like never before—not only to crack down on false information, but to correct it.

But exactly how it plans to do so remains on the drawing board.

Twitter has been test-driving new ways to fight tweets that are both misinformative (inaccurate, but not necessarily ill-intended) and disinformative (intentionally misleading), according to NBC News. One possible method, according to a series of graphics leaked to the network, is flagging tweets with Snopes-style badges if they’re deemed “harmfully misleading.”

In one proposed image, fictional journalists rebut nonexistent tweets by presidential candidate Bernie Sanders, author and columnist Anand Ranganathan, and GOP Congressman Kevin McCarthy. As a reward, the accounts would receive small green “community badges” for their efforts.

The problem is that the average Twitter user may not believe even a rigorously fact-checked outlet like the New York Times. In fact, a 2019 Gallup poll found that only 41% of Americans trust the mass media at all.

“On one hand, I love that when something’s false, we’re finding a way to call it out,” Anthony Shop, the co-founder and chief strategy officer of Social Driver, tells Fortune of Twitter’s efforts. “But on the other hand, I worry that the way psychology and human nature works, that it will actually be self-defeating.”

Shop describes Social Driver as “a digital agency that helps companies connect with people today,” and his projects often involve correcting misinformation about public health issues like vaccines and community water fluoridation.

Confirmation bias, the human tendency to favor information that confirms previously held convictions, plays a major role in the spread of falsehoods, according to Shop.

“People make decisions emotionally and then they use facts to back up their decision,” Shop says. “If that’s what people do, why would we think it’s going to be any different?”

But what about a peer-reviewed scientific study or survey? This, Shop insists, could activate confirmation bias even more.

“Sometimes having a person in a white lab coat give the answer actually makes people trust that person less,” he says. “They’re more likely to trust a neighbor or a celebrity that doesn’t have the credentials.”

A possible solution could be showing multiple journalistic viewpoints—CNN versus Fox News versus MSNBC, for example. But Shop says this, too, might not jibe with human nature.

“People are going to reach their own conclusions,” he says. “So you may as well try and attract them.”

An open-source approach

When it comes to Twitter misinformation, Shop says that the secret may lie in open sources, not closed ones. Take Wikipedia, for example. 

While the crowd-collaborative encyclopedia is sometimes dismissed as inaccurate, a 2005 study by Nature found that Wikipedia was about as trustworthy as Encyclopedia Britannica. (For its part, Britannica denied the accuracy of the 2005 study, calling it “error-laden” and “invalid.”)

Regardless, Basile Asti, community ambassador for CaptainFact, a nonprofit and browser extension that fact checks videos, believes that an open-source approach is the best way to go for Twitter.

“A Wikipedia-type approach to fact-checking can address the challenges and criticism linked to institutionalized fact-checking,” Asti tells Fortune. “We believe some of the problems they addressed… are rather similar to the ones we are trying to solve at CaptainFact.”

Shop says it’s problematic to put all our eggs in the basket of a top-down, centralized news institution.

“I think [widely trusted centralized sources] are probably an anomaly in human experience,” he says. “The newspaper landscape of the 1800s was very partisan, like our current cable news and Twitter environment.”

“If we [want] to address the issue of online misinformation, we have to let the citizen enter the arena of fact-checking,” Benjamin Piouffle, the founder of CaptainFact, declared in a Medium post in 2017. “Centralized fact-checkers will never be able to deal with the enormous amount of data produced on the Internet today.”

In the future, “It’s probably going to look a lot more like Wikipedia than it does Walter Cronkite,” Shop says.

But not everyone agrees.

A closed-source approach

Barbara McCormack, the vice president of education at Freedom Forum, favors citing a variety of closed sources over a Wikipedia-style approach.

“I think we need to use cross-representation of sources,” she tells Fortune. “A checks-and-balances kind of system. Open-source people are well-intentioned, but we need to build back trust in our journalists and what they were designed to do, which is digging for the truth on our behalf every day and trying to check themselves for bias.”

McCormack appeared at the “Fighting False News: Strategies to Combat Digital Misinformation” panel in Washington, D.C., in 2018, which Shop moderated. They agree that human nature bends toward biases, but McCormack finds open sources risky—and leaning on one source of information even riskier.

“This system that they [might use], what if someone quotes the Bible? Or the Quran?” she posits.

To be clear, Twitter has not announced that it will use the badge system—and even if it does, the system is just one of many options on the table.

“We’re exploring a number of ways to address misinformation and provide more context for tweets on Twitter,” a Twitter spokesperson told NBC. “Misinformation is a critical issue and we will be testing many different ways to address it.”

But no matter which angle Twitter chooses to strike from, it’s clear that simply quoting a news agency could be counterproductive. Maybe, as both Shop and McCormack agree, the fight against falsehood begins with each and every social media user.

“For the love of God, if you don’t know if something’s true, this is what you can really do for the information cycle,” McCormack says. “Don’t like it. Don’t share it.”
