WASHINGTON (AP) — Twitter loves lies. A new study finds that false information on the social media network travels six times faster than the truth and reaches far more people.
And you can't blame bots; it's us, say the authors of the largest study of online misinformation.
Researchers at the Massachusetts Institute of Technology looked at more than 126,000 stories tweeted millions of times between 2006 and the end of 2016 — before Donald Trump took office but during the combative presidential campaign. They found that "fake news" sped through Twitter "farther, faster, deeper and more broadly than the truth in all categories of information," according to the study published Thursday in the journal Science.
"No matter how you slice it, falsity wins out," said co-author Deb Roy, who runs MIT's Laboratory for Social Machines and is a former chief media scientist at Twitter.
Twitter funded the study but had no say in the outcome, according to the researchers.
The scientists calculated that the average false story takes about 10 hours to reach 1,500 Twitter users, versus about 60 hours for the truth. On average, false information reaches 35 percent more people than true news.
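Those two figures are also where the lead's "six times faster" comes from. A back-of-the-envelope check (purely illustrative; the 1,500-user threshold is the study's benchmark, the snippet is not the researchers' code):

```python
# Illustrative arithmetic only -- not the study's code.
hours_false = 10   # approx. time for a false story to reach 1,500 users
hours_true = 60    # approx. time for a true story to reach the same audience

speedup = hours_true / hours_false
print(f"False stories reach 1,500 users about {speedup:.0f}x faster")  # ~6x
```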
While true news stories almost never got retweeted to 1,000 people, the top 1 percent of the false ones got to as many as 100,000 people.
And when the researchers looked at how stories cascade — how they link from one person to another like a family tree — false information reached as many as 24 generations, while true information maxed out at a dozen.
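As a rough illustration of what a "generation" means here, the sketch below treats each retweet as a child of the tweet it was retweeted from and measures the longest chain. The field names and sample data are hypothetical, not drawn from the MIT dataset.

```python
# Minimal sketch: cascade depth as the longest retweet chain.
# The parent-child links and example data are hypothetical.

def cascade_depth(parent_of: dict[str, str | None], root: str) -> int:
    """Return the number of retweet 'generations' below the original tweet."""
    children: dict[str, list[str]] = {}
    for tweet, parent in parent_of.items():
        if parent is not None:
            children.setdefault(parent, []).append(tweet)

    def depth(node: str) -> int:
        kids = children.get(node, [])
        return 0 if not kids else 1 + max(depth(k) for k in kids)

    return depth(root)

# Example: A is the original tweet; B retweets A; C and D retweet B.
links = {"A": None, "B": "A", "C": "B", "D": "B"}
print(cascade_depth(links, "A"))  # 2 generations deep
```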
Concern over bogus stories online has escalated in recent months because of evidence the Russians spread disinformation on social media during the 2016 presidential campaign to sow discord in the U.S. and damage Hillary Clinton.
Social media companies have experimented with using computer algorithms and human fact-checkers to try to weed out false information and abuse online. Twitter earlier this month said it is seeking help from outside experts to better deal with the problem. And Facebook this week announced a partnership with The Associated Press to identify and debunk false and misleading stories about the midterm elections.
"We have witnessed abuse, harassment, troll armies, manipulation through bots and human-coordination, misinformation campaigns and increasingly divisive echo chambers," tweeted Twitter co-founder and CEO Jack Dorsey. "We aren't proud of how people have taken advantage of our service, or our inability to address it fast enough."
The MIT study took the 126,285 stories and checked them against six independent fact-checking sites — snopes.com, politifact.com, factcheck.org, truthorfiction.com, hoax-slayer.com and urbanlegends.about.com — to classify them as true, false or mixed. Nearly two-thirds were false, just under one-fifth were true, and the rest were mixed.
The six fact-checking websites agreed with each other on classification at least 95 percent of the time, and two outside researchers did independent fact-checking to verify the classifications, said co-author Sinan Aral, an MIT management professor.
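A hedged sketch of that kind of classification step, assuming each fact-checking site returns a simple verdict per story; the verdict labels and the agreement rule below are illustrative, not the study's actual pipeline:

```python
from collections import Counter

# Hypothetical verdicts from several fact-checking sites for one story.
verdicts = ["false", "false", "false", "mixed", "false", "false"]

def classify(site_verdicts: list[str], agreement: float = 0.95) -> str:
    """Label a story by its most common verdict; flag low agreement for review."""
    label, count = Counter(site_verdicts).most_common(1)[0]
    if count / len(site_verdicts) < agreement:
        return "needs independent review"   # analogous to the outside spot-checks
    return label

print(classify(verdicts))  # 5/6 agreement is below 0.95 -> "needs independent review"
```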
Lead author Soroush Vosoughi, an MIT data scientist, said the three false stories that traveled the farthest and fastest were about a Muslim guard called a hero in the Paris bombings of 2015; an Iraq war veteran finishing as runner-up to Caitlyn Jenner for an ESPN courage award; and an episode of "The Simpsons" that supposedly had a story line in 2000 about a Trump presidency. (The episode actually aired in 2015.)
University of Pennsylvania communications professor Kathleen Hall Jamieson, a co-founder of factcheck.org, had problems with the way the study looked at true and false stories. The MIT team characterized a story's truth on a 1-to-5 scale, with 1 being completely false. Factcheck.org, Jamieson said, looks more at context and does not label something either true or false.
She also suggested that calling this bogus information "false stories" does not capture how malignant it is. She said it would "better be called viral deception. VD. And treated as analogous to venereal disease."
The researchers looked at obvious bots — automated accounts — and took them out. While the bots tweeted false information at a higher rate than humans, it wasn't that much of a difference, and even without bots, lies still spread faster and farther, Roy said.
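A minimal sketch of that kind of robustness check, assuming you already have per-story spread measurements and a flag for bot-origin accounts; all names and numbers here are made up for illustration:

```python
# Illustrative robustness check: recompute average reach with bot-origin
# stories removed. The data and account labels are hypothetical.
stories = [
    {"veracity": "false", "reach": 4000, "author_is_bot": False},
    {"veracity": "false", "reach": 9000, "author_is_bot": True},
    {"veracity": "true",  "reach": 1200, "author_is_bot": False},
    {"veracity": "true",  "reach": 800,  "author_is_bot": False},
]

def mean_reach(rows, veracity, include_bots):
    vals = [r["reach"] for r in rows
            if r["veracity"] == veracity and (include_bots or not r["author_is_bot"])]
    return sum(vals) / len(vals)

for include_bots in (True, False):
    false_reach = mean_reach(stories, "false", include_bots)
    true_reach = mean_reach(stories, "true", include_bots)
    print(f"bots included={include_bots}: false/true reach ratio = {false_reach / true_reach:.2f}")
```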
David Lazer, a political and computer scientist at Northeastern University who wasn't part of the study but wrote an accompanying report, praised the MIT research but said the scientists may have missed many bots and cyborgs, accounts that mix human and automated activity. His ongoing, not-yet-published research has found that about 80 percent of false stories come from just one-tenth of 1 percent of users.
The researchers dug deeper to find out what kind of false information travels faster and farther. False political stories — the researchers didn't separate conservative from liberal — and stories that were surprising or anger-provoking spread faster than other types of lies, Aral said.
"Falsehood was significantly more novel than the truth," Aral said. "It's easy to be novel when you make things up."
That fits perfectly with previous research on the psychology of fake information, said Yale University's Dan Kahan and Dartmouth College's Brendan Nyhan, scientists who study the phenomenon.
"The more strange and more sensational the story sounds, the more likely they are going to retweet," Kahan said.
Nyhan and Lazer said that while more fact-checking and education of people on how to tell fake from real can be helpful, the more effective solution will have to come from the social media platforms themselves.
Roy said the study results reminded him of the often-cited quotation that essentially says a lie can travel halfway around the world before the truth can get its boots — or pants — on. It's been attributed to Mark Twain and Winston Churchill. But that would be misinformation: PolitiFact traced a version of it back to Jonathan Swift in 1710.

— SETH BORENSTEIN (AP)