A study led by USC involving over 2,400 Facebook users indicates that platforms, rather than individual users, have a greater responsibility in curbing the dissemination of false information online.
USC researchers may have uncovered a major driving force behind the proliferation of fake news: the structure of social media platforms, which rewards users for habitually sharing information.
The team’s findings, recently published in the journal Proceedings of the National Academy of Sciences, challenge the common assumption that misinformation spreads because users lack the critical thinking skills to discern truth from falsehood, or because their political biases cloud their judgment.
Just 15% of the most habitual news sharers in the research were responsible for spreading about 30% to 40% of the fake news.
The research team from the USC Marshall School of Business and the USC Dornsife College of Letters, Arts, and Sciences wondered: What motivates these users? As it turns out, much like any video game, social media has a rewards system that encourages users to stay on their accounts and keep posting and sharing. Users who post and share frequently, especially sensational, eye-catching information, are likely to attract attention.
“Due to the reward-based learning systems on social media, users form habits of sharing information that gets recognition from others,” the researchers wrote. “Once habits form, information sharing is automatically activated by cues on the platform without users considering critical response outcomes, such as spreading misinformation.”
Posting, sharing, and engaging with others on social media can, therefore, become a habit.
“Our findings show that misinformation isn’t spread through a deficit of users. It’s really a function of the structure of the social media sites themselves,” said Wendy Wood, an expert on habits and USC emerita Provost Professor of psychology and business.
“The habits of social media users are a bigger driver of misinformation spread than individual attributes. We know from prior research that some people don’t process information critically, and others form opinions based on political biases, which also affects their ability to recognize false stories online,” said Gizem Ceylan, who led the study during her doctorate at USC Marshall and is now a postdoctoral researcher at the Yale School of Management. “However, we show that the reward structure of social media platforms plays a bigger role when it comes to misinformation spread.”
In a novel approach, Ceylan and her co-authors sought to understand how the reward structure of social media sites drives users to develop habits of posting misinformation on social media.
Why fake news spreads: behind the social network
Overall, the study involved 2,476 active Facebook users, ranging in age from 18 to 89, who volunteered to participate in response to online advertising. They were compensated to complete a “decision-making” survey approximately seven minutes long.
Surprisingly, the researchers found that users’ social media habits doubled and, in some cases, tripled the amount of fake news they shared. Their habits were more influential in sharing fake news than other factors, including political beliefs and lack of critical reasoning.
Frequent, habitual users forwarded six times more fake news than occasional or new users.
“This type of behavior has been rewarded in the past by algorithms that prioritize engagement when selecting which posts users see in their news feed, and by the structure and design of the sites themselves,” said second author Ian A. Anderson, a behavioral scientist and doctoral candidate at USC Dornsife. “Understanding the dynamics behind misinformation spread is important given its political, health and social consequences.”
Experimenting with different scenarios to see why fake news spreads
In the first experiment, the researchers found that habitual users of social media share both true and fake news.
In another experiment, the researchers found that the habitual sharing of misinformation is part of a broader pattern of insensitivity to the information being shared. In fact, habitual users shared politically discordant news — news that challenged their political beliefs — as much as concordant news that they endorsed.
Lastly, the team tested whether social media reward structures could be devised to promote the sharing of true over false information. They showed that incentives for accuracy, rather than for popularity (as is currently the case on social media sites), doubled the amount of accurate news that users shared on social platforms.
The study’s conclusions:
- Habitual sharing of misinformation is not inevitable.
- Users could be incentivized to build sharing habits that make them more sensitive to sharing truthful content.
- Effectively reducing misinformation would require restructuring the online environments that promote and support its sharing.
These findings suggest that social media platforms can go beyond moderating what information is posted and instead pursue structural changes to their reward systems to limit the spread of misinformation.
Reference: “Sharing of misinformation is habitual, not just lazy or biased” by Gizem Ceylan, Ian A. Anderson and Wendy Wood, 17 January 2023, Proceedings of the National Academy of Sciences.
The study was funded by Yale University School of Management, USC Dornsife College of Letters, Arts, and Sciences Department of Psychology, and USC Marshall School of Business.
Discouraging the sharing of misinformation and, especially, fake news should be a high priority goal of social media platforms.
However, that doesn’t address something I consider to be equally troubling. All too frequently, there are ‘stories’ from the mainstream media that I would characterize as propaganda, built on cherry-picked data and statements taken out of context. This is particularly common with respect to ‘gun control,’ where absolute numbers are commonly used instead of rates per 100,000. We are all familiar with stories about mass shootings: we are presented with the lurid details of a school shooting and then told that such things are common, with numbers offered to reinforce the point. Yet none of us have seen stories on those other shootings. I would submit that this is because most of the other shootings are gang related, and there is less sympathy for young adults shot while engaged in illegal activities than there is for actual children.
I would suggest that an important step in reducing fake news on social media is for the mainstream media to establish a reputation for trust, such that when people read a claim on social media, they immediately ask themselves, “Why haven’t I seen this on the ‘trusted’ news sites?” Instead, people tend to characterize ‘news’ as trustworthy if it comes from a source that the reader identifies as supportive of their personal political viewpoint. What is needed is for everyone to trust all news sources equally; that is, objectivity in news is crucial to establishing trust.
“Heal thyself, physician!”
Luckily, we have Twitter leading the way in encouraging truthful news with their Community Notes system.
The expectation that mass media will be as balanced and analytical as government reports or scientific journals is unrealistic. Media wants reader engagement.
The U.S. is still number one in the world for school shootings; it is also among the top five for teenagers and young adults killed in gang shootings.