Social media makes people more narrow-minded, study finds

Though ideally a forum where people with different viewpoints can interact and cultivate mutual understanding, Facebook and other social media platforms may actually make us more isolated, creating “echo chambers” that confirm our existing biases, according to a new study.

Writing in the Proceedings of the National Academy of Sciences, Alessandro Bessi, a researcher at the University of Southern California’s Information Sciences Institute, and colleagues reported that social media users “mostly tend to select and share content related to a specific narrative and to ignore the rest. In particular, we show that social homogeneity is the primary driver of content diffusion, and one frequent result is the formation of homogeneous, polarized clusters.”

These homogeneous clusters or “echo chambers,” they said, are primarily the result of “selective exposure to content,” and in most instances the information “is taken by a friend having the same profile (polarization) – i.e., belonging to the same echo chamber.” To put it another way, you and your friends tend to share the same information, even if it is fake news, because you tend to think alike and rarely engage with ideas that run counter to your viewpoints.

“Echo chambers” on social media reinforce viewpoints (Credit: Freestock.com/Unsplash.com)

Bessi’s team came to their conclusion after mapping the spread of two different types of content: scientific information and conspiracy theories, according to CNN. The study authors emphasized that they did not focus on the quality of the information being shared, but on whether it could be easily verified, that is, whether it was based on identifiable scientific data, methods, and outcomes.

“Our analysis showed that two well-shaped, highly segregated, and mostly non-interacting communities exist around scientific and conspiracy-like topics,” Bessi told CNN. “Users show a tendency to search for, interpret, and recall information that confirms their pre-existing beliefs… Indeed, we found that conspiracy-like claims spread only inside the echo chambers of users that usually support alternative sources of information and distrust official and mainstream news.”

Using algorithms to combat ‘fake news’ may not be the answer

Bessi explained that he and his fellow researchers became interested in the subject after the World Economic Forum named the spread of digital misinformation one of the main threats to modern society, a designation that has driven Facebook, Google, and other online platforms to look for ways to stop the spread of so-called “fake news” without infringing on the free exchange of ideas.

Everyone is subject to some form of confirmation bias, even if they pride themselves on being open-minded and accepting of other people’s viewpoints, Bessi told CNN. “If we see something that confirms our ideas,” he said, “we are prone to like and share it. Moreover, we have limited cognitive resources, limited attention, and a limited amount of time… I may share content just because it has been published by a friend that I trust and whose opinions are close to mine.”

That tendency can be dangerous, he warned. While programs or algorithms may someday be able to slow the spread of such misinformation, for now the researchers encourage social media users to critically evaluate and fact-check content themselves before sharing it.

“Users tend to aggregate in communities of interest, which causes reinforcement and fosters confirmation bias, segregation, and polarization. This comes at the expense of the quality of the information and leads to the proliferation of biased narratives fomented by unsubstantiated rumors, mistrust, and paranoia,” they wrote.
