In the aftermath of the hotly contested 2020 presidential election, Facebook came under enormous scrutiny. Academics, pundits, and politicians blamed the social media platform for spreading dangerous disinformation and for enabling highly polarized, ideological content to go viral. From their perspective, Facebook played an integral role in letting things get out of hand in 2020.
But now comes a bit of revisionist history, if you will. Facebook partnered with a group of academic researchers who recently published a full-length study in the journal Science on the impact of resharing political content on Facebook. They divided participants into two groups: a control group, whose Facebook news feeds were left unchanged, and an experimental group, whose feeds showed no reshared content at all for three months. Some of the findings might surprise you.
Results of the study
As might be expected, eliminating reshared and reposted content immediately reduced the amount of political news showing up in the news feed. It also meant less news from “untrustworthy” sources and fewer clicks on “partisan,” highly ideological news sources. Overall engagement, as measured by clicks, reactions, and reshares, dropped as well. And, finally, participants showed a sharp decline in “news knowledge.”
All of this makes perfect sense. Reshared and reposted posts make up a large share of what circulates on the platform, so if you remove them from a news feed, you’d expect to see less political news and less news from biased or untrustworthy sources. And you’d probably expect people to be a bit less knowledgeable about what’s going on in the political arena: instead of clicking on hard-hitting posts about immigration policy and the Mexican border wall, they’re clicking on posts about what their aunt or uncle had for dinner last night.
But here’s what is not intuitive or obvious: the researchers found that, “contrary to expectations,” reshared content does not significantly affect political polarization or the political views individuals hold. Most likely, this is because most people live in echo chambers these days and only want to see content that agrees with their views. If you already feel strongly about a topic, simply seeing more posts on that topic isn’t going to have much impact; if anything, it will only harden your views.
What does it all mean?
The first thing to keep in mind is that Facebook’s parent company, Meta, paid for all of this research; it says so right at the bottom of the Science article. Maybe this sounds too cynical, but you immediately suspect that the results are somehow going to be favorable to Facebook. And indeed, the study seems to show that reposted content doesn’t matter as much as people think it might. The article is very detailed, but the point is made at the end: no, this content does not change a person’s views.
So how does that benefit Facebook? Well, it exculpates Facebook, to a certain degree, from the allegations that it was somehow responsible for all the polarization around the 2020 election. The study says it all: reposted and reshared content does not significantly affect political polarization, and seeing disinformation reposted in your feed won’t change your political views.
The results of the study seem to say that simply amplifying the news via reposts and reshares isn’t as harmful as originally thought. Let’s just hope that’s also the case for the upcoming 2024 presidential election, because things are already starting to look nasty, and we still have a year to go.