Photo Credit: Flickr
In what is being hailed as a breakthrough move, Facebook recently opened up its immense data trove to a team of 17 academic researchers. The goal was simple: find out what was driving the spread of disinformation on the platform, and then run a few experiments to see what could be done to fix it. The researchers focused specifically on Facebook data from the run-up to the 2020 presidential election.
To some degree, Facebook is still pushing back against the idea that it played a significant role in the toxicity of the 2020 presidential campaign. Opening up the data to researchers was, admittedly, done partly with an eye toward clearing up that question. Looking ahead, Facebook doesn’t want to get dragged into debates about disinformation and ideological polarization during the 2024 presidential campaign.
Key findings
One key finding, as might be imagined, was that “untrustworthy” news sources were overwhelmingly favored by political conservatives. Translation: if you voted for Donald Trump in 2020, you were probably basing your decision on news sources outside of the mainstream media. The subtext here, of course, is that Trump’s followers are either too stupid or too lazy to find out the truth, and were led down the wrong path by disinformation, misinformation, and just plain ol’ lies. In fact, say the researchers, 97% of the “false” information (as rated false by independent third-party fact-checkers) was seen more by conservatives than by liberals.
So what was causing all this misinformation and disinformation to spread among Trump backers? One factor, say the researchers, was the “echo chamber” effect. In other words, if all of your friends and followers are spreading the same information, you are more likely to share it yourself. It’s all very alarming, because it turns out to be much harder to stop the spread of this information than originally thought. And, if enough people start thinking a certain way, it can impact the future of our democracy.
Key steps taken by Facebook
The good news, say the researchers, is that some of the steps taken by Facebook seem to be working. For example, one way to reduce the echo chamber effect is to make it a lot harder to share content. The simplest way to do that, of course, is not to show polarizing content in the first place. So Facebook has been actively looking for ways to cut down on the amount of political content in its news feed.
Another key factor, say the researchers, is the ability to share information within groups with similar political leanings. Facebook could reduce ideological segregation and polarization, they say, by reducing the prominence of Pages and Groups, or, at the very least, by cutting down on the amount of content shared by people who belong to them. One way to do this is to change how information is shown to users in the news feed: by using a strict chronological feed, it’s possible to reduce the “stickiness” of some kinds of disinformation, as the sketch below illustrates.
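To make the chronological-feed idea concrete, here is a minimal sketch in Python. The `Post` record and its `engagement_score` field are hypothetical illustrations of the general technique, not Facebook’s actual code, ranking system, or data schema.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical post record; field names are illustrative, not Facebook's schema.
@dataclass
class Post:
    author: str
    created_at: datetime
    engagement_score: float  # e.g., a weighted count of likes, comments, reshares

def ranked_feed(posts: list[Post]) -> list[Post]:
    """Engagement-ranked feed: the most-interacted-with posts float to the top,
    which is what lets highly shareable (and sometimes false) content snowball."""
    return sorted(posts, key=lambda p: p.engagement_score, reverse=True)

def chronological_feed(posts: list[Post]) -> list[Post]:
    """Strict reverse-chronological feed: newest first, engagement ignored,
    so a viral post gets no extra boost from how often it is shared."""
    return sorted(posts, key=lambda p: p.created_at, reverse=True)
```

The only difference between the two functions is the sort key: switching from engagement to timestamp removes the feedback loop in which sharing a post makes it even more visible.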
The 3% problem
After crunching all the data, the team of 17 researchers determined that approximately 3% of all posts shared on the platform are political in nature. During the peak of an election season, one would expect that figure to trend higher. Still, the overwhelming majority of people on Facebook are having discussions about what they ate for lunch or dinner, not about highly polarizing political topics. So, when it comes to the amount of political content being shared, there’s probably not much more Facebook can realistically do to push that 3% figure even lower.
The good news is that Facebook can make positive changes to its platform without resorting to outright censorship. Facebook shouldn’t try to silence certain groups entirely – it just needs to find ways to reduce the spread of disinformation that are as democratic as possible. For example, changing the way the news feed is presented is one way to level the playing field for everyone without changing the actual content being shared on the platform.