Whether you realize it or not, if you have a Facebook account, you have been part of a massive social experiment over the past few years. As part of that experiment, Facebook continues to make small tweaks to its newsfeed algorithm in order to influence user behavior.
Of course, the goal of this unofficial Facebook experiment was a worthy one – to show that connecting us to more and more of our friends could make us all wiser, more tolerant and happier. However, it’s time to admit that the Facebook newsfeed has had exactly the opposite effect – it has made us more narrow-minded, less tolerant and less happy.
Enter the Law of Unintended Consequences
How is that possible? Economists and social scientists have known for quite some time about the Law of Unintended Consequences. Simply stated, laws, regulations and social rules often end up having effects their designers never intended, and those effects are notoriously hard to predict in advance.
For example, lawmakers have found that simple rules such as fishing and hunting quotas can actually lead to declining – not increasing – populations of fish and game. Rules on pollution emissions can actually lead to more pollution, not less. The list goes on and on – and now it looks like Facebook has run smack-dab into the Law of Unintended Consequences.
Why optimizing for engagement doesn’t work
When it comes to the newsfeed algorithm, Facebook had to decide which parameters to optimize, and it ultimately settled on “engagement” as the key one. That intuitively makes sense – Facebook wants people to use the social network as often as possible, so it constantly looks for ways to boost your interactions, and engagement is how it keeps score.
So let’s see how that applies to your newsfeed. If your feed were full of updates from people you barely know, or content you never clicked on, you would probably conclude that Facebook was irrelevant to your life, and you might stop using it. Facebook therefore constantly looks for ways to show you what you want to see: content from your closest connections, and the kind of content you will click on, like and interact with.
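To make that mechanism concrete, here is a minimal sketch of what engagement-optimized ranking could look like. None of this is Facebook’s actual code or signals: the Post fields, the weights and the helper functions are made-up assumptions for illustration.

```python
# A hypothetical sketch of engagement-optimized ranking; not Facebook's actual code.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    tie_strength: float          # assumed signal: how close you are to the author (0..1)
    predicted_engagement: float  # assumed signal: estimated chance you click/like/comment (0..1)

def engagement_score(post: Post) -> float:
    # Weight familiar authors and clickable content; the weights are purely illustrative.
    return 0.4 * post.tie_strength + 0.6 * post.predicted_engagement

def rank_feed(candidates: list[Post]) -> list[Post]:
    # Highest-scoring posts first: close connections and content you are likely
    # to interact with float to the top, everything else sinks.
    return sorted(candidates, key=engagement_score, reverse=True)

# Example: the post from a close friend with a high predicted click rate wins.
feed = rank_feed([
    Post("acquaintance", tie_strength=0.2, predicted_engagement=0.3),
    Post("close_friend", tie_strength=0.9, predicted_engagement=0.8),
])
print([p.author for p in feed])  # ['close_friend', 'acquaintance']
```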
The Facebook feedback loop
But that’s where the Law of Unintended Consequences kicks in. By clicking, you are self-selecting content that reinforces what you already believe. You are consolidating your ties to an “in group” or “inner circle” around you. And Facebook has to show you more and more outrageous content – some would call it extreme – to keep you clicking. (In the social experiment analogy, you’re now the lab rat, furiously clicking buttons to feed the addiction.)
And all of that behavior creates a very specific type of feedback loop. It creates a “filter bubble” where everyone is saying exactly what you think (Trump is evil! Clinton is corrupt! Bernie Sanders will save us!). And it creates the perfect environment for “fake news” and outrageous content to live and prosper on Facebook. If hundreds of your friends are clicking on something, then Facebook will show it to you, too.
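You can watch that loop in a toy simulation. Everything below is invented for illustration (the viewpoints, the click model, the numbers); it simply shows how ranking by past clicks steadily pushes a feed toward whatever the user already agrees with.

```python
# A toy filter-bubble simulation, purely illustrative; no real ranking system is this simple.
import random

VIEWPOINTS = ["left", "right", "center"]

def simulate_feedback_loop(user_preference: str, rounds: int = 20, feed_size: int = 5) -> list[float]:
    clicks = {v: 1 for v in VIEWPOINTS}  # start with essentially no click history
    agreement_share = []
    for _ in range(rounds):
        # Rank a random pool of candidate posts by how often the user clicked that viewpoint before.
        candidates = [random.choice(VIEWPOINTS) for _ in range(50)]
        feed = sorted(candidates, key=lambda v: clicks[v], reverse=True)[:feed_size]
        # The user mostly clicks posts that match what they already believe.
        for viewpoint in feed:
            if viewpoint == user_preference or random.random() < 0.1:
                clicks[viewpoint] += 1
        agreement_share.append(feed.count(user_preference) / feed_size)
    return agreement_share

# The share of the feed matching the user's own viewpoint climbs toward 1.0 within a few rounds.
print(simulate_feedback_loop("left"))
```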
At some point, though, Facebook has to take ownership of this problem. Not just by making little cosmetic changes here and there to the newsfeed algorithm – but by admitting that some social experiments need a do-over. It happens in the lab all the time, and now it needs to happen #IRL.