Increasingly, Google is playing an important role in political elections. The Silicon Valley giant is controlling the political narratives heading into the election, taking steps to change the way people view different issues, and even going so far as to discourage people from exploring search results on their own. In some cases, Google is just flat-out giving you the answer it wants you to have, even if the answer is incorrect. That’s just plain dangerous for democracy.
The end of the Google search era
Recently, WIRED published a thought-provoking op-ed from an assistant professor at UNC-Chapel Hill. The piece (“Google Search Is Quietly Damaging Democracy”) made a compelling case that Google has almost imperceptibly shifted its focus away from pure exploratory search. Remember the good old days when you’d go to Google.com, enter a brief query, and Google would present you with a rich selection of possible links to explore? The burden was then on you to find the right link, read the full article, and come to your own conclusions.
But now the era of pure search appears to be over. In an effort to streamline the search process, Google tries to guide you to your final destination as soon as you start typing your query. Within a few keystrokes, it attempts to anticipate what you might be searching for, completing your thought with its autocomplete feature. Then, whenever it can, it simply gives you the answer to your question at the top of the search results page.
This might sound like good news (“Wow! I don’t have to read through all those links?”), but there is plenty of room for error here. There’s an old saying in the tech world: “Garbage in, garbage out.” And that appears to be what is happening in many cases with Google. If wrong or imprecise data is used to answer a question, Google doesn’t seem to care. In some cases, that means you might get the wrong date for an upcoming primary election, or wrong information about a piece of legislation. Google is simply going to specific sources and grabbing what it can, as fast as it can.
Are bots telling us how to vote?
It gets even scarier than that, because Google sometimes seems to be telling us how to think about political issues or politicians. By presenting particular links at the top of the search results, it can promote the point of view it wants you to have on an issue. If you scrolled to the bottom of the page, you might come to a different conclusion.
Algorithms, by themselves, are not dangerous. It is the humans controlling the algorithms who can be dangerous. If they decide that certain sources are “misinformation” or “propaganda,” you might never get to the truth of an issue, because Google could decide not to show you those results. And since Google has a reputation as a trusted, reliable source, we would probably never question what it shows us. Realistically, how many times have you gone to the second page of search results? Deep down, you believe that Google is some sort of all-knowing source that will find you the right information.
The next big election
So it’s not going too far to say that Google could actually tip the scales in the next big election. It could return incorrect results for a search query. It could lend credibility to unsubstantiated claims by promoting certain links. And it could be manipulated by unscrupulous actors who “game” the algorithm in order to spread falsehoods about certain candidates.
We all know about “October Surprises” in American political life, and Google could be the deciding factor in how we react to such a surprise in 2022 or 2024. So do your research. Don’t rely on Google to do all the heavy lifting for you. And, whatever you do, get out there and vote this year!