One of the useful features of search engines like Google is the autocomplete function that allows users to quickly find answers to their questions or queries. However, autocomplete search functions are based on opaque algorithms that have been criticized for often producing biased and racist results.
The opacity of these algorithms stems from the fact that most of us know very little about them, which has led some to refer to them as “black boxes.” Search engines and social media platforms do not provide meaningful insight or details about the nature of the algorithms they use. As users, we have the right to know the criteria used to produce search results and how they are customized for individual users, including how people are labelled by Google’s search engine algorithms.
To do this, we can use a reverse-engineering process, performing multiple online searches on a specific platform to better understand the rules in force. For example, the hashtag #fentanyl can currently be searched and used on Twitter, but not on Instagram, indicating the kind of rules in place on each platform.
When searching for celebrities on Google, the results often include a short caption and thumbnail image associated with the person, both automatically generated by Google.
Our recent research showed how Google’s search engine normalizes conspiracy theorists, hate figures and other controversial people by offering neutral and sometimes even positive subtitles. We used virtual private networks (VPNs) to conceal our locations and hid our browsing histories to ensure that search results were not based on our geographic location or search history.
For example, we found that Alex Jones, “the most prolific conspiracy theorist in America today,” is defined as an “American radio host,” while David Icke, who is also known for spreading conspiracies, is described as an “ex-footballer.” These terms are considered by Google to be the defining characteristics of these individuals and can mislead the public.
In the short time since our survey was conducted in the fall of 2021, search results appear to have changed.
I found that some of the subtitles we originally identified have since been changed, removed or replaced. For example, Norwegian terrorist Anders Breivik was subtitled “convicted criminal,” but now no label is attached to him.
Faith Goldy, the far-right Canadian white nationalist banned from Facebook for spreading hate speech, had no subtitle. Now, however, her new Google subtitle is “Canadian commentator.”
There is no indication of what kind of commentator she is. The same can be observed with American white supremacist Richard B. Spencer. Spencer had no label a few months ago, but is now an “American editor,” which certainly does not capture his legacy.
Another change concerns Lauren Southern, a Canadian far-right figure, who was previously labelled a “Canadian activist,” a somewhat positive term, but is now described as a “Canadian author.”
The apparently random subtitle changes show that the programming of the algorithmic black boxes is not static, but changes based on a number of indicators as yet unknown to us.
Search in Arabic vs English
A second important finding from our study relates to differences in subtitle results based on the selected search language. I speak and read Arabic, so I changed the language setting and searched for the same figures to understand how they are described in Arabic.
To my surprise, I discovered some major differences between English and Arabic. Again, there was nothing negative in the descriptions of some of the figures I searched for. Alex Jones becomes a “TV talk show host,” and Lauren Southern is mistakenly described as a “politician.”
And there’s a lot more from the Arabic-language searches: Faith Goldy becomes an “expert,” David Icke transforms from a “former football player” into an “author,” and Jake Angeli, the “QAnon shaman,” becomes an “actor” in Arabic and an “American activist” in English.
Richard B. Spencer becomes a “publisher,” and Dan Bongino, a conspiracy theorist permanently banned from YouTube, transforms from an “American radio host” in English into a “politician” in Arabic. Interestingly, the far-right figure Tommy Robinson is described as a “British-English political activist” in English, but has no subtitle in Arabic.
What we can deduce from these language differences is that these descriptors are inadequate, because they condense a person’s description into one or a few words that can be misleading.
It is important to understand how algorithms work, especially as disinformation and mistrust are on the rise and conspiracy theories continue to spread rapidly. We also need more insight into how Google and other search engines operate, and it is important to hold these companies accountable for their biased and opaque algorithms.
This article by Ahmed Al-Rawi, Assistant Professor, News, Social Media and Public Communication, Simon Fraser University, has been republished from The Conversation under a Creative Commons license. Read the original article.