Elham Asaad Buaras
Google has altered auto-complete suggestions in its search engine after it was alerted to anti-Semitic and sexist entries, but only corrected the anti-Muslim suggestion after being challenged by The Muslim News.
Google’s auto-complete feature suggests common searches after a user enters a few words into the search box. The auto-complete predictions are generated algorithmically based on users’ common searches.
Previously, typing the question ‘are Jews…’ or ‘are women…’ generated the auto-suggestion ‘evil’, while ‘are Muslims…’ auto-suggested ‘bad’. By December 5, the searches for Jews and women no longer returned those negative results, but the ‘are Muslims bad’ auto-complete remained for a further week.
Despite telling The Guardian they “took action within hours of being notified” of the anti-Semitic and sexist auto-complete results, a spokesman for the web giant said they “cannot comment” when asked by The Muslim News why it had failed to remove the anti-Muslim auto-complete results at the same time.
This is not the first time Google auto-complete and search algorithms have caused offence.
In February, Google was found to be autocorrecting searches so that they read ‘Muslims support terrorism’: users searching for ‘Muslims report terrorism’ were told they might instead have been looking for ‘Muslims support terrorism’.
In 2014 Google took action after its search engine was found to be suggesting vile racist terms when users searched for a number of UK cities including Bradford, Leicester, and Birmingham.
The search engine was found to be making crude and offensive suggestions when users typed in ‘Why is…’ followed by the name of a city.
For example, if a user typed in the phrase ‘Why is Bradford…’, the site automatically suggested the search ‘Why is Bradford so full of P****.’ Typing the phrase with Leicester or Birmingham produced similar results.