Harun Nasrullah
A new report from the Centre for Countering Digital Hate (CCDH) claims that the largest social media platforms are failing to act on nearly 9 out of 10 anti-Muslim and Islamophobic posts on their sites.
The research, published on April 28, identified 530 posts on Facebook, Twitter, Instagram, YouTube, and TikTok, viewed a combined 25 million times, that contained dehumanising content about Muslims, including racist caricatures, conspiracy theories, and false claims.
Kemi Badenoch, the UK’s Minister for Communities and Equalities, called on social media companies “to do more to take meaningful action against all forms of hatred and abuse their users experience online.”
The reported content included Instagram posts that depicted Muslims as pigs and called for their expulsion from Europe, a comparison of Islam to a cancer that should be “treated with radiation”, overlaid on a photo of an atomic blast, tweets claiming that Muslim migration was part of a plot to change the politics of other countries, and many more.
The CCDH used offensive hashtags like #deathtoislam, #islamiscancer, and #raghead to identify posts to report.
The CCDH reported 125 posts to Facebook, with only seven acted on; 227 to Instagram, with only 32 acted on; 50 to TikTok, with 18 acted on; 105 to Twitter, with only three acted on; and 23 videos submitted to YouTube, none of which were acted on.
Facebook also hosted numerous groups dedicated to Islamophobia, with names such as “ISLAM means Terrorism”, “Stop Islamization of America”, and “Boycott Halal Certification in Australia”.
Many of these groups have thousands of members, with 361,922 members in total, predominantly in the UK, US, and Australia. Researchers also identified 20 posts featuring the Christchurch terrorist. Of these, just six were acted upon, despite Facebook, Instagram, and Twitter making public commitments to remove terrorist and extremist content.
The Christchurch shooter also published a 74-page manifesto that railed against Muslims and immigrants, which spread quickly online.
At the time, Facebook said it removed 1.5 million videos showing the New Zealand mosque attacks in the first 24 hours following the mass shootings.
The video, which was streamed on Facebook, was originally viewed 4,000 times, with social media sites struggling to take down reuploaded footage.
Many of the uploaders made small modifications to the video, such as adding watermarks or logos or altering the size of the clips, to defeat platforms’ ability to detect and remove it.
(Report cover courtesy of the CCDH)