Elham Asaad Buaras
Facebook has apologised after a report it commissioned found the tech giant responsible for spreading anti-Muslim hate speech and rumours that may have led to violence against Muslims in Sri Lanka two years ago.
The riots in early 2018 erupted as anti-Muslim anger was whipped up on social media, forcing the Sri Lankan Government to impose a state of emergency and block access to Facebook.
The tech giant commissioned a probe by Article One, a human rights consultancy, into the part it may have played, and investigators said incendiary content on Facebook may have led to violence against Muslims.
According to the findings of the investigation, the hate speech and rumours spread on the platform ‘may have led to “offline” violence’.
“We deplore the misuse of our platform,” Facebook said in a statement to Bloomberg News after the findings were released on May 12. “We recognise, and apologise for, the very real human rights impacts that resulted.”
At least three people were killed and 20 injured in the unrest, during which mosques and Muslim businesses were burned, mainly in the central part of the Sinhalese Buddhist-majority nation.
The anti-Muslim riots began in the town of Ampara on February 26, 2018, spread to the Kandy District by March 2, and ended on March 10.
The consultants also suggested that before the unrest, Facebook had failed to take down such content, which resulted in hate speech and other forms of harassment remaining and even spreading on the platform. Article One said one civil society organisation had tried to engage with the company on the misuse of Facebook as far back as 2009.
In 2018, officials said mobs used Facebook to coordinate attacks, and that the platform had “only two resource persons” to review content in Sinhala, the language of Sri Lanka’s ethnic majority whose members were behind the violence.
Facebook has 4.4 million daily active users in Sri Lanka, according to the report by Article One.
The firm said it had taken several steps in the last two years to better protect human rights. “In Sri Lanka… we are reducing the distribution of frequently re-shared messages, which are often associated with click-bait and misinformation,” Facebook said in a statement accompanying the report. It said it had also hired more staff, including Sinhala speakers, and started using detection technology to protect vulnerable groups.