Lord Ahmad at the UN meeting (Photo: Foreign & Commonwealth Office)
Ahmed J Versi
Prime Minister Theresa May told the United Nations General Assembly on September 20 that tech firms needed to develop the capacity to take down terrorist-related material within two hours.
However, tech and social media companies responded that they already have countermeasures in place against terrorists using their platforms.
Brian Fishman, who manages Facebook’s global counter-terrorism policy, insisted in an interview published a few hours later by the CTC Sentinel, the journal of the Combating Terrorism Center at the US Military Academy at West Point, that companies such as his were already putting great effort into this work.
May wants internet firms to “develop new technological solutions to prevent such content being uploaded in the first place,” as she said on the sidelines of the UN meeting with French President Emmanuel Macron, Italian Prime Minister Paolo Gentiloni, and representatives of Facebook, Microsoft and Twitter.
The Government insists that more needs to be done. In an exclusive interview with The Muslim News from the UN, after a meeting on preventing terrorist use of the internet, Foreign Office Minister Lord Ahmad of Wimbledon said Daesh-related extremist content “was online for months before it got removed.” May said such material should be taken down within two hours. “The sooner we can remove the material the more effective it is,” said Lord Ahmad.
“Terrorist groups are aware that the links to their propaganda are being removed more quickly, and are placing a greater emphasis on disseminating content at speed in order to stay ahead,” May told the tech companies. She emphasised that the industry needs to go “further and faster in automating the detection and removal of terrorist content online and developing technological solutions which prevent it being uploaded in the first place.”
Lord Ahmad said the internet companies are already tackling extremists online. Twitter suspended 299,649 accounts between January 1 and June 30 this year, and 75% of those accounts were suspended before their first Tweet, he said.
Even after terrorist groups are shut down on the mainstream internet, they are able to reactivate their accounts under a different name or find new life on the Dark Web. “That is why we need internet companies on board, because they are the ones with technological expertise and insight into how to tackle this,” said Lord Ahmad.
Research by criminologists has shown that denying extremist groups online platforms does not prevent their presence online. Attempts to shut down hate speech online “may cause a backlash, worsening the problem and making hate speech groups more attractive to marginalised and stigmatised groups.” (“Can taking down websites really stop terrorists and hate groups?”, The Conversation, September 15)
Lord Ahmad argued that this should not distract “us as Government, with society and the private sector, from cutting off the oxygen of the online extremist narrative.” Yet even though Facebook claims it removes only extremist material from its site, it has removed legitimate pages and posts, claiming they don’t follow “Facebook Community Standards.”
For example, Rohingya activists in Myanmar and in Western countries say Facebook has been removing their posts documenting the ethnic cleansing of Rohingya people in Myanmar, and their accounts are frequently suspended or taken down. (Daily Beast, September 18)
Lord Ahmad explained that “we have to balance what is quite legitimate use of social media to highlight violations and legitimate viewpoints, but at the same time ensure we balance that with security needs and ensure people are not radicalised.”