Facebook has been forced to defend its policy on terrorist content after it was accused of being too slow to remove it.
Monika Bickert, the company’s Head of Global Product Policy, has insisted ‘there is no place on Facebook for terrorists, terrorist propaganda or the praising of terror’.
It comes after a Change.org petition criticising the social network site amassed more than 135,000 signatures.
Titled ‘Dear Facebook, thanks for the “Safety-Check”, yet on fighting ISIS, you can do much better!’, it said Facebook was not quick enough to remove ‘sick jihadi accounts’ that were posting messages of support for the Paris terrorists in the immediate aftermath of the attacks.
It says: ‘Every 5 minutes, ISIS bot accounts sent the statement claiming responsibility for the attacks, along with links to pro-ISIS accounts with a bloody picture of the Bataclan massacre as their header image. As for the messages’ content, they sneered at us, gave praise to their “brave lions” (the suicide bombers), kept threatening us and posting jihadi propaganda videos.’
The messages were eventually removed, yet the petition’s author, Julie Guilbault, said Facebook could have done so sooner.
She contrasted the site’s response to pornography – which is detected and removed – with the fact that ‘when it comes to advocating terrorism and publishing decapitation videos: no worries, they enjoy a comfortable delay before the content or their account will be deleted’.
But in response to the accusations on Tuesday, Miss Bickert posted: ‘We work aggressively to ensure that we do not have terrorists or terror groups using the site.’
She said Facebook relies on users to report terrorist content. It is then ‘reviewed by a highly trained global team with expertise in dozens of languages’, which ‘prioritizes any terrorism-related reports for immediate review’.
She added: ‘We remove anyone or any group who has a violent mission or who has engaged in acts of terrorism. We also remove any content that expresses support for these groups or their actions.’
Miss Bickert went on: ‘When a crisis happens anywhere in the world, we organize our employees and, if necessary, shift resources to ensure that we are able to respond quickly to any violating content on the site. For instance, in the wake of the recent attacks in Paris, we also reached out immediately to NGOs, media and government officials to obtain the latest information so that we were prepared to act quickly.’
But she pointed out that users sometimes share ‘upsetting content’ for legitimate reasons, such as to raise awareness of an issue.
‘Many people in volatile regions are suffering unspeakable horrors that fall outside the reach of media cameras. Facebook provides these people a voice, and we want to protect that voice.’
The response adds: ‘If Facebook blocked all upsetting content, we would inevitably block the media, charities and others from reporting on what is happening in the world… However, we remove any graphic images shared to promote or glorify violence or the perpetrators of violence.’
Source: “Yahoo News”