How Facebook guides moderators on terrorist content
Warning! Some readers may find this content disturbing.
Terrorism is a priority topic for Facebook, and it has invested in software to try to stop some extremist content from ever getting on the site. This proactive screening has cut the number of offensive posts. But moderators have told the Guardian that a vast amount of terrorist-related content still gets through, and that terrorists have found ways of bypassing the strict rules.
The files seen by the Guardian show how moderators need to look at the captions as well as the images themselves. Often it is the words, rather than the picture, that lead to a post being removed. We cannot show the most graphic images used to train moderators – but one shows a man shot in the head, lying in a pool of blood. This can be posted, as long as the caption accompanying it is condemning rather than celebratory.
Posts in support of a terrorist attack, such as the one in Nice, must be deleted, along with any ‘non-explicit’ praise posted in the week after the attack.