This research project has developed an evidence-based approach for classifying violent extremist content that appears on Facebook. Its aim is to feed into Facebook’s broader policy work around content review and to help it forecast when intervention or escalation may be necessary.
Facebook’s Community Standards state:
“We are committed to removing content that encourages real-world harm, including (but not limited to) physical, financial and emotional injury. [Furthermore,] we do not allow content that praises any of the above organisations or individuals or any acts committed by them.”
As such, Facebook tries to remove all materials produced by proscribed violent extremist organisations, unless they are being disseminated in very specific contexts (e.g., by journalists or academics). In recent years, these efforts have become highly effective, with almost all terrorist content being removed from the platform before it has even been reported.
However, not all terrorist content is created equal. The removal of some materials needs to be prioritised over the removal of others. Because Facebook uses technology to detect harmful content but relies on human reviewers to make the final decision about what happens to it, it faces a serious challenge regarding how, when, and what to prioritise for review. A distinction needs to be made between materials that require fast-tracking for review and materials that can be queued.
This project developed a way to help Facebook make that distinction in a timely and accurate fashion. Using the Islamic State as an initial test case—but with a view to future, cross-ideological applications—it adds nuance to how harmful content is classified. Its basic premise is that it is possible to codify the intent of harmful content by studying, and then testing, the logic behind its production. If intent can be identified—i.e., if a clear distinction can be drawn between tactical, action-based content and strategic, brand-based content—then it will be possible to prioritise content according to the risk it poses.
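The triage logic this premise implies can be sketched in code. The sketch below is purely illustrative: the labels ("tactical" vs. "strategic"), the confidence threshold, and the data structure are assumptions made for the example, not details of Facebook's actual review system or of this project's classifier.

```python
from dataclasses import dataclass

@dataclass
class FlaggedItem:
    """A piece of detected content awaiting human review (illustrative)."""
    item_id: str
    intent: str       # "tactical" (action-based) or "strategic" (brand-based)
    confidence: float # classifier confidence in the intent label, 0.0-1.0

def review_priority(item: FlaggedItem, threshold: float = 0.7) -> str:
    """Fast-track tactical, action-based content; queue strategic material.

    The 0.7 threshold is an arbitrary placeholder, not a real operating point.
    """
    if item.intent == "tactical" and item.confidence >= threshold:
        return "fast-track"
    return "queue"

# Example: a high-confidence tactical item jumps the queue; a strategic,
# brand-based item waits for routine review.
items = [
    FlaggedItem("a1", intent="tactical", confidence=0.92),
    FlaggedItem("a2", intent="strategic", confidence=0.85),
]
print([review_priority(i) for i in items])  # ['fast-track', 'queue']
```

The point of the sketch is the separation of concerns: automated detection assigns an intent label and a confidence score, while the prioritisation rule only decides which human review queue the item enters.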