This research project has developed an evidence-based approach for classifying violent extremist content that appears on Facebook. Its aim is to feed into Facebook’s broader policy work around content review and assist in its ability to forecast when intervention or escalation may be necessary.
Facebook’s Community Standards state:
“We are committed to removing content that encourages real-world harm, including (but not limited to) physical, financial and emotional injury. [Furthermore,] we do not allow content that praises any of the above organisations or individuals or any acts committed by them.”
As such, Facebook tries to remove all materials produced by proscribed violent extremist organisations, unless they are being disseminated in very specific contexts (e.g., by journalists or academics). In recent years, these efforts have become highly effective, with almost all terrorist content being removed from the platform before it has even been reported.
However, not all terrorist content is created equal. The removal of some materials needs to be prioritised over the removal of others. Because Facebook is committed to using technology to detect harmful content but human reviewers to make the final decision as to what happens to it, it faces a serious challenge regarding how, when, and what to prioritise for review. Somehow, a distinction needs to be made between the materials that need fast-tracking for review and the materials that can be queued.
This project developed a way to help Facebook make that distinction in a timely and accurate fashion. Using the Islamic State as an initial test case, but with a view to future, cross-ideological applications, it adds nuance to how harmful content is classified. Its basic premise is that it is possible to codify the intent behind harmful content by studying, and then testing, the logic of its production. If intent can be identified, that is, if a clear distinction can be drawn between tactical, action-based content and strategic, brand-based content, then content can be better prioritised for review according to the risk it poses.