Social Media Failing to Identify and Remove Extremism


The Counter Extremism Project (CEP) has issued a report, The Extreme Right on Facebook, asserting that the social media giant is failing to enforce its own Community Standards for allowable content because it has not removed pages representing extremist groups.

The Facebook Community Standards state that the site does not allow content “promoting or publicizing violent crime”; does not allow groups who engage in terrorist activities or organized hate; and does not allow pages to “express support or praise for groups, leaders, or individuals involved in these activities.”

The CEP identified and monitored 40 Facebook pages belonging to neo-Nazi and white supremacist groups for two months. In that time, five of the pages were removed by Facebook; CEP reported the remaining pages to Facebook, which resulted in the removal of only four more.

The CEP contends that Facebook only monitors pages in reaction to complaints of extremism and hate speech, does not have an adequate reporting system, and is not proactive enough in preventing extremist groups from operating pages on its platform. The CEP believes that "more must be done to stop extremist radicalization" on social media and calls for Facebook to improve its reporting features and analyst training so that questionable pages violating the Community Standards are identified and removed more readily.

As long as extremist groups continue to produce compelling propaganda that inspires and incites individuals to violence, and that propaganda remains easily accessible online, terrorism in the name of these groups will remain a threat worldwide.

More resources on extremism can be found at the Homeland Security Digital Library (HSDL).


Note: you may need to login to the HSDL to view some resources mentioned in the blog.

Need help finding something?  Ask our librarians for assistance!
