Facebook is blocking more than 4,000 people and groups the company considers dangerous, including white supremacists, militarized social movements and alleged terrorists. On Tuesday, The Intercept published a leaked list of the dangerous individuals and organizations Facebook does not allow on its platform, offering insight into how the social network moderates content that can lead to offline violence.
More than half of the list consists of alleged foreign terrorists, who are predominantly Middle Eastern, South Asian and Muslim. Experts told The Intercept that both the list and Facebook’s enforcement policy suggest the company imposes stricter restrictions on marginalized groups.
Facebook uses a three-tier system that determines how the company enforces its rules against a given group’s content. Tier 1, the most restrictive, covers terrorist groups, hate groups and criminal organizations. Tier 3, the least restrictive, includes militarized social movements, which The Intercept said “are mostly right-wing US anti-government militias, which are largely all white.”
Brian Fishman, Facebook’s policy director for counterterrorism and dangerous organizations, said in a series of tweets that the version of the list published by The Intercept is not comprehensive. The list, he said, is constantly updated.
“Defining and identifying dangerous organizations globally is extremely difficult. There are no hard and fast definitions agreed upon,” he said. Fishman also pointed out that terrorist organizations such as the Islamic State group and al-Qaeda have hundreds of individual units, many of which are listed as separate items to “facilitate enforcement,” skewing the number of units from a particular region. The Tier 1 list, he said, includes more than 250 white supremacist organizations.
Facebook has faced pressure to be more transparent about its policies on dangerous individuals and organizations. In January, the Oversight Board, which reviews the social network’s content moderation decisions, overturned Facebook’s removal of a post the company had said violated that policy, noting that “the rules were not made clear enough to users.” The board also recommended that Facebook publish its list of dangerous organizations and individuals, or at least cite examples.
Fishman said Facebook has not shared the list “to limit legal risk, limit security risks and minimize groups’ ability to circumvent rules,” but that the company is seeking ways to improve the policy.