New Policy Spurred by Live Feed of Christchurch Shootings

In a move Facebook said was months in the making, the social media giant announced March 27 that it will crack down on white supremacist groups and hate speech.

“It’s clear that these concepts are deeply linked to organized hate groups and have no place on our services,” the company said in a statement.

Facebook said the new policy will also apply to Instagram. As a matter of community standards, the change goes further than the company's previous ban on white supremacist groups.

Louise Matsakis reported that in May 2018, someone leaked training documents explaining Facebook's policies on white supremacists, white nationalists and white separatists. White supremacists were banned over race-driven hatred, while Facebook treated the other two categories as less threatening. This did not sit well with civil rights advocates.

Almost all tech companies are facing increased pressure to curb the spread of white supremacist content on their platforms after they struggled to stop the Christchurch shooter’s live-streamed video from going viral. While online platforms have poured resources into stopping terrorist groups like ISIS and Al Qaeda from using their sites, historically, social media sites have treated white supremacist groups differently.

In a statement, Rashad Robinson, president of the civil rights group Color of Change, said Facebook’s move should encourage other platforms to “act urgently to stem the growth of white nationalist ideologies.”

While Twitter, YouTube, and other platforms have long had nonviolence policies, Facebook appears to be the first to determine that all three groups are hate-driven and have negatively influenced the opinions of internet users, especially young people searching for these groups online.

The new policy starts this Wednesday for U.S. users. Those who try to post or search for white nationalist or separatist content will instead see a pop-up directing them to the website of Life After Hate, a nonprofit founded in 2011. This is similar to a 2016 Google tactic that combated ad content related to ISIS by showing videos debunking the group. Life After Hate, founded by former extremists, provides information, videos and support to get people to reconsider their exploration of hate ideologies online. In a statement, the group said:

“Online radicalization is a process, not an outcome. Our goal is to insert ourselves in that continuum, so that our voice is there for people to consider as they explore extremist ideologies online.”