Facebook allowed various neo-Nazi groups to remain on its platform, saying they “do not violate community standards”, according to recent reporting from The Independent.
The Counter Extremism Project, a nonprofit combating extremist groups, reported 35 pages to Facebook, according to The Independent. Although the company said it would remove six of them, the other requests were met with this response:
“We looked over the page you reported, and though it doesn’t go against one of our specific community standards, we understand that the page or something shared on it may still be offensive to you and others.”
The groups reported included international white supremacist organizations, with many making racist or homophobic statements. Some groups also had images of Adolf Hitler and other fascist symbols.
Although this is particularly troubling in the wake of the Christchurch shooting, which was broadcast on Facebook Live, it is a long-standing issue for Facebook. The platform is notorious for allowing hate speech to flourish while applying its own community standards poorly.
At first glance, Facebook’s definition of “hate speech” seems fine. Under its guidelines, Facebook bans hate speech that “directly attacks people based on their race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender, or gender identity, or, serious disabilities or diseases.”
However, Facebook ignores power imbalances in determining what’s hate speech.
On Facebook, for example, users can be banned for saying “men are trash”. But given the power imbalance between men and women in a patriarchal society, saying “men are trash” may hurt someone’s feelings; it doesn’t harm men on a societal level.
But while it was banning users for “men are trash” comments, Facebook took far longer to ban Alex Jones, the InfoWars host who incited harassment and spread misinformation, as Mic reported.
Along with ignoring power imbalances, Facebook has also, intentionally or not, found a way to monetize hate.
Earlier this year, The Los Angeles Times reported that Facebook allowed advertisers to target users based on their interest in Nazis. Advertisers were able to home in on topics like “Josef Mengele” and “Heinrich Himmler”.
By allowing advertisers to target people based on their interest in Nazism, Facebook essentially turned a violent ideology — one that has led to actual genocide — into a source of profit. Doing so undercuts any incentive to act proactively against the kind of violent speech that has real consequences for oppressed people.
Facebook is now under considerable pressure, especially from New Zealand’s Prime Minister Jacinda Ardern, who has remained unimpressed by the company’s responses to the broadcast of the Christchurch shooting on its platform.
By allowing neo-Nazis and other hate groups to remain on its site, and even to pay for advertising, Facebook planted the online seeds that allowed something like the broadcast of the Christchurch shooting to happen on its platform.
Nothing occurs in a vacuum. Increasing Islamophobic rhetoric from all major political parties made Muslims an easy target. But it’s online platforms like Facebook, telling users that neo-Nazis don’t violate community standards, that help embolden their actions.