Facebook in spotlight again following US shooting


Facebook’s moderation policies are once again in the spotlight after a shooting at a protest in Kenosha, Wisconsin.

Earlier this month, Facebook expanded its definition of dangerous groups to include militia groups promoting violence.

However, when the Kenosha Guard launched an event urging residents to “take up arms” against protesters, Facebook did not act, The Verge reports.

The publication confirmed that the page had been reported at least twice over the event, but Facebook responded that it did not violate its community guidelines.

Following the shooting, which killed two people and injured another, Facebook took action.

“It wasn’t until Wednesday morning, more than nine hours after the shooting took place, that Kenosha Guard was cited by Facebook as violating the Dangerous Individuals and Organizations policy and removed,” The Verge notes.

The shooter has not been definitively linked to the Facebook event, but the 17-year-old suspect, Kyle Rittenhouse, echoed the event's aim.

The New York Times notes that, in an interview with The Daily Caller before the shooting, Rittenhouse said he was there to protect businesses. He can be seen armed with a semi-automatic rifle in the interview.

Rittenhouse does not live in Kenosha, but travelled to the site of the protests, according to reports.

Protests started earlier this week after the police shooting of Jacob Blake. A police officer shot Blake seven times in the back in front of his children.

Facebook responds to inaction on militia reports

In response to questions about the dismissed reports, a Facebook spokesperson told Business Insider that the reports had not reached the company's specialised team dedicated to moderating militia-related content.

“This work is done by our specialised team whose primary role is to enforce our dangerous organisations policy and who specifically enforces this new policy update,” the spokesperson told the publication.

“This team continues studying terminology and symbolism used by these organisations to identify the language used that indicates violence so we can take action accordingly.”

A history of moderation failures and violence

Facebook has not found a link between the event and Rittenhouse's Facebook and Instagram profiles. But he and other armed individuals appeared at the protests with the same aim: to take up arms and protect property from protesters.

Events leading up to the shooting are still under investigation. However, the social media platform now designates the shooting as a mass murder.

The incident once again shines a spotlight on Facebook's moderation of content that violates its community guidelines, even when the company rolls out policies specifically aimed at addressing these violations.

In 2018, Facebook admitted that its platform was used to spread hate speech and incite genocide in Myanmar. Its statement followed an independent human rights impact assessment of Facebook's role in the violence.

“The report concludes that, prior to this year, we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence,” Facebook said in a statement at the time.

In July, Facebook faced an advertising boycott that drew attention to the platform's lax moderation of hate speech.

In recent weeks, even the company’s own employees have questioned Facebook’s moderation policy.

Critics also accuse Facebook of ignoring anti-Muslim hate speech by politicians linked to India's ruling party.

Feature image: Facebook

Megan Ellis