Facebook to use AI, ML to stop the spread of revenge porn on its platform

Facebook will be implementing new detection technology to help curb the spread of revenge porn, it announced on Friday.

“To protect victims, it’s long been our policy to remove non-consensual intimate images (sometimes referred to as revenge porn) when they’re reported to us,” the company said on its blog.

The new detection technology will use artificial intelligence (AI) and machine learning (ML) to find this content proactively, rather than relying on user reports alone.

“This means we can find this content before anyone reports it, which is important for two reasons: often victims are afraid of retribution so they are reluctant to report the content themselves or are unaware the content has been shared,” the company explained.

Detected content will be reviewed by a “specially-trained member” of Facebook’s Community Operations team, who will decide whether the photos or videos should be removed.

The company is also launching a support hub for victims called Not Without My Consent.

“Here victims can find organizations and resources to support them, including steps they can take to remove the content from our platform and prevent it from being shared further,” Facebook said.

The move certainly suggests the company is working toward a more ‘privacy-focused’ platform.

Feature image: Facebook
