YouTube’s machines are better at fighting terrorism than humans


YouTube this week took to its blog to update the internet on how it has been combating extremist content on its platform over the past month.

Part of this strategy — outlined at the end of June — included partnering with the likes of Microsoft, Facebook and Twitter to accelerate efforts to nullify online extremism.


Beyond this, the company has also implemented a few technical and policy changes.

“We’ve always used a mix of technology and human review to address the ever-changing challenges around controversial content on YouTube,” it writes.

“We recently began developing and implementing cutting-edge machine learning technology designed to help us identify and remove violent extremism and terrorism-related content in a scalable way.”

YouTube says that over 75% of the videos it removed for violent extremism in the past month were taken down by this machine learning technology before receiving a single human flag.
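YouTube hasn't detailed how the system works, but the "mix of technology and human review" it describes broadly resembles a confidence-based triage pipeline: near-certain classifier hits are removed automatically, while borderline cases are routed to human reviewers. Here's a minimal Python sketch of that idea; the names and thresholds are entirely hypothetical and nothing here reflects YouTube's actual implementation:

```python
from dataclasses import dataclass

# Hypothetical thresholds -- YouTube's actual values are not public.
AUTO_REMOVE_THRESHOLD = 0.95   # classifier is near-certain: remove immediately
HUMAN_REVIEW_THRESHOLD = 0.60  # uncertain band: route to a human reviewer

@dataclass
class Video:
    video_id: str
    extremism_score: float  # 0.0-1.0 score from some upstream classifier

def triage(video: Video) -> str:
    """Route a video based on the classifier's confidence.

    High-confidence hits are removed without waiting for user flags;
    borderline cases go to human review; everything else stays up.
    """
    if video.extremism_score >= AUTO_REMOVE_THRESHOLD:
        return "remove"
    if video.extremism_score >= HUMAN_REVIEW_THRESHOLD:
        return "human_review"
    return "no_action"

if __name__ == "__main__":
    for v in [Video("a1", 0.98), Video("b2", 0.72), Video("c3", 0.10)]:
        print(v.video_id, "->", triage(v))
```

The two-threshold band is what lets a system like this take content down "before receiving a single human flag" while still keeping humans in the loop for ambiguous cases.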

In addition to speed, YouTube also suggests that the system’s accuracy has improved.


“While these tools aren’t perfect, and aren’t right for every setting, in many cases our systems have proven more accurate than humans at flagging videos that need to be removed,” it adds.

Finally, it also claims that its machine learning improvements have “more than doubled both the number of videos we’ve removed for violent extremism, as well as the rate at which we’ve taken this kind of content down”.

YouTube’s plan to tackle online extremism isn’t all rooted in machines, though.

It reaffirmed that more people are needed to enforce policy changes.

“We are also hiring more people to help review and enforce our policies,” it writes, also suggesting that it has “begun working with more than 15 additional expert NGOs and institutions”.

Additionally, the company is planning more stringent screening processes.

“We’ll soon be applying tougher treatment to videos that aren’t illegal but have been flagged by users as potential violations of our policies on hate speech and violent extremism,” the company notes.

“If we find that these videos don’t violate our policies but contain controversial religious or supremacist content, they will be placed in a limited state. The videos will remain on YouTube behind an interstitial, won’t be recommended, won’t be monetised, and won’t have key features including comments, suggested videos, and likes.”
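In engineering terms, the "limited state" YouTube describes amounts to a per-video bundle of disabled features rather than a removal. A rough sketch of how such a state could be modelled, where the field names are assumptions for illustration and not YouTube's actual schema:

```python
from dataclasses import dataclass

@dataclass
class PlaybackPolicy:
    """Feature flags for one video, mirroring the restrictions YouTube lists."""
    show_interstitial: bool = False  # warning screen shown before playback
    recommendable: bool = True       # eligible for recommendations / suggested videos
    monetised: bool = True
    comments_enabled: bool = True
    likes_enabled: bool = True

def apply_limited_state(policy: PlaybackPolicy) -> PlaybackPolicy:
    """Apply the limited state: the video stays up behind an interstitial
    but loses recommendations, monetisation, comments, and likes."""
    policy.show_interstitial = True
    policy.recommendable = False
    policy.monetised = False
    policy.comments_enabled = False
    policy.likes_enabled = False
    return policy
```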

These changes will roll out on desktop in the coming weeks, with mobile versions of YouTube to follow suit at a later date.
