YouTube now removes more clips of violence, porn before anyone sees them


YouTube couldn’t wait to tell people just how many videos of extremism and adult entertainment it removed from its platform late last year.

Yeah, talk about an odd brand of excitement.

Following its announcement earlier this year that it would clean up its community, YouTube revealed in its new Community Guidelines quarterly report that some 8-million videos were removed from the platform between October and December last year.

“The majority of these 8-million videos were mostly spam or people attempting to upload adult content — and represent a fraction of a percent of YouTube’s total views during this time period,” the company wrote.

Additionally, 6.7-million of those videos were flagged by machines, and 76% of them were removed before any YouTube user could view them.

YouTube considers this a total triumph.

“For example, at the beginning of 2017, 8% of the videos flagged and removed for violent extremism were taken down with fewer than 10 views. We introduced machine learning flagging in June 2017,” it noted.

“Now more than half of the videos we remove for violent extremism have fewer than 10 views.”

All in all, the company largely attributes this to machine learning.

“Deploying machine learning actually means more people reviewing content, not fewer. Our systems rely on human review to assess whether content violates our policies,” the company added.

Feature image: Christian Wiediger via Unsplash

Andy Walker, former editor

