Facebook on Monday announced its intention to regulate live videos posted across its platforms.
This comes after a terrorist attack in Christchurch, New Zealand in March was streamed live via Facebook.
“In the wake of the terrorist attack, we are taking three steps: strengthening the rules for using Facebook Live, taking further steps to address hate on our platforms, and supporting the New Zealand community,” the company said in a blog post.
According to Facebook, it will be “exploring” ways to regulate and restrict live videos and “investing” in better technology to monitor activity.
Though the company removed the original video soon after the attack, it still had to contend with edited copies spread by other users.
“In the past week, we have also made changes to our review process to help us improve our response time to videos like this in the future,” the company added.
Last week, Facebook also announced that it would ban white nationalist and supremacist content.
Details of the coming changes were not specified.
“We are deeply committed to strengthening our policies, improving our technology and working with experts to keep Facebook safe,” the social network concluded.
Feature image: Memeburn