Facebook takes its report moderation system more seriously


Facebook recently updated its content moderation queue system, aiming to significantly improve how the worst reports are surfaced. Prioritizing those cases, based on automated detection, should help slow the spread of harmful or violent content. With the improved report moderation system, Facebook says its human moderators are guided to the most serious cases first, which also optimizes their workload and speeds up review.

The new process uses improved machine learning to triage reported content. As The Verge explains, “In the past, [Facebook’s] moderators reviewed posts more or less chronologically, dealing with them in the order they were reported.” It added, “Now, Facebook says it wants to make sure the most important posts are seen first and is using machine learning to help.” The company plans to use “an amalgam of various machine learning algorithms” to sort the queue, prioritizing posts on three criteria: their severity, their virality, and their likelihood of breaking the rules.
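To make the mechanics concrete, here is a minimal sketch of how a prioritized review queue along those lines might work. Everything in it is an assumption for illustration: the weights, the score values, and the names (`ReportedPost`, `priority_score`) are invented, not Facebook's actual system.

```python
import heapq
from dataclasses import dataclass, field


@dataclass(order=True)
class ReportedPost:
    # Stored as a negative score so the highest-priority post is
    # popped first (heapq is a min-heap).
    priority: float
    post_id: str = field(compare=False)


def priority_score(severity: float, virality: float, violation_prob: float) -> float:
    """Combine the three hypothetical model scores (each in [0, 1])
    into one ranking value. The weights here are made up; a real
    system would tune or learn them."""
    return 0.5 * severity + 0.3 * virality + 0.2 * violation_prob


queue: list[ReportedPost] = []

# Push reports as model scores arrive (scores are invented examples).
heapq.heappush(queue, ReportedPost(-priority_score(0.9, 0.7, 0.8), "post-123"))
heapq.heappush(queue, ReportedPost(-priority_score(0.2, 0.1, 0.4), "post-456"))

# Moderators now review the most urgent report first, not the oldest.
worst_first = heapq.heappop(queue)
print(worst_first.post_id)  # post-123
```

The key design point the article describes is exactly this swap: replacing first-in, first-out review with a scored priority queue, so severe, fast-spreading, likely-violating posts jump the line.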

An automated system is unlikely to order the queue with perfect accuracy, but even an imperfect ranking should be better than strict chronological review. Factoring in ‘virality’ is a particularly significant improvement: it weighs a post’s potential reach, drawing on the posting user’s history, following, and similar signals.
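For the virality signal specifically, a back-of-the-envelope reach estimate might look like the sketch below. The inputs and formula are assumptions for illustration only; Facebook has not published how it computes this.

```python
import math


def estimated_virality(follower_count: int, avg_shares_per_post: float) -> float:
    """Hypothetical virality score in [0, 1): grows with the poster's
    audience size and typical share rate, compressed with a log so a
    handful of mega-accounts don't dominate the scale."""
    raw_reach = follower_count * (1.0 + avg_shares_per_post)
    return 1.0 - 1.0 / (1.0 + math.log1p(raw_reach))
```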

Prioritizing harmful content became a pressing issue after the ‘Plandemic’ conspiracy-theory video racked up 2 million views. The video surfaced in May and remained up until the company removed it in July. Facebook conceded that it “took longer than it should have” to take down another COVID-19 conspiracy-laden video. Because the updated report moderation system can score content by ‘severity,’ it should be markedly better at addressing the worst content first.

Facebook recently said that roughly 99.5% of its enforcement actions on graphic and violent content were taken before any user reported it. The company now expects the combination of the new updates and its older detection systems to filter content more accurately, with clear benefits for user safety as well.
