Independent body to moderate content at Facebook

The first note published by Zuckerberg in September covered how Facebook was combating election interference. It repeatedly returned to the question of how the platform should define what content is and is not allowed on Facebook.

Regulating content has been a constant struggle for Facebook, even with both human reviewers and artificial intelligence. But one of the most significant recent announcements has been the creation of an independent body to review content decisions.

In the words of Mark Zuckerberg, the plan is to create a new way for people to appeal content decisions to an independent body. That body's decisions would be transparent and binding, upholding the principle of giving people a voice while recognizing the reality of keeping people safe.

Also, according to the social media platform's CEO, the independent body will play an important role in Facebook's overall governance and will be focused solely on the community.

The post appeared earlier this week, shortly after The New York Times suggested that Facebook was aware of a Russian campaign designed to influence the 2016 US presidential election as early as the spring of 2016. Facebook, however, denied the claims.

The CEO reported that the team involved in Facebook's policy process and the enforcement of those policies numbers around 30,000 people, including content reviewers. In total, more than two million pieces of content are reviewed every day, with a team of 200 people working specifically on counter-terrorism.

Facebook has admitted that it was too slow to address the issues it was grappling with, but the company has now taken a strong initiative to eradicate the rot it allowed in the first place.