Facebook to Form Independent Body to Oversee Content Appeals

Facebook is creating an independent body to help it judge what content will be allowed on the platform.

The independent body will oversee content appeals, which is when you complain Facebook has wrongly taken down your post. Currently, the company's content moderators handle the process, but on Thursday Facebook CEO Mark Zuckerberg said it was time to incorporate outside experts.

"It will provide assurance that these decisions are made in the best interests of our community and not for commercial reasons," he said in a Facebook post.

Mark Zuckerberg, Facebook CEO

The company is still deciding how the independent body will work. But it'll function like a Supreme Court for content appeals in that it will pick which cases to hear. Zuckerberg wants to pilot the project in different regions of the world in the first half of 2019, with the goal of fully establishing the independent body by year's end.

"Over time, I believe this body will play an important role in our overall governance. Just as our board of directors is accountable to our shareholders, this body would be focused only on our community," he added.

The independent body is forming at a time when Facebook is trying to balance online freedom of expression with fighting hate speech. In July, Facebook's CEO tried to defend the company's policies amid criticism that it was failing to ban conspiracy sites from the platform.

Facebook Appeals Process

Zuckerberg also announced the news days after more than 70 civil society groups, including Human Rights Watch and the Electronic Frontier Foundation, called on Facebook to add more transparency to its content appeals process. Their main worry is that Facebook's systems can accidentally censor legitimate content, especially from disenfranchised groups.

In April, Facebook for the first time began giving users the right to appeal content takedowns on individual posts. But for now, you can only launch an appeal for content removed over nudity, hate speech, or graphic violence, which the groups say isn't enough.

"We know from years of research and documentation that human content moderators, as well as machine learning algorithms, are prone to error," the groups added in their letter to Facebook. "Even low error rates can result in millions of silenced users when operating at massive scale."


On Thursday, Zuckerberg acknowledged that the accidental content takedowns were a problem. "Today, depending on the type of content, our review teams make the wrong call in more than 1 out of every 10 cases," he added.

To reduce the error rates, Zuckerberg said his company is hiring more human reviewers to handle "more nuanced" appeal cases. The social network is also improving its AI algorithms to get smarter at filtering out prohibited content.

In addition, Zuckerberg promised that the company is working to expand the appeals process to cover any content Facebook takes down. "We're also working to provide more transparency into how policies were either violated or not," he added.

Source: https://www.pcmag.com/news/364988/facebook-to-form-independent-body-to-oversee-content-appeals
