Here’s how Facebook plans to make final decisions about controversial content it’s taken down

Facebook announced new details today about its independent oversight board, which will make decisions about what kind of content can and can’t remain on the platform. | Aurelien Meunier/Getty Images

The company announced new details Tuesday about how what’s been called its “Supreme Court” for content moderation will work.

For years, Facebook has had a content moderation problem. It’s struggled to make unpopular decisions about content, like whether it should take down a viral doctored video of House Speaker Nancy Pelosi that made it appear as though she was slurring her words (it didn’t), or whether it should ban prominent conspiracy theorist Alex Jones (it did). And as the 2020 presidential election gets closer, the company is attempting to do a better job than it did in 2016, when it was accused of allowing misinformation on its platform to influence the democratic process. That’s why Facebook will soon outsource some of its toughest content moderation problems to a new, much-anticipated but not yet fully formed outside group: its independent oversight board.

Facebook on Tuesday revealed new details about the board, which will evaluate how the company handles controversial posts on its platform and could influence its takedown policies around contentious topics like hate speech, nudity, and misinformation. Eventually, the board could also have the power to overrule Facebook’s controversial ad policy that allows politicians to make false statements in political ads, a policy that has lately drawn heated criticism, including for letting a false ad about former Vice President Joe Biden remain on the site. For now, though, there are few details on exactly when or how that might happen, and Facebook says reviewing the ad policy isn’t in the initial scope of the first set of cases the board will take on.

Currently, if you object to Facebook’s decision to take down content you posted, you can appeal the decision to Facebook’s community moderation team, but after that, there’s no way to dispute the decision further. In the future, the company says, Facebook and Instagram users will be able to petition the independent oversight board, which will have final say on the matter if it decides to take on a user’s case. People who care about the future of free speech online have been closely watching to see what Facebook does with its board, because it could shape what is and isn’t allowed on the platform for years to come.

And now, more than a year after CEO Mark Zuckerberg initially announced the idea of an oversight board, Facebook has released a series of proposed rules about how it will all work. That includes a process for users to submit appeals, an outline of how the board will decide on cases, and a mandate requiring Facebook to implement the board’s decisions on individual posts within seven days, unless the post violates local law. All in all, once the board takes up an appeal, a decision should come within about 90 days, Facebook says. In cases that warrant faster action, Facebook can expedite the process, and the board should then take no longer than 30 days to reach a decision.

We still don’t know arguably the most important information about the board — who will be on it — but Facebook expects to announce its first three co-chairs in the coming months. The company’s director of governance and strategic initiatives, Brent Harris, said on a press call on Tuesday morning that the board’s members will have a “diverse set of opinions.”

Harris called the board a “new mechanism that is independent, beyond the walls of Silicon Valley, and enables review against Facebook’s stated principles.” If the board chooses, it can amend any of the bylaws that Facebook proposed Tuesday, which it says it suggested for the sake of getting the board up and running.

Several experts Recode spoke with called the updates a step in the right direction for greater transparency but said that the project’s success will depend on how much the company actually listens to this new governing body. Under the proposed rules, Facebook will be required to follow the board’s decisions when it rules that the company should not have taken down content. But for broader policy decisions, Facebook will only take guidance, not mandates, from the board.

Ultimately, how the board ends up functioning could affect the day-to-day communications of Facebook’s more than 2 billion users around the world. It could turn out to be a success — or a failed experiment in wrangling the social network’s gargantuan problem of dealing with controversial speech.

“Content moderation is really hard — and these are really consequential decisions about public discourse and human rights,” said Evelyn Douek, a doctoral student at Harvard Law School researching the regulation of online speech. Douek, along with many other academics, gave Facebook early feedback on a draft of its new rules. “While it’s unacceptable to be having completely unaccountable and private entities making decisions, we also don’t want governments having their hands all over it. … So the oversight board is a new model for handling this.”

A brief history of Facebook’s oversight board

Zuckerberg announced the idea for an oversight board in 2018, as the company was facing intense criticism from both liberal and conservative politicians in the US over how it moderates content. At the time, Zuckerberg wrote that the company “should not make so many important decisions about free expression and safety” on its own, and that it would instead assign responsibility for some of the toughest decisions to an independent body.

So far, setting up the board has been a relatively slow process for a company whose famous tagline used to be “move fast and break things.” Facebook says that’s because it wants to be sure it’s getting the setup right.

What we do know is that the board will be funded by a separate trust, in which Facebook plans to invest more than $130 million. On Tuesday, Facebook also announced that it hired the board’s administrative director, Thomas Hughes, the former executive director of Article 19, a freedom of expression and digital rights nonprofit. Hughes will oversee administrative staff but will not make content decisions.

The board plans to hear its first cases in the first half of 2020, ahead of the US presidential election. Presumably, by that timeline, it will have to hire its members by July.

Remaining questions

The most obvious question about the board following Tuesday’s announcement is who will be on it.

Beyond that, experts wonder how many cases the board will take on at a time, exactly what kinds of cases it will consider, and how broadly its rulings will be applied within the company. Facebook’s Harris said on Tuesday’s press call that he expects the board will take on “dozens” of cases initially, which would be only a tiny fraction of the total volume of posts on Facebook.

“We don’t know what kinds of cases we’ll hear — the board might spend all of its time on hate speech or nudity,” said Douek. “We don’t know what the board will actually decide to do with its power.” She said it’s actually a good thing for the board to have discretion, if it’s going to be a truly independent body from Facebook as it hopes to be.

Douek also pointed out that, for now, Facebook will limit users’ appeals to content they think was incorrectly taken off the platform. That leaves out complaints from users who are upset that content remains on the platform (like hate speech, violence, or misinformation). In the future, Facebook plans to expand its appeals process to let users dispute content that was left up, but it hasn’t said when.

Another big question is how narrowly Facebook plans to interpret the board’s decisions.

If Facebook only plans to take down, or leave up, a handful of posts that the board specifically rules on, it could make only a minor dent in the overall landscape of misinformation on the social media platform, some experts say. Dipayan Ghosh, who conducts economic and technology policy research at the Harvard Kennedy School, said that “the devil will really be in the details” of implementation, and that for now, these announcements help contain the criticism from the community of academics, policy experts, and regulators who are plugged into speech issues on Facebook.

“If Facebook didn’t do this, there would be more calls and pressure on the company to do something,” said Ghosh. “The announcement and gradual development of this board is coming at a time when the company needs it to continue to succeed and survive.”

Author: Shirin Ghaffary
