As the ugly headlines keep coming for Facebook ( FB ), Mark Zuckerberg has proposed a new way to repair user trust: Let somebody else be the final judge of what gets to stay on the social network he founded.
Zuckerberg wrote in a lengthy Nov. 15 post that Facebook will create its own Supreme Court, “an independent body, whose decisions would be transparent and binding,” to judge company decisions about postings that allegedly violate its rules.
A subsequent month of bad news for the social network—made worse by two reports prepared for the Senate Select Committee on Intelligence about the success of Russian social media disinformation campaigns—has only made this goal of accountability more important.
It’s also offered new reminders of the difficulty Facebook will face in ensuring such an appeals process will work both fairly and at scale.
If it works, it could provide a useful example for other web platforms. If not? People will have yet another reason to give the network less of their time and attention.
Why Facebook needs this
The two new reports for the Senate Intelligence Committee, one from the research firm New Knowledge and the other from the University of Oxford’s Computational Propaganda Project, document how Facebook, as well as Google ( GOOG , GOOGL ) and Twitter ( TWTR ), ignored obvious signs of bad behavior during and after the 2016 elections.
The conclusion: Facebook needs to police its network better. But do you trust it to do that job fairly?
Only weeks ago, the company provided a near-perfect demonstration of the need for an appeals system when it took down a post by former Facebook employee Mark Luckie accusing the company of having a “black people problem.” Facebook then restored the post without a real explanation for either action.
“Further proves my point,” Luckie tweeted Dec. 4, after Facebook first said the post violated its community standards, then reversed course and said it did not.
Luckie—a veteran of Twitter as well as Facebook who has called out racial hang-ups across the tech industry—subsequently tweeted that Facebook hadn’t alerted him to either the takedown or its reversal.
He declined to discuss this further in an email, saying, “I’m stepping back from commenting on Facebook.”
My recent report about sketchy Facebook pages revealed another case of shifting standards. A reader pointed out four pages apparently run by the same operators as two political pages that Facebook had removed for pushing visitors to ad-saturated third-party sites.
The four new pages linked either to the same sites that had previously been taken down or to lookalikes. But Facebook spokesperson Devon Kearns said in an email that they were fine: “The pages you sent do not violate our policies.”
Justice wound up being served, sort of, because the operators of the new pages eventually removed them on their own.
Facebook’s trust problem
“Having the least trusted company in this entire space policing itself is probably not the wisest direction to go,” said Bill Galston, a senior fellow at the Brookings Institution and co-chair of the New Center. “There should be some sort of external appeals mechanism.”
The New Center, a group Galston founded with Weekly Standard founder and editor Bill Kristol, put out a policy paper in late November urging just that. The paper says users should be able to appeal a content takedown to “a human or panel of humans who weren’t involved in the original decision.”
Skepticism about such an appeals body’s independence is warranted. In his post, Zuckerberg compared the body’s role to that of Facebook’s board of directors—which has been infamously reluctant to discipline senior executives for the social network’s privacy failings.
That post also noted that Facebook has to make other basic structural choices, starting with how it would select people for the group, how people would petition it and how it would decide which cases to take.
In a talk at a conference on online-speech policies in Washington Friday, former U.S. deputy chief technology officer Nicole Wong, who previously served as a lawyer for Google and then Twitter, voiced unease about the scale of the job.
“They’ve got 30,000 people working on content issues,” she said. “How does a small panel or a court manage that?”
Wong also commented that one appeals body would struggle with the varying cultural contexts of Facebook’s 2.27 billion monthly users. “It’s really hard to capture global nuance when you’re a singular body,” she said.
Heather West, senior policy manager at Mozilla, also shared some skepticism about having a court of appeals.
“I don’t know that that’s the right model,” she said. “But there almost certainly should be some form of ombudsman.”
Take note, other social networks
Facebook isn’t the only tech giant guilty of poor or arbitrary judgment.
This summer, for example, cat-furniture entrepreneur Jackson Cunningham got an email from Airbnb informing him that the housing-rental site had banned him without explanation or recourse.
“This decision is irreversible and will affect any duplicated or future accounts,” that note read. “Please understand that we are not obligated to provide an explanation for the action taken against your account.”
After Cunningham blogged about the experience, Airbnb relented. He said in an email last week that Airbnb admitted its error “but wouldn't divulge any other details”; he can still only guess that one unhappy Airbnb host had lashed out at him.
Airbnb did not say whether it will continue to withhold explanations from customers it bans.
Apple ( AAPL ), in turn, has long reserved the right to refuse or remove iOS apps from its App Store for almost any reason—a level of strict oversight that China’s government has leveraged in requiring Apple to evict virtual-private-network security apps from the Chinese-market App Store.
For the App Store’s first couple of years, Apple didn’t even document these rules; when it finally posted them, it cautioned app developers not to “run to the press and trash us.”
But historically, the only way around arbitrary enforcement actions—from Apple, Airbnb, Facebook or any other giant tech company—has been to make a fuss in public.
That may be good for news sites that get to break these abuse-of-power stories, but it’s not good for people who don’t get media attention or the public at large.