Who’s up for the job of decontaminating Facebook?

By Kara Swisher

SO, BIG SURPRISE, I have not been asked to be on Facebook’s Supreme Court of content. I was all ready to do an anti-Sherman if called: I will accept if nominated and will serve if elected.

Half of its members were finally announced on Wednesday morning, including four co-chairs, one of whom is Helle Thorning-Schmidt, a former prime minister of Denmark. She is clearly aces in terms of reputation and credibility, one of a slate of 20 members who scream global, fancy résumés, diverse and politically balanced.

Together, the members of this independent organization, which is funded by the social media giant through a trust it cannot mess with, will judge appeals from users on material that has been taken down from the platform by the company, and it will review policy decisions that the company has submitted to the board.

The group selected so far — there are 20 more names to come — is qualified to do all that and a bag of chips. There is a former judge and vice-president of the European Court of Human Rights (Andras Sajo), the former editor in chief of The Guardian (Alan Rusbridger), a Nobel Peace Prize recipient who promoted free speech in Yemen during the Arab Spring (Tawakkul Karman), a vice-chancellor of the National Law School of India University (Sudhir Krishnaswamy), the former director general of the Israeli Ministry of Justice (Emi Palmor) and the leader of Africa’s Internet Without Borders (Julie Owono).

Impressively impressive no doubt, and designed to be that way, which is why it is also nonoffensively nonoffensive.

As yet, there are no loudmouths, no cranky people and, most important, no one truly affected by the dangerous side of Facebook. I asked in a press call on Wednesday morning, for example, why there were no board members like the parents of the Sandy Hook victims, who were terrorized by the conspiracy theorist Alex Jones on the platform until he was finally tossed off. I also asked whether we could find out who turned down an offer to be on the oversight board.

I was given appropriate nonanswers to both those questions, along the lines of we’ll see and there are privacy concerns naming those who said no thanks. Which is why the rollout left me with the nagging feeling — especially after looking over the complex setup that is planned to hear cases — that the oversight board has all the hallmarks of the United Nations, except potentially much less effective.

I am not trying to be glib here, because solving the problem of how to deal with speech across the largest and most unwieldy communications platform in human history is not for the faint of heart. It may be beyond the capabilities of anyone: I am not sure that any deliberative body would be capable of doing so, given that Facebook and its founder and chief executive, Mark Zuckerberg, have purposefully created a system that is ungovernable.

With no real checks and balances built in from the start, the product has become a free-for-all that stresses engagement over context, speed over reflection and viral growth above all.

It’s ironic, of course, that right now the idea of internet virality is much less of a positive thing as COVID-19 rages across the world. But acting like a virus — Facebook spreads and spreads and spreads exponentially — has been the thing that the company has excelled at compared with other social media companies. Its viral power is what has made it one of the most lucrative and successful organizations on the planet.

Which is to say in the simplest of terms that it is built that way, and Facebook’s problems are structural in nature. It is evolving precisely as it was designed to, much the same way the coronavirus is doing what it is meant to do. And that becomes a problem when some of what flows through the Facebook system — let’s be fair in saying that much of it is entirely benign and anodyne — leads to dangerous and even deadly outcomes.

That is what makes the job for these board members such an overwhelming challenge, since they are charged with dealing with issues on a case-by-case basis. To my mind, that is trying to push back the ocean with one hand.

And perhaps that is why one of its co-chairs, Michael McConnell, called for “patience” as the board developed. A former US federal circuit judge who is also a constitutional law professor at Stanford Law School, a religious-freedom and First Amendment expert and a US Supreme Court advocate, Mr. McConnell seemed the most canny of the bunch when describing its role.

“We are not the internet police,” he noted flatly, discussing its role in hearing appeals and being deliberative. But that is perhaps the problem, in that we now have a kind of court, but without the real-time protection of cops where and when most of the damage is done.

Last fall, I wrote a column on the oversight board, hoping that there would be some way to fix this. “Facebook has presented many ideas over the years to limit the toxicity — problematic artificial intelligence monitoring or more troubled systems that employ overwhelmed human moderators or even a blog effort in 2017 called ‘Hard Questions,’ to name a few,” I wrote. “But this latest effort is intriguing — and even laudable. So while the possibility of its becoming a giant PR Potemkin village to assuage critics is very high, it deserves public support.”

I still think that, but I am also less sure that a very smart group of qualified and thoughtful people, under pressure from a fire hose of all the awful issues of our digital lives, can actually rein in what Facebook has wrought.

“We won’t please everyone,” Mr. McConnell also noted. Indeed not.

Kara Swisher, editor at large for the technology news website Recode and producer of the Recode Decode podcast and the Code Conference, is a contributing Opinion writer for The New York Times.