Open Forum: Facebook’s empty action on content moderation


In today’s connected world, it is imperative to the health of our democracy that we strike the right balance on online free speech.

The First Amendment makes it clear that, beyond setting and enforcing broad guideposts, the government should not be policing, regulating, or governing online content.

On the other hand, rules and guidelines for moderating content are both necessary and important. Lawless online platforms turn toxic extremely quickly.

That doesn’t mean we should blindly and fully entrust that regulating power to companies that have for so long treated content moderation as an afterthought rather than an integral part of their business practices. Their proposals give us reason to suspect they haven’t learned their lesson.

Facebook’s proposed external oversight board is its version of a content “Supreme Court” with independent experts weighing in as the last stop for users in a long appeals process for content removal decisions.

Creating the charter and vision for the board took an immense amount of time, money and effort from numerous teams of talented and dedicated people, along with worldwide consultations with the public and with experts. (Full disclosure: the author was part of a small group of experts engaged with an earlier version of the charter.) It sounds like a significant act of penance by a company defined by privacy, security, and content scandals.

But in its final charter, released last week, Facebook clarified the concept behind the board and the true extent of its power. The document makes practically no promise that the board’s input will trigger changes to the company’s content guidelines, the fundamental document against which the board adjudicates cases. Facebook and its executives get to claim clean hands while holding the smoking gun.

No tech company is excited about making tough, nuanced, and potentially damaging decisions about whether a piece of content should be taken down from its platform. Facebook’s move to externalize only this part of its process should turn even proponents of private governance into cautious skeptics.

Detractors are liable to deem the board an unaccountable parallel private judiciary, set up by a corporation yet adjudicating people’s fundamental rights. Never mind that the board will have limited scope and minimal sway over the content guidelines, the crucial part of the content moderation process. Regardless of what Facebook-friendly subject-matter experts may half-heartedly argue, the board will mostly serve as cover from political fire.

Perceived liberal bias aside, Facebook is at its core a corporation whose fundamental responsibility is to make money for its shareholders, and it has done so impressively thus far. It can claim to pursue high-minded values and sustainable goals, but all of that goes out the window the moment shareholders feel their money isn’t making them more money.

The dirty secret of the political show surrounding American tech companies is that it’s all meaningless theater. People like Sen. Josh Hawley, R-Mo., will propose outlandish bills that wouldn’t survive a constitutionality test administered by a first-year law student. In pursuit of positive PR, companies like Facebook will parade their staff and leaders around congressional offices, continuously proposing expensive alternative solutions. It’s an intricate dance of smoke and mirrors that, in the end, leads to no real solutions.

All is not lost, and we should not hastily turn to government censorship or EU-style speech regulation if Facebook’s idea, as is likely, fails to address burning content moderation issues from “shadow-banning” to “deplatforming.” Without asking legislators to overstep their constitutional authority, and without wallowing in the untenable status quo, we can draw on plenty of structures, tools and knowledge to help solve content moderation.

Along with Danielle Tomson, I proposed an idea last year that would give the public a voice on specific issues and incorporate that voice substantively throughout the whole process, not just in the final adjudication phase. Another approach is councils composed of experts and laypeople, guided by international human rights standards. Stanford’s experiment on Social Media Councils with United Nations Special Rapporteur David Kaye, among others, shows that this idea can bring merit and legitimacy to efforts to solve the problem.

Those hoping for a novel way to tackle this problem aren’t certain what the solution is. What is clear is that solving content moderation will not come from pretending a private company is a nation-state underpinned by an illegitimate and functionally impotent pseudo-judiciary.

David Morar is a visiting scholar at the Digital Trade and Data Governance Hub at George Washington University’s Elliott School.




