
Opinion | Facebook Thought It Was Solving a Problem. It Just Got Handed a Bigger One.

The oversight board made clear that the company can’t shirk its responsibility to set and enforce rules on speech.


The long-awaited decision on Donald Trump by Facebook’s independent oversight board turned out to be just a passing of the buck. The board, an entity set up by Facebook in 2020 to review disputed content moderation decisions selected by the company and submitted by the public, handed down its highest-profile judgment on Wednesday, agreeing to keep former President Donald Trump off the social media platform. But it kicked the ultimate decision back to the company, criticizing the original “indefinite” suspension as arbitrary and demanding that Facebook revisit it within six months.

For Trump critics, seeing him denied a platform may come as a relief, if only a temporary one. The board’s decision—which came after a lengthy public comment period during which it received more than 9,000 submissions from all over the world—deserves credit for its attention to human rights principles. It also notably cut against what Facebook executives have stated in the past, arguing that “heads of state and other high officials of government can have a greater power to cause harm than other people.”

And yet, with Trump’s ban upheld, the oversight board sent a message to Facebook that its rules ought to be enforced across the board, even and perhaps especially when broken by world leaders. The decision also makes clear that it’s time for Facebook to contend more seriously with its own policies and supposed values.

For the past decade, I’ve studied the impact of what I refer to as “platform censorship”—that is, the effect that Silicon Valley’s tech platforms have on our free expression. I know, of course, that it is well within the platforms’ First Amendment right to curate their own spaces as they see fit and that Section 230 of the Communications Decency Act protects these companies from liability for what they choose to leave up or take down. From a purely American legal perspective, this is not censorship.

But in a functional, working sense, it is a kind of censorship. Facebook plays host to around 2.7 billion users—nearly twice the population of China—and owns numerous entities across the web, including Instagram and WhatsApp, all of which makes it a formidable authority over the world’s expression. Turning over power to an unaccountable entity to restrict what we can say or what information we can access sure feels like censorship, especially when its reach extends beyond the U.S.

The company regularly denies, sometimes permanently, a voice to all kinds of people—from ordinary users to political activists—with little fanfare and few consequences. Rarely do these individuals have the opportunity to appeal to a person, much less a board, even when the decision is made in error. The effect of such decisions should not be underestimated: For better or worse, Facebook and its products are core to how many people around the world experience the internet. Losing access can deeply affect one’s ability to communicate with others or stay in touch with distant friends and family, and it can even have professionally devastating consequences.

Trump’s ban is not the first time a social media company has denied access to a politician: Twitter removed white supremacist congressional candidate Paul Nehlen, and Facebook blocked the accounts of top Burmese military brass who engaged in hate speech against the Rohingya community and temporarily suspended Venezuelan leader Nicolás Maduro. Until now, though, the company has frequently sided with authoritarian states over the needs and rights of people around the world, on the grounds that keeping its product available to all matters more than taking a principled stance on freedom of expression.

In recent years, a growing divide has emerged between those who want these platforms to engage in more content moderation and those who believe the companies should take a step back. While in the U.S. this is often presented as a partisan struggle, it in fact crosses all kinds of borders and boundaries and makes clear that it’s time to rethink speech governance for the 21st century.

Since the advent of the commercial web, the status quo has been a hodgepodge of U.S. law and self-regulatory practices (based on U.S. speech norms) to which the entire world is subject. But this arrangement has never worked; an international platform like Facebook needs to take into account the needs of its worldwide user base. This means relying on existing international standards—codified in the International Covenant on Civil and Political Rights—and responding to the changing global landscape and the needs of its users through a process of truly inclusive policymaking.

This is, of course, why the oversight board was created, and thus far, the board has demonstrated itself to be a strong corrective force for a company that has always put profit before people—and for that matter, before principles.

And yet, we should take care not to see the oversight board as the ultimate answer to these questions. We need to think about what it means, more broadly, to give corporations this much power over state leaders and elected officials. The rules regarding what we can say and what information we can access are no longer a creation “of the people” but the decisions of an unelected few.

So while the oversight board rightfully acknowledges the complexity of the Trump case, argues for the consistent application of the rules to all users, and acknowledges that speech from public figures has a greater impact than that of ordinary individuals—things civil society has long argued—it remains a stopgap measure at a time when the company can’t be trusted to act responsibly. Real progress will occur only when Facebook takes human rights into account throughout the entirety of its operations.