Facebook is better without Trump

Greg Bensinger
Published: 19 March 2021, 10:03 PM
Updated: 19 March 2021, 10:03 PM

If you are a public official, there's no more effective or efficient place to lie than on Facebook. It's company policy — meaning the policy of the chief executive, Mark Zuckerberg — to roll out the red carpet to all manner of political falsehood and obfuscation.

Zuckerberg has said that it's not the company's job to "be arbiters of truth" and that letting posts from well-known people stand allows the public to make informed decisions. Yet every day Facebook blocks or deletes posts from Average Joes who violate its policies, including by propagating untruths and hateful speech.

Facebook made the right decision to indefinitely ban Donald Trump from contributing to the site following his dangerous (and policy-violating) posts inciting January's terrifying blitz on the Capitol.

The company's outside oversight board — a hand-picked, global set of scholars, journalists, politicians and other luminaries — is reviewing the suspension and will rule in the coming weeks. The board should uphold the decision to keep Trump off the site.

If the oversight board were to restore Trump's account, it would stand as an affirmation of Facebook's self-serving policies permitting the most divisive and engaging content to remain, and as a clarion call to leaders like Rodrigo Duterte and Jair Bolsonaro, who have similarly peddled misinformation, to keep on posting.

"Facebook created this engine of amplification. They know exactly how widely these posts can spread and why they should stand in the way," said Ryan Calo, a University of Washington law professor. "When people violate their rules, they should all be held to the same standards."

In other words, when rules are enforced inconsistently, why should anyone respect them?

It's not as if Facebook didn't have ample evidence that its site could and would be used to incite real-world violence. Left to its own devices, the company allowed bigoted and provocative posts to remain, such as Trump's threat to protesters after George Floyd's death that "when the looting starts, the shooting starts," an external audit found.

Even two years into Trump's term, Facebook admitted it hadn't done enough to prevent its site from being used "to foment division and incite offline violence." But nothing much changed.

So, Trump most likely felt emboldened after spending years flouting Facebook's rules about election misinformation, the pandemic and the glorification of violence with only feeble blowback from the company. In just his final year in office, roughly a quarter of his 6,081 posts contained misinformation, lies or harmful rhetoric, according to the liberal watchdog group Media Matters for America. Abroad, Facebook has been used by politicians to promote the harming of Filipino citizens, the destruction of mosques and a genocide of the largely Muslim Rohingya in Myanmar.

Incitement is, as they say in Silicon Valley, a feature, not a bug.

Facebook and other social media sites' caution about taking down posts or accounts in democratic elections may be understandable, but prominent people are more likely to be believed, which is why the company's standards should be higher for them, not the other way around. There is a growing body of evidence that, far from being dispassionate, Facebook's software algorithms are designed to amplify and more broadly spread untrustworthy or extreme content, which is essential to keeping users on the site longer, where they can see more lucrative advertisements.

But removing accounts that repeatedly violate the social media sites' norms has proved to be an effective way to stop the spread of hateful or dishonest content. Hoping to have their cake and eat it too, Facebook and Twitter tried labelling problematic posts with warnings and links to other sites, which few people notice, while doing little to stop the posts' dissemination.

Ruling for a continued ban of Trump's account would also give the oversight board much-needed legitimacy, after it was given a narrow and squishy mandate. For instance, policy recommendations are considered advisory, which rankles some members. And the board for now has jurisdiction to rule only on whether posts were improperly removed, not on whether offending posts should be taken down (Facebook must reinstate posts the board rules should be restored).

That means the board could, for instance, agree that Trump's suspension is warranted, but lay out conditions or updated policies under which he could eventually return — all of which Facebook could choose to ignore. A forceful ruling, backed by Facebook management, would ensure that additional account removals, such as that of the Myanmar military, would be protected.

There are legitimate concerns over whether the suspension of Trump reflects the immeasurable power amassed by technology firms to control and guide public discourse. There is rare bipartisan agreement that companies like Facebook and Google must be subject to greater regulation and, perhaps, even split up.

But the law is clear that Facebook is exercising its own First Amendment rights to regulate speech on its own site, including from the president. Sadly, it took four years of Trump's divisive posts and bald attempts to undermine our democracy — not to mention a new administration — for Facebook to act on that.

The Facebook oversight board has the opportunity to defend the sanctity of the democratic process and draw a line in the sand for those who, like Trump, would undermine it with false claims that an election was stolen or fraudulent. Upholding the former president's social media ban would go a long way toward achieving that.

©2021 The New York Times Company