
Curing the disinformation pandemic

Steven Roberts

Facebook is like Coca-Cola. Both companies want everyone to use their products; the last thing they want is any turmoil that gives customers a reason to leave the website or switch to Dr Pepper.

But here's the difference: Drinking Coke can't hurt you (unless, of course, you swill gallons of the stuff). Facebook, however, can poison your mind with toxic disinformation. As Margaret Sullivan of The Washington Post reports, Donald Trump "spread false information with devastating consequences for the country" in more than 1,400 Facebook posts in 2020.

Which raises a critical question: What should be done about polluters like Trump who use Facebook (and other social media platforms like Twitter) to contaminate the information environment? Should they be regulated? Banned entirely? If so, who makes those decisions, and what standards should they use?

Even out of office, Trump is forcing Facebook, and indeed the entire country, to look for answers. They are in very short supply, and here's why: The debate does not present a clear trade-off between good and bad. It involves a clash of two vital American values: the right to speak freely and the right to live safely. What's the proper balance between the two? How can both be preserved? As Shakespeare writes, "There's the rub."

For most of its existence, Facebook pretended there was no problem. We are like the phone company, insisted founder Mark Zuckerberg — a public utility that merely transmits information. We don't edit or control that information. That was always a ridiculous notion, and it became completely untenable as Facebook grew in power and reach. It now has more than 2.7 billion users.

"For better or worse, Facebook and its products are core to how many people around the world experience the internet," Jillian York of the Electronic Frontier Foundation writes in Politico.

But Zuckerberg did not want the responsibility of policing his own platform. Any decisions he made about limiting access would make some folks angry. Facebook, he was fond of saying, "should not be the arbiter of truth." Government should take that role. But a highly polarized Washington has been totally incapable of crafting effective regulations, leaving the social media field devoid of umpires or referees.

That was great for Trump, who used the platform to spread his disinformation during the 2016 campaign, unhindered by any restrictions. As his campaign manager Brad Parscale told "60 Minutes" on CBS: "I understood early that Facebook was how Donald Trump was going to win ... it was the highway in which his car drove on" (sic).

During Trump's presidency, Facebook remained a major weapon in his arsenal. Zuckerberg tentatively tried to apply minimal controls — some Trump posts received warning labels, a few were taken down — but nothing really impeded his flow of falsehoods.

So Zuckerberg tried something else, creating an independent oversight board filled with noted legal and media experts. It became operational last fall, and was designed to shoulder the "arbiter" role Facebook itself was desperate to avoid.

A few weeks later, Jan. 6 happened. Trump urged his followers to storm the Capitol, and Zuckerberg was forced to act. Even though Facebook users should have "the broadest possible access to political speech," he wrote, "the current context is now fundamentally different, involving use of our platform to incite violent insurrection."

Facebook banned Trump indefinitely (Twitter did permanently), and Zuckerberg tasked his new Oversight Board with reviewing the decision. The board upheld Trump's ban for six months, but bucked the final decision back to Facebook: "In applying a vague, standardless penalty and then referring this case to the Board to resolve," read the group's statement, "Facebook seeks to avoid its responsibilities."

Almost everyone was unhappy. "Censorship," cried Trump and his supporters. "Cowardice," replied the liberals who wanted Trump banned forever.

As for Zuckerberg, the board pegged him exactly right. He's been trying to avoid responsibility for years — and in some ways, that's a good thing. Should a 36-year-old with absolutely no credentials as an ethical or editorial expert be the "arbiter of truth" for 2.7 billion Facebook users? The answer is clearly no.

But is government a better option? Should politicians be making those judgments? The whole idea leaves me queasy. Some version of an oversight board, a Supreme Court-like body that renders dispassionate, independent judgments that reconcile free speech and public safety, is probably the best alternative.

But as a country, as a global community, we are far from finding a vaccine for the disinformation pandemic. No one wants to make unpopular decisions. Everyone wants to keep selling Coke.

(EDITOR'S NOTE: Steven Roberts teaches politics and journalism at George Washington University. He can be contacted by email at stevecokie@gmail.com.)
