This is more than a free speech crisis

It’s a humanity crisis, says Matt Taibbi

Matt Taibbi, a journalist who has been investigating how the FBI and other US authorities influenced Twitter during the pandemic to uphold a certain “narrative”, spoke recently to a Free Speech gathering in London. Below is an excerpt of what he said. The full text is here.

[. . .]

But after looking at thousands of emails and Slack chats, I first started to get a headache, then became confused. I realized the old-school Enlightenment-era protections I grew up revering were designed to counter authoritarianism as people understood the concept hundreds of years ago, back in the days of tri-cornered hats and streets lined with horse manure.

What Michael and I were looking at was something new, an Internet-age approach to political control that uses brute digital force to alter reality itself. We certainly saw plenty of examples of censorship and de-platforming and government collaboration in those efforts. However, it’s clear that the idea behind the sweeping system of digital surveillance, combined with thousands or even millions of subtle rewards and punishments built into the online experience, is to condition people to censor themselves.

[. . .]

But it goes deeper. Michael and I found correspondence in Twitter about something called the Virality Project, which was a cross-platform, information-sharing program led by Stanford University through which companies like Google, Twitter, and Facebook shared information about Covid-19.

They compared notes on how to censor or deamplify certain content. The ostensible mission made sense, at least on the surface: it was to combat “misinformation” about the pandemic, and to encourage people to get vaccinated. When we read the communications to and from Stanford, we found shocking passages.

One suggested to Twitter that it should consider as “standard misinformation on your platform… stories of true vaccine side effects… true posts which could fuel hesitancy” as well as “worrisome jokes” or posts about things like “natural immunity” or “vaccinated individuals contracting Covid-19 anyway.”

This is straight out of Orwell. Instead of having “ambiguities” and “shades of meaning” on Covid-19, they reduced everything to a binary: vax and anti-vax.

They eliminated ambiguities by looking into the minds of users. In the Virality Project, if a person told a true story about someone developing myocarditis after getting vaccinated – even if that person was just telling a story, even if they weren’t saying, “The shot caused the myocarditis” – the Virality Project just saw a post that might “promote hesitancy.”

So this content was true, but it was politically categorized as anti-vax, and therefore as misinformation – untrue.

[. . .]

“This continual process of seeding doubt and uncertainty in authoritative voices,” wrote Graphika, in a report sent to Twitter, “leads to a society that finds it too challenging to identify what’s true or false.”

It was the same with someone who shared true research about the efficacy of natural immunity or suggested that the virus came from a lab. It all might be factual, but it was politically inconvenient, something they called “malinformation.” In the end, out of all of these possible beliefs, they derived a 1984 binary: good and ungood.

They also applied the binary to people.

This was new. Old-school speech law punished speech, not the speaker. As a reporter I was trained that if I committed libel, if I wrote something defamatory that caused provable injury to someone, I would have to retract the error, admit it, apologize, and pay damages. All fair! But the court case wouldn’t target me as a person. It wouldn’t assume that because I was wrong about X, I would also be wrong about Y and Z.

[. . .]

One last note. As Michael and I found out recently with regard to the viral origin story, things deemed politically good often turn out to be untrue, and things deemed ungood turn out to be true.

I can recite a list if need be, but many news stories that authorities were absolutely sure about yesterday later proved totally incorrect. This is another characteristic Orwell predicted: doublethink.

He defined doublethink as “the act of holding, simultaneously, two opposite, mutually exclusive ideas or opinions and believing in both simultaneously and absolutely.”

Not long ago we were told in no uncertain terms the Russians blew up their own Nord Stream pipeline, that they were the only suspect. Today the U.S. government is telling us it has known since last June that Ukrainian forces planned it, with the approval of the highest military officials. But we’re not expected to say anything. We’re expected to forget.

What happens to a society that doesn’t square its mental books when it comes to facts, truth, errors, propaganda and so on? There are only a few options. Some people will do what some of us in this room have done: grow frustrated and angry, mostly in private. Others have tried to protest by frantically cataloging the past.

Most however do what’s easiest for mental survival. They learn to forget. This means living in the present only. Whatever we’re freaking out about today, let’s all do it together. Then when things change tomorrow, let’s not pause to think about the change, let’s just freak out about that new thing. The facts are dead! Long live the new facts!

We’re building a global mass culture that sees everything in black and white, fears difference, and abhors memory. It’s why people can’t read books anymore and why, when they see people like Russell who don’t fit into obvious categories, they don’t know what to do except point and shriek, like extras in Invasion of the Body Snatchers.

We have been complaining about censorship, and it’s important to do that. But they are taking aim at people in a way that will make censorship unnecessary, by building communities of human beings with no memory and monochrome perception. This is more than a speech crisis. It’s a humanity crisis. I hope we’re not too late to fix it.