Trump-Appointed Judge's Ruling Promotes Disinformation by Undermining Social Media Content Moderation
It is telling that, on the nation’s birthday, a Trump-appointed judge handed down a ruling in support of a phenomenon that’s working to undermine democracy.
U.S. District Judge Terry Doughty issued a preliminary injunction that prohibits federal agencies and officials from communicating with social media companies to flag problematic, potentially dangerous content posted on their platforms. It is the latest development in a long-running lawsuit filed by Republican attorneys general in several states, which alleges that the administration's efforts to slow the spread of disinformation violate the First Amendment protections of those who peddle it. The administration appealed the ruling on Wednesday.
While I am tempted to view the injunction as one that will lead to more autonomy being put in the hands of social media companies and open the door wider for advocacy groups that seek solutions to the disinformation epidemic, I don’t think it’s that. I think it’s a partisan decision that signals to pro-disinformation interests that they should use the courts as an instrument to put more pressure on social media companies to stop moderating content altogether.
Recall that the Trump administration issued an executive order designed to erode the immunity social media companies have from content posted on their networks. The order was ineffective and did not change existing precedent, but it did telegraph the GOP’s desire to force social media companies to end content moderation.
The Doughty injunction is, potentially, another way for them to get there.
It is dangerous because it further accelerates the false equivalence between actual censorship, in which government entities use their power to suppress speech, and content moderation, in which private companies make business decisions that keep their platforms enjoyable. These platforms are not, in the words of Twitter owner Elon Musk, town squares. Town squares are public places. The platforms are bars, which can toss out unruly and disruptive patrons whenever they want. The injunction challenges their ability to do that.
My understanding is that, if you ask them off the record, social media executives, with the obvious exception of Musk and Twitter CEO Linda Yaccarino, want the government’s involvement in how platforms are moderated. Their investors certainly do, because disinformation, particularly when deployed at scale, ruins the health of the dialogue on those platforms and harms revenue. We’ve seen that with Twitter.
That’s because the product social media companies ship isn’t an app to post things on. The product is the moderation. It is what defines the user experience for both people and advertisers.
The argument that the injunction implies—that disinformation is lawful speech, and that its removal by a private company constitutes the unconstitutional suppression of speech by the government—is wrong and dangerous. If it becomes settled law, then social media companies are going to be forced to stop moderating, which is the game plan here. The higher court's response to Doughty's injunction will tell us whether this plot is going to work.
Disinformation is not democratic. Dissent is democratic. Disinformation is how weak people cling to power when they can’t win on the basis of their ideas. And it is not enough for brands to promote accurate information. They also need to actively fight disinformation. Sliding into a post-truth world would, among a bunch of other bad things, make it impossible for corporations to conduct business with any certainty or predictability.
What can companies do about all of this? They can leave platforms that fail to institute clear and consistently enforced moderation that removes disinformation and de-platforms repeat offenders. They can give money to nonprofits that research disinformation sources as well as ways to suppress it. And they can promote the work of these groups and provide non-monetary resources.