Today’s Supreme Court Hearing Addresses a Far-Right Bogeyman

For years, government agencies have flagged misinformation and harmful content to platforms. The Supreme Court’s ruling in Murthy v. Missouri could change all that.

Today, the US Supreme Court will hear a case that will determine whether the government can communicate with social media companies to flag misleading or harmful content—or talk to them at all. And much of the case revolves around Covid-19 conspiracy theories.

In Murthy v. Missouri, the attorneys general of Louisiana and Missouri, along with several individual plaintiffs, argue that government agencies, including the Centers for Disease Control and Prevention (CDC) and the Cybersecurity and Infrastructure Security Agency (CISA), coerced social media platforms into censoring speech related to Covid-19, election misinformation, and the Hunter Biden laptop conspiracy, among other topics.

In a statement released in May 2022, when the case was first filed, Missouri attorney general Eric Schmitt alleged that members of the Biden administration “colluded with social media companies like Meta, Twitter, and YouTube to remove truthful information related to the lab-leak theory, the efficacy of masks, election integrity, and more.” (The lab-leak theory has largely been debunked, and most evidence points to Covid-19 originating from animals.)

While the government shouldn’t necessarily be putting its thumb on the scale of free speech, there are areas where government agencies have access to important information that can—and should—help platforms make moderation decisions, says David Greene, civil liberties director at the Electronic Frontier Foundation, a nonprofit digital rights organization that filed an amicus brief in the case. “The CDC should be able to inform platforms, when it thinks there is really hazardous public health information placed on those platforms,” he says. “The question they need to be thinking about is, how do we inform without coercing them?”

At the heart of Murthy v. Missouri is that question of coercion versus communication: whether any communication from the government at all amounts to coercion, or “jawboning.” The outcome of the case could radically reshape how platforms moderate their content and what kind of input or information they can use to do so, which in turn could have a major impact on the proliferation of conspiracy theories online.

In July 2023, a federal judge in Louisiana consolidated the initial Missouri v. Biden case with another suit, Robert F. Kennedy Jr., Children's Health Defense, et al v. Biden, forming what is now Murthy v. Missouri. The judge also issued an injunction barring the government from communicating with platforms. The 5th Circuit Court of Appeals later modified the injunction, carving out some exceptions, particularly for third parties, such as the Stanford Internet Observatory, a research lab that studies the internet and social platforms, flagging content to platforms.

Children’s Health Defense (CHD), an anti-vaccine nonprofit, was formerly chaired by presidential candidate Robert F. Kennedy Jr. The group was banned from Meta’s platforms in 2022 for spreading health misinformation, such as the false claim that the tetanus vaccine causes infertility, in violation of the company’s policies. A spokesperson for CHD referred WIRED to a press release containing a statement from the organization’s president, Mary Holland: “As CHD’s chairman on leave, Robert F. Kennedy Jr. points out, our Founding Fathers put the right to free expression in the First Amendment because all the other rights depend on it. In his words, ‘A government that has the power to silence its critics has license for any kind of atrocity.’”

Different arms of the government have been in touch with social media companies for years, particularly regarding threats to elections or emergencies, like the Covid-19 pandemic. In the wake of Russian interference in the 2016 presidential election, Meta CEO Mark Zuckerberg said that the company was “actively working with the US government on its ongoing investigations.”

In a statement before the House Judiciary Committee in 2020, FBI director Christopher Wray noted the ways in which terrorist organizations and foreign countries could weaponize social media to spread disinformation and undermine trust in democratic institutions. “Over the last year, the FBI has met with top social media and technology companies several times, provided them with classified briefings, and shared specific threat indicators and account information, so they can better monitor their own platforms,” he said at the time.

But there’s a difference between providing information and requiring content moderation. The US solicitor general’s brief even notes that the plaintiffs were unable to “point to any evidence that the government ever imposed any sanction when platforms declined to moderate content the government had flagged.”

The case could not be surfacing at a more critical moment. Generative AI has only amplified existing election threats when it comes to disinformation, even as tech companies have shrunk their trust and safety teams and rolled back some of their earlier protections.

Senator Mark Warner of Virginia has highlighted the Murthy v. Missouri case as one of four major threats to election integrity going into the 2024 presidential election. “All throughout the Trump administration there was voluntary sharing [between the government and social media platforms], so if the NSA or CISA found evidence of foreign malign influence, that could be shared,” said Warner.

David Greene, of the Electronic Frontier Foundation, says the most likely outcome is that the court will issue a new test for assessing whether the government was, in fact, coercive toward social platforms, and send the case back down to a lower court to rule on again.

“I think the Supreme Court will look at the nature of how the information is transmitted rather than what the content is itself,” he says.