
Do Social Media Platforms Have Free Speech Rights to ‘Censor’ Conservatives?

Can states force social media platforms to stop removing lawful-but-awful postings?


Fed up with major social media platforms’ frequent decisions to remove or deprioritize postings by Florida and Texas conservatives, their state legislators passed laws forbidding these platforms from “censoring” their users’ online postings.

NetChoice, an industry organization whose membership includes many social media firms and whose mission is to make the Internet safe for free enterprise and free expression, and the Computer & Communications Industry Association (CCIA), sued the attorneys general of those states (Ashley Moody in Florida and Ken Paxton in Texas, officials responsible for enforcing these laws). NetChoice and CCIA asked federal courts in Florida and Texas to declare these laws unconstitutional under the First Amendment to the U.S. Constitution. (To keep things simple, the remainder of this column speaks only of NetChoice.)

Trial court judges in both cases agreed with the First Amendment challenges. In the Moody case, an appellate court largely affirmed the lower court’s ruling. However, in Paxton, a different appellate court reversed the lower court and upheld the Texas law. It agreed with Paxton that the Texas anti-censorship law protected the First Amendment interests of Texas conservatives.

Moody and NetChoice petitioned the U.S. Supreme Court to review the appellate court rulings against them. Because these federal courts had adopted inconsistent interpretations of the First Amendment as applied to social media platforms, the Supreme Court decided to hear both appeals.

The Court’s decision addressed both constitutional challenges in one opinion. The Court sent the cases back to lower courts for further proceedings consistent with its decision. Justice Kagan’s opinion for the Court recognized that social media platforms, like newspapers, have First Amendment rights to exercise editorial discretion about what content to publish or withhold from publication.

Although NetChoice won a significant victory in the Court’s ruling, it will have to produce a stronger evidentiary record to support its First Amendment challenge and address the scope of the laws.

Florida’s and Texas’ Social Media Anti-Censorship Laws

The Florida and Texas laws have in common that they prohibit large social media platforms from “censoring” their users’ online postings. Both laws define the term “censor” broadly. The Florida law, for example, says it includes any action to “delete, regulate, restrict, edit, alter, inhibit publication or republication of, suspend a right to post, remove, or post addendum to any content or material posted by a user.” The Texas statute defines “censor” to mean “to block, ban, remove, deplatform, demonetize, de-boost, restrict, deny equal access or visibility to, or otherwise discriminate against expression.” The Florida law also bars deplatforming of Florida politicians. Violating these anti-censorship rules would put the platforms at risk of costly litigation and sanctions.

Both laws characterize large social media platforms (a term defined somewhat differently in each law) as “common carriers.” When governments designate certain services, such as telephonic and railroad carriage, as common carriers, they impose non-discrimination obligations on providers of those services. (That is, common carriers must offer their services to all comers on the same terms and cannot deny services to those with whose views they may disagree.)

The Florida and Texas laws also require the social media platforms to disclose various details about their operations and content moderation policies and practices.

Relevant First Amendment Principles

While the Justices did not opine on the constitutionality of either law, Justice Kagan’s opinion for the Court articulated three core First Amendment principles that the lower courts should heed when reconsidering the NetChoice challenges.

The first is that the First Amendment applies whenever a private actor compiles and curates the speech of others. It also applies when that actor decides to exclude some content from the compilation or to order the contents in a particular way.

If government officials interfere with the exercise of private actors’ editorial judgments, courts will find the interference unconstitutional. The Court indicated that social media platforms’ enforcement of their content moderation rules (for example, against hate speech or bullying) by removing certain content or deprioritizing it is an exercise of First Amendment protected editorial judgment.

A second principle is that the first principle applies even if social media companies typically publish very large volumes of content and remove very few submissions. (Florida and Texas argued that social media platforms do not exercise editorial judgment because they overwhelmingly allow postings to stay up on their sites and do not explain takedowns.)

A third principle is that governments cannot justify regulating social media content moderation by asserting that they have legitimate interests in balancing points of view expressed by those media or their contributors. “However imperfect the private marketplaces of ideas” may be, the government cannot coerce private actors that host others’ speech “to provide more of some views or less of others.”

In support of these propositions, the Court cited its prior ruling in Miami Herald v. Tornillo, which struck down a Florida “right to reply” law. Even though the Miami Herald was a powerful newspaper that legislators thought presented unbalanced views on some issues, the First Amendment precluded Florida from forcing the newspaper to publish critics’ letters to the editor.

Hence, the Court in Moody ruled that neither Florida nor Texas can justify its anti-censorship law by claiming that large social media platforms are treating conservatives unfairly because their postings are often removed or deprioritized.

Viewpoint Regulation Is Constitutionally Suspect

The Court’s Moody opinion characterized the Texas law as a constitutionally suspect “viewpoint-based regulation” of speech. “If the Texas law is enforced, the platforms could not—as they do now—disfavor posts because they support Nazi ideology; advocate for terrorism; espouse racism, Islamophobia, or anti-semitism; glorify rape or other gender-based violence; encourage teenage suicide or self-injury; discourage the use of vaccines; advise phony treatments for diseases; advance false claims of election fraud.” Some Internet commentators call these types of content “lawful-but-awful” speech.

The Florida and Texas anti-censorship laws are aimed at preventing social media platforms from taking down lawful-but-awful speech posted by conservative commentators. Neither law would affect social media platforms’ decisions to take down user-posted content that violates the law, such as defamatory postings or threats to kill a specific individual.

As deplorable as much lawful-but-awful speech may be, governments in the U.S. cannot regulate it because the First Amendment protects that speech. (In many other countries, hate speech can be outlawed.) Privately owned social media platforms, by contrast, do not have to worry about the First Amendment when they decide to remove or deprioritize certain types of user postings. If their community guidelines or terms of service forbid such speech, platforms are free to remove or otherwise limit access to or label lawful-but-awful speech without First Amendment constraints.

Facial or As Applied Constitutional Challenges?

NetChoice decided to make what lawyers call a “facial challenge” to the constitutionality of Florida and Texas laws. When lawsuits allege that a law is unconstitutional “on its face,” they must show that there are few, if any, circumstances under which the law can be applied consistent with the Constitution. Sometimes lawsuits challenge a law’s constitutionality “as applied” to some persons under specified circumstances.

The Supreme Court’s 1997 decision in Reno v. ACLU is an example of a successful facial challenge to a law that regulated speech on the Internet. The Communications Decency Act (CDA) would have imposed criminal liability on anyone who posted content on the Internet that was indecent or obscene.

While courts have ruled that governments can constitutionally regulate content that is indecent as to children, they have also ruled that adults have constitutional rights to engage in indecent speech. The Court decided that the ACLU had provided ample evidence that this law, which Attorney General Janet Reno had been tasked to enforce, was unconstitutional on its face.

An “as applied” constitutional challenge might target a law that has a legitimate purpose and would be constitutional in most circumstances, but perhaps not in all. Consider a law that forbids littering in public parks. Although such a law would be constitutional as applied to those who throw their trash on the ground, the application of an anti-littering law might be unconstitutional if enforced against protestors who were distributing leaflets about their causes.

In deciding to make facial challenges to the Florida and Texas laws, NetChoice relied on Reno v. ACLU and other court decisions that have ruled that privately owned media entities, such as newspapers and cable television operators, have constitutional rights to decide whether or not to publish content authored by outsiders.

NetChoice argued that social media companies have the same constitutional rights as traditional media firms. So if newspapers have constitutional rights to exercise editorial discretion to decide against publishing, for example, a critical letter to the editor, social media firms should have equivalent rights to decide not to host content that, for example, violates their community guidelines.

Why Courts Are Reluctant to Grant Facial Challenges to Laws

The role of legislatures in democratic societies is to pass laws that legislators believe are in the best interests of their constituents. For the most part, courts presume that duly enacted laws are constitutional. Those who mount facial challenges must show that a substantial majority of applications of the targeted law would violate the First Amendment.

When the ACLU in the 1997 Reno case made a facial challenge to the CDA’s Internet indecency provision, it presented courts with a substantial body of evidence to show that this law would be unconstitutional as applied to many speakers in many contexts.

In the Moody case, by contrast, NetChoice sought a preliminary injunction to stop the law from going into effect without providing more than a few sworn statements to support its unconstitutionality claim. Although NetChoice provided more evidence of unconstitutionality in Paxton, all nine U.S. Supreme Court Justices believed that the evidence was too sparse to support a facial challenge to the Florida and Texas laws. This is the main reason the Justices sent the cases back to the lower courts.

Questioning the Scope of the Florida and Texas Anti-Censorship Laws

A second reason the Court sent the Moody and Paxton cases back to lower courts is to analyze the scope of potential applications of the Florida and Texas laws. These laws certainly apply to Facebook, YouTube, and Twitter/X. But do they apply to other platforms as well?

NetChoice, Moody, and Paxton made the strategic decision to frame their constitutional analyses as though the laws applied only to the biggest and best-known social media platforms. The trial and appellate courts did not challenge the litigants’ characterizations of the scope of the two laws. However, some of the Justices thought the laws’ statutory definitions of “social media platforms” might be broad enough to apply to other types of platforms or to some services available to users of the major platforms (such as direct messaging).

During oral argument, some of the Justices asked lawyers whether the Florida and Texas laws would apply to entities such as Etsy, Venmo, and other online services. They wondered about the First Amendment implications of these laws if their scope was broader than had been litigated thus far.

The Court criticized the litigants and lower courts for failing to consider the “full range of activities the laws cover” and to assess “the constitutional versus unconstitutional applications.” On remand, it directed the lower courts to assess the scope of these laws before assessing the First Amendment claims raised by the litigants.

Conclusion

A definitive ruling on the constitutionality of the Florida and Texas social media anti-censorship laws is likely several years away. Between now and then, the lawyers must do a great deal of work to compile an evidentiary record to support their views about the effects these laws would have in a wide variety of contexts.

NetChoice bears the burden of proving the unconstitutionality of the two state laws. That burden will not be easy to bear. But at least the Supreme Court’s Moody decision has recognized that social media platforms have constitutional rights to decide what user-posted content to take down or leave up without state-law interference.
