
Beyond the Editorial Analogy: The Future of the First Amendment on the Internet

Can the government regulate how social media companies moderate their platforms?

[Illustration: the preamble of the U.S. Constitution in a pop-up balloon]

The First Amendment to the U.S. Constitution prohibits the government, but not private actors, from “abridging the freedom of speech.” State governments have enacted legislation that would apply First Amendment-style protections to large social media platforms. Later this year, the Supreme Court will decide whether such laws violate the platforms’ own First Amendment right to decide what content to host. The Court should strike down the challenged laws, but it should also recognize that the government has a strong interest in promoting access to private platforms. The future of free expression on the Internet may well depend on it.

The Laws and Lawsuits

In 2021, Texas and Florida enacted statutes that limit how social media platforms can moderate or otherwise restrict user speech. The Texas law forbids platforms from “discriminating” based on viewpoint when they “censor” users or content—for example, by suspending an account or reducing a post’s visibility to other users.a

Florida’s law provides its greatest protections to journalists and politicians: Platforms cannot “censor, deplatform, or shadow ban a journalistic enterprise based on the content of its publication or broadcast,” even if the platform applies its content rules “consistently.” The Florida law also forbids platforms from suspending a political candidate’s account for any reason, or even from limiting the visibility of content about a political candidate.b

NetChoice, an industry-supported group that advocates for free-market Internet policy, sued to prevent the new laws from going into effect. NetChoice won in Florida, where a federal appellate court held the state’s new restrictions on content moderation unconstitutional and blocked them from taking effect.c But the group lost in Texas, where another federal appellate court upheld the new restrictions.d The Supreme Court agreed to review both cases.

NetChoice argues that platforms such as Facebook, Google, TikTok, and the like have free speech rights of their own that the state moderation laws unconstitutionally violate. Specifically, NetChoice argues that when a platform takes some content down and leaves other content up, it is making an “editorial” choice that expresses the platform’s values. By requiring platforms to host user content to which they would otherwise object, NetChoice argues, the Florida and Texas laws violate the general First Amendment prohibition on the government compelling private entities to speak.

Background First Amendment Law

The concept of “editorial” First Amendment rights dates back to Miami Herald v. Tornillo, a 1974 case in which the Supreme Court struck down a law that gave political candidates a “right of reply” in newspapers that criticized their personal character or official record.e

The Tornillo Court reasoned that because newspapers could only print so much content, space given over to state-mandated candidate replies would crowd out a newspaper’s preferred speech and incentivize newspapers to tone down their political coverage to avoid triggering the cost and hassle of a mandatory candidate reply. More broadly, the Court held that, even setting aside limits on editorial capacity, a right-of-reply statute would infringe upon the “exercise of editorial control and judgment,” since a newspaper is “more than a passive receptacle or conduit for news, comment, and advertising.”

In the decades following Tornillo, the Supreme Court extended editorial First Amendment rights to other “publishers,” from parade organizers deciding who gets to marchf to public utilities that don’t want to include third-party advocacy in their customer mailings.g

But First Amendment editorial rights have their limits. Not everyone counts as an editor, as the Supreme Court concluded when it prevented a shopping mall from ejecting pamphleteers whose message the mall did not want to be associated with.h Similarly, when law schools attempted to ban military recruiters out of a desire to express their condemnation of the “Don’t Ask, Don’t Tell” policy then in place, the Court rejected the schools’ argument that they were exercising editorial discretion as to which recruiters to allow on campus.i

Even where the Court has recognized a party’s editorial role, it has nevertheless sometimes ruled against that party. In one case, cable-service providers claimed they had an editorial right to exclude local television channels from the subscription bundles they sold to customers. The Court agreed that bundling a cable package was an editorial activity for First Amendment purposes, but it then held that the government’s interest in preserving the availability of local television programming trumped that editorial interest.j

Despite the different outcomes these cases reach, they all reflect a concern for the free expression of the speakers and listeners who access the “editor’s” platform. Laws that diminish these speakers’ and listeners’ interests tend to get struck down. Laws that enhance these end users’ interests survive, even in the face of valid “editorial” claims.

Thus, to decide whether social media platforms have First Amendment editorial rights that the Texas and Florida laws impermissibly restrict, the Supreme Court will have to decide not merely whether the laws restrict the platform owners from doing what they want, but whether the laws enhance or restrict the free expression of the millions of platform users.

Effects on User Speech

The Florida and Texas laws justify restricting platforms’ editorial decisions on the ground of advancing the rights of platform users, both those who speak by publishing content and those who listen by consuming the content of others. The problem is not that such an argument is illegitimate, but that these specific laws are badly designed and will very likely backfire.

Encouraging censorship of controversial issues.

First, in requiring “neutral” content moderation, the laws encourage platforms to take down more user content rather than less.

Consider the Texas law, which requires platforms to refrain from discriminating on the basis of viewpoint. “Viewpoint” is a broad term; at least in First Amendment law, it includes a range of offensive and hateful speech: “The proudest boast of our free speech jurisprudence,” the Supreme Court has explained, “is that we protect the freedom to express ‘the thought that we hate.’”k

The problem is that “we” includes advertisers, who also hate this kind of content. Some companies may abandon a platform to avoid being publicly associated with content that threatens their “brand safety.” And even those companies that continue to do business with the platform will at least look for ways to avoid running ads next to posts that contain offensive content. What this means for any platform that sells ads is that every moment spent “protecting the freedom to express the thought [consumers and advertisers] hate” is a missed opportunity to make money.

And so, instead of allowing content on “both sides” of a controversial issue—race relations, abortion, or any other of the many fronts of the culture war—platforms may simply ban any discussion of such topics. In satisfying their obligation of “neutral” content moderation, they will have harmed, not strengthened, online speech. And worst of all, Texas’ law gives the attorney general—a partisan elected official—the opportunity to selectively enforce this regime.

Driving users away.

A second way the laws could undermine free expression is if they work too well and transform platforms into content free-for-alls.

What critiques of content moderation often forget is that platforms moderate content not primarily out of some ideological crusade, but because it keeps users on the platform. In a world of practically infinite (and often offensive) content, content moderation, in the words of media scholar Tarleton Gillespie, “is, in many ways, the commodity that platforms offer.”l

What else would explain why a for-profit public company such as Meta spends $5 billion annually and employs 40,000 people to moderate content that it could otherwise monetize? Or why platforms such as X (formerly Twitter) that gut their moderation teams, whether to cut costs or out of ideological commitments, often see decreased user engagement?

We don’t mean to say that it is a problem for users to see a more diverse, even challenging range of content. Instead, the concern is that sloppy moderation drives users away. And when a platform’s user base shrinks, it loses its value to users as a tool for expression.

Here the Florida law is a potential disaster for free expression. In addition to requiring platforms to moderate users and their content “consistently” (a hopelessly vague term), it also tells platforms that they cannot “censor, deplatform, or shadow ban a journalistic enterprise based on the content of its publication or broadcast.” Content-neutrality is an even broader requirement than viewpoint-neutrality. It is not just that platforms have to treat all viewpoints on a given topic equally; they have to treat all topics equally as well.

This provision seems to guarantee that Floridians will see content from “journalistic enterprises” whether they want it or not—and the Florida law’s loose definition of “journalistic enterprise” ensures at least some content from fringe media will make it into the mix.

And what content will those fringe media outlets pump out? Consider the Supreme Court’s rule that hate speech represents the “viewpoint” on one side of a debate, while “those arguing in favor of racial, color, etc., tolerance and equality” stand on the other.m Platforms will have to carry the “debate,” in other words, between neo-Nazis and the people who oppose them, and all sides will be entitled to equal treatment. Normal people do not want to see this.

Disabling platforms from taking down harmful or toxic content will not create a free speech utopia; it will create a hellscape that many, if not most, people will flee from.

How the Court Should Rule

These flaws in the Florida and Texas laws are serious, and we would not shed a tear if the Supreme Court invalidated them. But it is important to emphasize why the laws are flawed: not because platforms in and of themselves have a First Amendment right to decide what they host, but because users have a First Amendment right not to have the government ruin social media platforms. The adage “hard cases make bad law” applies here: despite the Florida and Texas laws’ profound flaws, the Court should not rule overbroadly and give platforms more protection than would be good for society.

Specifically, a narrow decision striking down the laws should still hold out the possibility that other, more limited and carefully written laws could pass First Amendment muster. For example, we believe that the Florida law’s limitation on the moderation of politicians’ speech could, if properly scoped, be a net positive for free expression on platforms, given the small number of users it would apply to and the particularly high First Amendment value of electoral speech.

Others may disagree and draw the boundaries of permissible government regulation elsewhere. But the bigger point is this: The editorial analogy simplifies all this complex analysis by zooming out so far from the issues that all policy detail disappears and courts lose track of what free expression is even for. At this altitude, the First Amendment is not much more than an overpowered property right. That is what makes the editorial analogy such a popular deregulatory tool for private businesses that own huge sectors of the expressive infrastructure—and why the Court should approach it with caution.

a. H.B. 20 (Tex. 2021).
b. S.B. 7072 (Fla. 2021).
c. NetChoice, LLC v. Attorney General, Florida, 34 F.4th 1196 (11th Cir. 2022).
d. NetChoice, LLC v. Paxton, 49 F.4th 439 (5th Cir. 2022).
e. Miami Herald Publishing Co. v. Tornillo, 418 U.S. 241 (1974).
f. Hurley v. Irish-American Gay, Lesbian and Bisexual Group of Boston, 515 U.S. 557 (1995).
g. Pacific Gas and Electric Co. v. Public Utilities Commission of California, 475 U.S. 1 (1986).
h. PruneYard Shopping Center v. Robins, 447 U.S. 74 (1980).
i. Rumsfeld v. Forum for Academic and Institutional Rights, Inc., 547 U.S. 47 (2006).
j. Turner Broadcasting System, Inc. v. FCC, 512 U.S. 622 (1994).
k. Matal v. Tam, 582 U.S. 218, 246 (2017).
l. T. Gillespie, Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media (2018).
m. R.A.V. v. City of St. Paul, 505 U.S. 377, 391 (1992).
