Opinion

Free Speech vs. Free Ride: Navigating the Supreme Court’s Social Media Paradox

Regulating platforms.

The Supreme Court has an analogy problem. Are social media more like publishers, who have free speech editorial rights and liability for their decisions? Or are they more like common carriers, who serve everyone and hold no liability for what their users post? The need for content moderation inclines the Court toward the publisher analogy,2 but the trade-off is hard, so hard that the Court just sent a second pair of cases back for the lower courts to clarify.

Social media firms want the best of both. They want publishers’ freedom to exclude users and control editorial decisions, but common carriers’ protection from liability for user content. They argue the former in Moody v. NetChoicea and NetChoice v. Paxton,b where Florida and Texas sought to prevent them from discriminating against conservative speech. They argue the latter in Twitter v. Taamnehc and Gonzalez v. Google,d where they stand accused of failing to take meaningful action to thwart terrorists who use their services to “recruit members, plan … attacks, [and] issue … threats.” No other industry is so privileged—free to decide how it operates yet free of the consequences of those decisions. Unlike the print and broadcast industries, and most others, they do not even incur production costs; you and I, as users and content creators, do.

This has created an aura of impunity rejected by other nations that face the same conundrum, balancing free speech against freedom from consequence. France arrested Pavel Durov, founder of the social media platform Telegram, on charges of facilitating drug trafficking, money laundering, and distribution of child sexual abuse material (CSAM), and of refusing to provide data on perpetrators to authorities.9 Japan is crafting laws to hold Facebook accountable for real ads that use deepfaked celebrities to scam users out of their savings.7 Brazil blocked access to X (Twitter) for failing to shut down accounts that threatened judges, promoted insurrection, and denied the country’s last election.8 Like Telegram, X (Twitter) had ignored judicial requests for user data. Legitimate concerns argue for protecting speech against political interference, yet no country’s laws, not even the U.S. First Amendment, protect speech used in the commission of a crime.

A third Supreme Court analogy—the one preferred in the NetChoice decisions—is the “marketplace of ideas.” It holds up competition as the best test of truth, but it raises a new question: when do markets fail? One failure is monopoly. Social media platforms are not just speakers in the market; they are the market. Across the Western world, three of the top five user bases are governed by a single social media firm.e We do not let Amazon disadvantage competing products on Amazon. We should not let Facebook disadvantage competing ideas on Facebook. The traditional remedy for press monopoly, launching a competing press, just isn’t an option when network effects protect incumbent social media. Mighty Google has entered social media at least three times and failed.6

Another failure in the marketplace of ideas is negative externality, the damage inflicted indirectly on others. In classic terms, this is pollution: foul air, poisoned water, contaminated soil. In social media terms, it manifests as insurrection, lynching, suicide, sex trafficking, drug trafficking, child exploitation, judicial intimidation, and terrorist recruiting—damage that occurs off-platform and that social media firms do not themselves experience. Like the industrial giants of a century ago, today’s Internet giants have sought to avoid the pollution costs they impose on the rest of us.

The Court has made clear, and rightly so, that government is not the answer. As Justice Kagan wrote in the NetChoice decisions, “it is no job for government to decide what counts as the right balance of private expression—to ‘un-bias’ what it thinks is biased, rather than to leave such judgments to speakers and their audiences.”f

A fourth analogy—a city parade—suggests an answer: a way to design better policy, if Congress and the courts will act. This analogy emphasizes listener rights.

Social media firms put forward a powerful free speech case backed by Supreme Court precedent. In Hurley,g the Court found that a city or state cannot force the private organizers of a parade to grant participation to parties whose message the organizers find distasteful. This, platforms argue, gives them the right to edit or exclude. Hurley is essential law, yet platforms misapply the analogy. Users arrive at Facebook or Telegram not to hear from Facebook or Telegram but to hear from other users. Influencers organize their content parades not for Facebook or Telegram but for their followers. The platform provides the streets and the city park. As traffic cop and sanitation engineer, the platform issues warnings, removes bad drivers, and clears the mess. But the city, as we know, must not censor its citizens. The lesson of Hurley, then, is that users get to organize their own parades without Facebook or Telegram interference.

Kagan’s observation is correct and profound. The right analogy needs only a shift in perspective. Judgment must be left to “speakers and their audiences.” Users are social media’s speakers, and users are social media’s audiences; it is they who should decide.

Both NetChoice cases hinged on suppressing conservative speech, but suppressing speech on either side is wrong. When President Joe Biden stepped aside and named Vice President Kamala Harris his nominee in the 2024 presidential race, X (Twitter) blocked new followers from viewing Harris’ messages. This prompted Congressman Jerrold Nadler to inquire whether Musk, as X (Twitter) owner, had moved to throttle her followers.h Musk has endorsed Trump, her political opponent, and has even shared an unlabeled fake video that used her own voice to mock her, in apparent violation of X (Twitter) policy.1 The throttling appears to have been technical rather than deliberate5—the massive surge in legitimate interest in Harris seems to have tripped safeguards against bot followers—but it raises a deeper concern. The extreme view, of platform as publisher, entitles Musk to throttle new access to Harris. It would even let Musk deliberately cleave Harris from her existing followers. Allowing a platform to separate a speaker, left or right, from listeners who have elected to hear that speaker cannot serve free speech interests.

A platform established to enable free association that forbids free association based on viewpoint is a contradiction in First Amendment terms.

If Congress and the courts grant listeners the right to choose their speakers, the common carrier analogy applies. Anyone can post any legal content. No one is excluded and no viewpoint disadvantaged. Listener rights promote autonomy and equality, freedom to explore and freedom from undue influence.3 At the same time, listeners gain the right to choose their own organizing principle to reduce their pollution costs. They could choose a filter supplied by the BBC or Breitbart, one offered by a startup, one from Facebook or Telegram, all of the above, or none at all. Reddit already implements a version of user choice at the group level, not the individual level, allowing different subreddits to exercise different content policies; under listener rights, individual listeners gain the choice. Social media platforms would then have no content liability, but only in exchange for allowing a true marketplace of access, of filters, and of ideas. Competition among filters addresses the moderation problem. People who want safe spaces can have them. People who want rough-and-tumble spaces can have them, side by side. Competition among filters also addresses the monopoly problem.
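To make the mechanics concrete, here is a minimal sketch of listener-chosen filtration in Python. Everything in it—the filter names, the flagged terms, the stacking rule—is an illustrative assumption, not any platform’s actual design.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Post:
    author: str
    text: str

# A filter is any predicate over posts; the platform, a startup, the BBC,
# or Breitbart could each publish one. (Illustrative, not a real API.)
Filter = Callable[[Post], bool]

def family_safe(post: Post) -> bool:
    """Hypothetical 'safe space' filter: drop posts with flagged terms."""
    return not any(term in post.text.lower() for term in ("spam", "scam"))

def open_forum(post: Post) -> bool:
    """Hypothetical 'rough-and-tumble' filter: pass all legal content."""
    return True

def render_feed(posts: list[Post], chosen: list[Filter]) -> list[Post]:
    """Show a post only if it passes every filter this listener stacked."""
    return [p for p in posts if all(f(p) for f in chosen)]

# The listener, not the platform, decides which filters apply.
feed = render_feed(
    [Post("alice", "Great talk today"), Post("eve", "Free crypto, no scam!")],
    chosen=[family_safe],
)
print([p.author for p in feed])  # ['alice']
```

The design point is the last call: the choice of filters belongs to the individual listener, so two listeners on the same platform can see very different feeds without either imposing a policy on the other.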

What distinguishes common carrier platforms from publishing platforms? Focusing on listener rights—not just those of speakers—again suggests a workable test. What do users consume? If the preponderance of users’ consumption is produced by other users—people or identities they have chosen to follow—the common carrier analogy applies. Think Facebook, X (Twitter), Telegram, and Snapchat. By contrast, if platform editors, algorithms, or anonymous curators produce the preponderance of content consumed, the publisher analogy applies. Think Google Search, Wikipedia, Techdirt, and Yelp. Platforms can choose one or the other by producing more or less original content themselves.
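As a sketch of how that consumption test might be operationalized (the source categories and the 50% threshold here are assumptions for illustration, not part of the column’s proposal):

```python
def classify_platform(consumption_shares: dict[str, float]) -> str:
    """Apply the listener-rights test: who produces what users consume?

    `consumption_shares` maps content sources to their share of total
    user consumption (shares sum to 1.0). The category names and the
    50% threshold are illustrative assumptions.
    """
    from_followed_users = consumption_shares.get("followed_users", 0.0)
    return "common carrier" if from_followed_users >= 0.5 else "publisher"

# A feed where most consumption comes from accounts users chose to follow
# looks like a common carrier; a curated feed looks like a publisher.
print(classify_platform({"followed_users": 0.8, "platform_curated": 0.2}))
```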

Those who produce filters gain publisher privileges. They have full editorial control. For them, content moderation is essential. Otherwise, spam makes a forum unusable. Harassment makes that forum unlivable. Publishers need the right to edit content, and we need them to exercise that right.

But what if publishers exercise that right irresponsibly, enabling illegal content or, as in the case of 8chan, promoting violence associated with killings in El Paso, TX, Christchurch, NZ, and Poway, CA?10 What happens when speech is used to commit crime? Platforms claim content moderation at scale is impossible.4 They want freedom from liability, despite their choices, because filtering 500+ million daily messages is hard. Here the marketplace analogy yields the answer: treat the problem as pollution. Digital filters, like their mechanical forebears, can be held accountable on a flow-rate basis. By analogy to factory effluent, we simply take statistical samples. CNN, Fox News, or this journal would be liable for publishing ads recruiting terrorists, so they edit them out. If Facebook or X (Twitter) or Telegram were liable above a certain proportion, they too would edit them out. They would not be liable if they had in good faith caught, say, 90% or 95% or 99.9%. Social media platforms can be held liable under terms print publishers and media broadcasters already face, but on a flow-rate basis rather than for each and every post. A doctor does not check cholesterol by taking all your blood; the doctor takes a statistical sample. This solves the pollution problem. Filter producers can declare their filtration rates. Users can hold them accountable in contract or tort—no government is necessary. The marketplace of ideas becomes self-cleaning, based on the choices of free market participants.
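As an illustration of flow-rate accountability, here is a minimal sampling audit in Python. The sample size, the fixed seed, and the `review` oracle are assumptions for the example; a real audit would define all three by regulation or contract.

```python
import random
from typing import Callable

def audit_flow_rate(posts: list[str], review: Callable[[str], bool],
                    sample_size: int = 1000, seed: int = 0) -> tuple[float, float]:
    """Estimate the share of posts a filter let through in violation of
    policy, from a random sample rather than from every post.

    `review(post)` stands in for a human or automated check of one post.
    """
    rng = random.Random(seed)
    sample = rng.sample(posts, min(sample_size, len(posts)))
    violations = sum(1 for post in sample if review(post))
    rate = violations / len(sample)
    # 95% normal-approximation confidence interval for the true rate.
    margin = 1.96 * (rate * (1 - rate) / len(sample)) ** 0.5
    return rate, margin

# A platform declaring a 99.9% catch rate would be accountable if the
# sampled violation rate significantly exceeded 0.1% of message flow.
```

Like the cholesterol test, the point is that a sample of a few thousand posts bounds the violation rate of a 500-million-message flow without inspecting every message.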

The point is not that social media are common carriers or that they are publishers. At present they are both, and they are neither. Rather, the point is that social media must choose one or the other, not both, and that the courts and Congress must hold them accountable for that choice. Recognizing listener rights, and not just the rights of speakers, clarifies which analogy applies, and it makes the social media market fairer not only for listeners and speakers but for print and broadcast media as well.

References

1. Bensinger, K. On X, Musk shares a manipulated video of Harris. The New York Times (Jul. 27, 2024).
2. Bhagwat, A. Why social media platforms are not common carriers. J. Free Speech Law 122 (2022).
3. Grimmelmann, J. Speech engines. Minn. L. Rev. 868 (2014).
4. Masnick, M. Content moderation at scale is impossible to do well. Techdirt (Nov. 20, 2019); https://bit.ly/4e28wIM
5. Masnick, M. No, Elon isn’t blocking Kamala from getting followers. Techdirt (Jul. 24, 2024); https://bit.ly/3TqX416
6. Miller, C.C. Another try by Google to take on Facebook. The New York Times (Jun. 28, 2011); https://bit.ly/4eqzdGQ
7. Mochizuki, T., Duan, E., and Hasebe, Y. Facebook scams demand stricter online rules, Japan lawmaker says. Bloomberg (Apr. 26, 2024); https://bit.ly/4e4Rh9I
8. Pessoa, G.S. and Ortutay, B. What to know about Elon Musk’s ‘free speech’ feud with a Brazilian judge. AP News (Apr. 11, 2024); https://bit.ly/3zliHca
9. Stargardter, G. and Hummel, T. French authorities charge Telegram’s Durov in probe into organized crime on app. Reuters (Aug. 28, 2024); https://bit.ly/3MOYZJ0
10. Thompson, B. A framework for content moderation. Stratechery (Aug. 7, 2019); https://bit.ly/3zgUaVN
