
Communications of the ACM


The Sand-Heap Paradox of Privacy and Influence


Former CACM Editor-in-Chief Moshe Y. Vardi

I look in the mirror every day. I look the same as the day before. No change. Then I have a Zoom call with a person I have not seen in many years, wonder how they got so old, and realize that the other person must be thinking the same! The human mind has always struggled to comprehend the cumulative impact of a large number of very small changes. This phenomenon was already the subject of classical Greek paradoxes: the sand-heap paradox and Zeno's paradoxes. When I was an elementary-school pupil, a favorite brain-twister was "How much is infinity times zero?"

We are facing the same paradox with respect to privacy and influence on the Internet. There are information items that we clearly want to protect, such as credit-card numbers. When such sensitive information is stolen via a cybersecurity breach, we clearly feel our privacy has been violated. But it is harder to feel a loss of privacy when we reveal a tiny bit of information at a time: a link clicked or a social-media posting "liked." Yet Internet companies have mastered the art of harvesting the grains of information we share with them, knowingly or unknowingly, and using them to construct sand heaps of information about us. Shoshana Zuboff, of Harvard University, named this business model "surveillance capitalism" in her 2019 book, The Age of Surveillance Capitalism.
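The arithmetic behind the sand heap is worth making concrete. Each grain of revealed information narrows the crowd you blend into by only a small factor, but the identifying bits add up multiplicatively. The following sketch illustrates this with hypothetical, purely illustrative probabilities (the fraction of users sharing each trait); the signal names and values are my own assumptions, not measured data.

```python
import math

# Hypothetical per-signal probabilities: the fraction of users sharing
# each tiny "grain" of revealed information. Values are illustrative only.
signals = {
    "browser + version": 0.05,
    "timezone": 0.04,
    "liked a niche page": 0.01,
    "clicked a local-news link": 0.02,
}

# Each grain alone narrows the crowd only slightly; -log2(p) is the
# number of identifying bits it contributes. Assuming independence,
# the bits simply add.
total_bits = sum(-math.log2(p) for p in signals.values())

for name, p in signals.items():
    print(f"{name}: {-math.log2(p):.1f} bits")
print(f"combined (if independent): {total_bits:.1f} bits")
print(f"enough to single out one person in about {2**total_bits:,.0f}")
```

Four innocuous-looking grains, each shared by 1%-5% of users, already yield over 21 bits, enough in principle to single out one person among millions. That is the heap no individual grain lets us see.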

Zuboff called surveillance capitalism "an assault on human autonomy" and "a threat to freedom and democracy." We all realize that Internet companies persuaded us to give up some privacy for the sake of convenience, but how much privacy have we given away? This is opaque to us. We see each grain of information given away, but not the heap of information. It is also opaque to us how this heap of information has been used by others not only to predict our behavior but also to influence and modify it. After the January 6, 2021 Capitol Insurrection in Washington, D.C., Zuboff wrote that "We can have democracy, or we can have a surveillance society, but we cannot have both."

The core issue, I believe, is that of human agency. Enlightenment thinkers downplayed divine authority and emphasized human agency. Rousseau wrote that "in the depths of my heart, traced by nature in characters which nothing can efface. I need only consult myself with regard to what I wish to do." Of course, we all know that the poet John Donne was right when he wrote "No man is an island entire of itself; every man is a piece of the continent, a part of the main." Our decisions and actions are clearly influenced by the social context. Yet, unless we feel coerced, we do not feel a loss of agency due to such social context.

Advertising, which originated in antiquity but emerged as a major commercial activity in the 19th century, expanded our social context, yet we still felt in control. After all, you can always go to the bathroom during a television commercial. Subliminal advertising, invented in the 1950s, uses sensory stimuli below an individual's threshold for conscious perception. While there is some controversy about its effectiveness, most people find subliminal advertising offensive, because it robs us of our sense of agency: we are being influenced without our awareness. Indeed, many countries ban subliminal advertising.

The Internet has become a subliminal influence machine. The days when the results of a Google search were ranked by the PageRank algorithm are long gone. Google search results are now customized for each user individually by an opaque algorithm. The argument in favor of such customization is that it is aimed at maximizing user benefit, but it could also be aimed at maximizing advertising revenues. Analogously, the stream of postings on a Facebook user's wall is algorithmically customized, with the goal of "maximizing user engagement." Just as the grains of information we reveal about ourselves result in a heap of information about us, the grains of information that Internet companies give us result in a heap of influence we are not aware of.
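The mechanics of such customization can be sketched in a few lines. This is a deliberately simplified illustration of engagement-driven feed ranking in general, not any company's actual algorithm; the candidate posts, score weights, and field names are all my own assumptions.

```python
# Illustrative candidate posts with model-predicted engagement signals.
candidate_posts = [
    {"id": 1, "predicted_clicks": 0.02, "predicted_shares": 0.001},
    {"id": 2, "predicted_clicks": 0.15, "predicted_shares": 0.040},
    {"id": 3, "predicted_clicks": 0.08, "predicted_shares": 0.010},
]

def engagement_score(post):
    # Hypothetical weights: shares count far more than clicks because
    # they propagate the content onward to other users.
    return post["predicted_clicks"] + 10 * post["predicted_shares"]

# The feed is simply the candidates sorted by predicted engagement,
# so each user sees whatever the model predicts they will react to.
feed = sorted(candidate_posts, key=engagement_score, reverse=True)
print([post["id"] for post in feed])  # → [2, 3, 1]
```

The point of the sketch is that no single ranking decision feels like influence; the influence is the cumulative effect of thousands of such invisible sorting choices.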

Marc Rotenberg raised privacy concerns in U.S. Senate testimonya in 2000, but the U.S. took no action then on Internet privacy. In 2018, Arnold Kling wrote a widely read blog article,b "How the Internet turned bad." The loss of privacy is at the core of that turn. It is time for us, as a community, to ask: "How do we turn the Internet good?"

Follow me on Facebook and Twitter.


Author

Moshe Y. Vardi (vardi@cs.rice.edu) is University Professor and the Karen Ostrum George Distinguished Service Professor in Computational Engineering at Rice University, Houston, TX, USA. He is the former Editor-in-Chief of Communications.


Footnotes

a. https://epic.org/privacy/internet/senate-testimony.html

b. https://hackernoon.com/how-the-internet-turned-bad-bf348cdb99e7


Copyright held by author.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2021 ACM, Inc.


 
