Viewpoint

Information: ‘I’ vs. ‘We’ vs. ‘They’

Seeking a balance between protecting and using personal data.
[Figure: a three-sided figure with sides labeled 'I,' 'we,' and 'they.']

In his May 2021 Vienna Gödel Lecture, Moshe Vardi made the case for a moral "trilemma."5 Paraphrasing Vardi, the trilemma concerns the increasing power of online content, including the power to seriously endanger the very foundations of a society, and the question of who should control that power. Should it be private industry, as providers of the platforms; the government, representing the public; or nobody at all, following a constitutional imperative of free speech? We are just beginning to find answers to these questions, and different societies are taking different approaches.

Here, I present another "trilemma," related in that it is also about information sovereignty, but different in that it concerns what we usually call "personal data." To consider a current example, a bit representing whether a person is COVID-infected is certainly of a rather personal nature. But that does not yet answer whose data it is, or who should get to decide whose data it is. Is it my data, not to be shared with anybody unless I explicitly give my consent? Or is it our data, because as a society we need this information, at least in some anonymized or aggregated form that is still precise enough to be truly useful, to treat a pandemic effectively? And do they, who effectively decide such questions, decide as I or we would—if we were asked?

'I' vs. 'We'—The Cultural Context

The answers to the preceding questions may depend on the cultural context. According to a recent panel on how Asian countries have handled the pandemic, Taiwan has long had effective laws for protecting personal data, especially in the medical field, yet there apparently is a societal agreement that data is systematically aggregated and evaluated.1 In contrast, it was difficult in Germany to introduce, for example, compulsory COVID tests in schools.

Along similar lines, consider the discussion around the introduction of a national COVID warning app. One horror scenario initially circulated was that the cellphone would sound an alarm as soon as one met an infected person. A horror scenario not because it would indicate imminent personal danger for a contact person, but because the infected would then be robbed of their anonymity. Of course, the Corona app was designed differently from the start, namely in such a way that if someone tests positive, others who were in contact with that person in the past are informed. One should not be meeting people known to be infected anyway, because they should be in quarantine.
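
For the technically inclined reader, the key idea that reconciles such warnings with anonymity is that phones exchange short-lived random identifiers over Bluetooth, and all matching happens on the user's own device; the server only ever sees the anonymous identifiers of those who report a positive test. The following minimal Python sketch illustrates this decentralized design in spirit; all names in it are invented for illustration and are not the actual API of any app.

```python
import secrets

# Minimal sketch of decentralized exposure notification, in the spirit of
# the designs such apps adopted. Invented names; not any app's actual API.

def new_rolling_id() -> bytes:
    """Phones broadcast short-lived random identifiers over Bluetooth."""
    return secrets.token_bytes(16)

class Phone:
    def __init__(self) -> None:
        self.own_ids: list[bytes] = []      # identifiers this phone broadcast
        self.heard_ids: set[bytes] = set()  # identifiers observed nearby, kept locally

    def broadcast(self) -> bytes:
        rid = new_rolling_id()
        self.own_ids.append(rid)
        return rid

    def observe(self, rid: bytes) -> None:
        self.heard_ids.add(rid)

def report_positive(phone: Phone, server: list[bytes]) -> None:
    """On a positive test, only the phone's own random identifiers are uploaded."""
    server.extend(phone.own_ids)

def check_exposure(phone: Phone, server: list[bytes]) -> bool:
    """Matching happens on-device; the server never learns who met whom."""
    return any(rid in phone.heard_ids for rid in server)

# Alice and Bob meet; later, Alice tests positive. Bob's phone warns him,
# yet Alice appears to the server only as a set of random identifiers.
alice, bob, server = Phone(), Phone(), []
bob.observe(alice.broadcast())
report_positive(alice, server)
print(check_exposure(bob, server))  # True
```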

'They'

Quoting from a (translated) article from May 25, 2021, in the online education magazine News4Teachers3: "The data protection officer of Baden-Württemberg warns against using Microsoft products in schools—for fundamental reasons. If the state government follows his line, all software from U.S. corporations would have to be banned from German educational institutions. However, there is now resistance to such IT fundamentalism: school administrators see the work of their teaching staffs as being threatened. A petition on the Net calls for more pragmatism in data protection."


A subsequent survey on that case revealed that among the 5,000+ participants, approximately 75% were in favor of keeping the Microsoft products; among the 3,000+ teachers, the share was even 82%. It remains to be seen whether "our" fairly clear public will can trump "their" objections, that is, those of the data protection officials.

Moving to university life, at the end of the last introductory programming class, my team and the students got together for post-exam socializing. At a late hour, reflecting on life in general and social interactions during the last semester in particular, students noted the discrepancy between the "official" infrastructure—an open source-based, self-hosted system lauded for its privacy protection (Mattermost)—and where the students "really hung out" (Discord). I found this rather regrettable, not least because "they," in this case our department jointly with the student council, had invested considerable effort and tax money to provide Mattermost. If we were to do it over again, I would prefer to start with a survey of the actual users.

To be clear, I agree with data protection as a fundamental right. I vividly recall how, in the early 1980s, "we" in Germany fought for this, driven by concerns that the census planned by "them" would lead to a "transparent citizen." The massive public protests ultimately led to the Federal Constitutional Court's ruling of December 1983, the so-called "census ruling," which established "the right to informational self-determination." The then-modified census, ultimately conducted in 1987, is remembered publicly mainly for that court ruling. A lesser-known fact is that the census also led to policy decisions such as housing programs, after it revealed that Germany had approximately one million fewer apartments than previously assumed.

Like other societal achievements—for example, democracy—the right to informational self-determination is not achieved once and for all, but requires continuous effort to stay in place. This effort includes some education about the issues, perhaps also of the aforementioned teachers who widely reject the data protection officer's verdict. However, as the protesting teachers remind us (or "them"), the fundamental right to informational self-determination does not make other fundamental rights, such as the right to a proper education, disappear. Simply claiming that "they" know better than those on duty on the ground can only lead to frustration; what is required is an honest debate on what the issues and alternatives are.

The Cultural Context Revisited

The German Wikipedia page on Discord states (translated): "Discord's data protection regulations grant the company the most comprehensive rights to transmit all chats, messages, and other data unencrypted, to collect them and to process them into data for sale. By agreeing to the terms of use, Discord is expressly allowed to track and save information from direct messages or sent images and voice chats. In addition, all data can be transferred to American servers. Discord therefore does not guarantee the protection of personal data prescribed in the GDPR and is therefore not GDPR-compliant. Although Discord is also free in the commercial environment, it can therefore hardly be used there within the E.U."

Admittedly, when reading this, one can get a bit queasy, especially as a lecturer who must decide which platform to offer officially. That the data in question should be no more sensitive than chat inquiries asking for assistance on a particular assignment, and that most of the students are on Discord anyway, does not matter much. Thus, I fully understand the colleague who was initially enthusiastic about Discord after seeing a short demo and wanted to use it, but refrained from doing so after reading this Wikipedia text.

For the introductory programming class, we decided to offer both Mattermost and Discord and to let the students "vote with their feet." We also included a link to the Wikipedia text on Discord and cautioned the students to think about what they should and should not post on Discord, not only with regard to our course. The vote was clear: inquiries on Mattermost started as a trickle and died out completely about a month after the semester had started; Discord, on the other hand, was buzzing from start to finish. In the evaluations at the end of the semester, Discord was named first among the positive aspects. And concerning privacy, I actually consider the situation on our Mattermost instance, where every user participates under their real name and can retrieve the complete list of participants with names and email addresses at the push of a button, more privacy-critical than the situation on Discord, where participants appear under self-chosen pseudonyms and I usually do not even know whom I am communicating with. In fact, that pseudonymity may contribute to the ease of asking even "dumb" questions, which I consider a good thing, in particular for an introductory class.


Back to Wikipedia. The English-language page on Discord devotes several pages' worth of material under "Controversies" to abuse in chats, use by the alt-right (which, ironically, is attributed to the "pseudonymity and privacy offered by Discord"), pornographic content, and so forth. There is not a word about the data protection-related points criticized on the German page.

Clearly, this is rather anecdotal evidence, and privacy awareness may be rising in English-speaking countries as well. The bottom line, however, is that assessments concerning privacy can hardly be completely neutral and may be subject to some cultural bias as well. One is not spared the effort of thinking and deliberating for oneself, preferably based on sources that try to get to the bottom of things, and typically there is a continuum of options to be considered. Personally, for example, I found the online article by Stephan Hansen-Oest, a lawyer specializing in data protection, to be a useful guide on the legal aspects of using Zoom in the classroom (continually updated, 11,000 words and counting; while it is only available in German, an automatic translation is fairly accurate).2 Tellingly, his verdict is not a clear yes/no, but rather an "it depends."

Conclusion

Returning to the questions posed at the beginning of this Viewpoint, I argue that it is not (only) my own business whether I am infected, but that we as a society should be able to use this data as effectively as possible. And just as it is generally considered good citizenship to pay taxes, give to charities, and donate organs, one should also consider opportunities to share some personal data for the common cause. A laudable step in that direction is that the aforementioned Corona app now includes a "data donation" option to do just that.

Clearly, privacy is important, and by default, neither the state, our neighbors, nor big tech should know more about us than what we willingly provide. Our right to informational self-determination, hard-fought in the past, should not be compromised too easily. But if any decision involving data is driven solely by potential privacy threats, and other concerns do not even enter the picture, we do have a problem. Ultimately, this problem comes back to us as individuals, for example by affecting our personal health. As illustrated by the examples in this Viewpoint, it may be instructive to look across societies and to consider different approaches.

Now, I do hold the optimistic conviction that in most circumstances, with a bit of ingenuity provided by our profession, there are actually ways to protect and use our personal data at the same time; consider, for example, recent developments in privacy-preserving analytics.4 That is, we can protect our individual integrity and anonymity, yet still contribute to the aggregate knowledge needed, for example, to find out how many infections took place last week in indoor-dining locations in Kiel. But it is my impression that, sadly, we often do not even get to think about solutions to that kind of problem because its raw input consists of personal data, and the discussion stops before it even begins.
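
To make that possibility concrete, consider randomized response, one classic building block of privacy-preserving analytics (chosen here purely for illustration; it is not necessarily the technique of the cited work). Each respondent answers truthfully only with a certain probability and otherwise flips a coin, so no single answer reveals anything reliable about its giver, yet the aggregate count can still be estimated. The following minimal Python sketch uses an invented scenario and invented numbers.

```python
import random

# Randomized response: a sketch of privacy-preserving counting.
# Scenario and numbers are invented for illustration.

def randomized_answer(truth: bool, p: float = 0.75) -> bool:
    """Answer truthfully with probability p; otherwise flip a fair coin."""
    if random.random() < p:
        return truth
    return random.random() < 0.5

def estimate_true_count(answers: list[bool], p: float = 0.75) -> float:
    """Invert the noise: E[yes] = p * true_count + (1 - p)/2 * n."""
    n, yes = len(answers), sum(answers)
    return (yes - (1 - p) / 2 * n) / p

# Hypothetical survey: 1,000 indoor-dining guests, 30 of them infected.
truths = [i < 30 for i in range(1000)]
answers = [randomized_answer(t) for t in truths]
print(round(estimate_true_count(answers)))  # close to 30, on average
```

The point is not this particular mechanism, but that "protect" and "use" need not be mutually exclusive.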

Thus, my appeal to representatives and decision makers: think about what the interests really are that you should represent, and perhaps even ask the groups involved. Protecting data while losing sight of other objectives is likely to lead to suboptimal results, including with regard to public support for the cause. Sometimes be brave enough to do what you consider the right thing, not always what political or social lines of thought appear to dictate. And of course, when developing applications involving personal data, invest appropriate effort to protect the data and to communicate clearly to decision makers and users how the data are protected.

In the end, and here I agree with Vardi's lecture again, it is about trust. Do we trust each other, and those who are in charge, to handle our data responsibly? An unconditional "yes" requires a lot of faith, and I would certainly agree we had better put in some checks and balances. But an unconditional "no," which appears to be the current answer, seems like a dead end in the long run.

    1. Damm, J., Habich-Sobiegalla, S., and Lee, C-y. China's and Taiwan's Pandemic Response Compared: The European Perspective. Roundtable Discussion, Kiel University China Center (Mar. 24, 2021).

    2. Hansen-Oest, S. Hilfe… ist „Zoom" etwa eine Datenschleuder?; https://bit.ly/3qyIcz8

    3. News4Teachers: Streit um Microsoft und Co: Datenschutzbeauftragter stellt US-Software für Schulen infrage—Petition fordert Pragmatismus. (May 25, 2021); https://bit.ly/3NjkDEb

    4. Sharma, S., Chen, K., and Sheth, A. Toward practical privacy-preserving analytics for IoT and cloud-based healthcare systems. IEEE Internet Computing 22, 2 (Mar./Apr. 2018).

    5. Vardi, M.Y. Technology Is Driving the Future, But Who Is Steering? 8th Vienna Gödel Lecture (May 27, 2021); https://bit.ly/3KKJkYe
