Computing Profession

Privacy As… Sharing More Information?

Carnegie Mellon Associate Professor Jason Hong

When I first started working in the area of personal privacy, I had what I would call a conventional view of privacy: how to minimize the flow of information going out about a person. I reasoned that people were private individuals who wanted protection from interruptions, and who wanted a lot of control over and feedback about what was shared.

My first shift in thinking came from Erving Goffman’s book The Presentation of Self in Everyday Life. Goffman was a sociologist who examined all the subtle ways that people present themselves to others, and how we project different personas to different people in different situations. For example, I project my "professor" persona to students (in part because it is a role I am playing in that aspect of my life), and fluidly shift to a more laid-back and casual persona when I’m with my close friends. As such, rather than simply viewing privacy as something manageable through a set of pre-defined rules, we might also view privacy as impression management: making it so that people see you the way you want them to see you.

This perspective of privacy as impression management has been a recent push in the human-computer interaction community, most prominently forwarded by a really great paper on privacy as boundary negotiation by Leysia Palen and Paul Dourish.

My most recent shift in thinking came from my observations and experiences with online social networking sites like Facebook and Twitter. Specifically, how do we make systems so that we can share more information with others in a way that is still safe?

While this perspective may sound counter-intuitive at first, I changed my views after seeing the potential benefits of sharing more information. If people shared more information (about their activities, location, and schedule, for example), they could get better coordination with others, a stronger sense of connection, more conversation starters ("I saw you were in Paris recently, how was it?"), more potential for opportunistic encounters, and easier okayness checking (making sure a loved one got home without having to disturb them, for example). Online social networks really opened my eyes to the potential of what can happen when more people share more information.

Now, this isn’t to say that there aren’t risks. We’ve all seen reports about people losing jobs due to pictures or content they posted, identity theft, risks in identity-based challenge-response fallback questions, social phishing and social spam attacks, stalkers and child predators, overly intrusive sharing systems, and outright embarrassment.

What I am saying is that, rather than just viewing privacy as not sharing information with others, or viewing privacy as projecting a desired persona, we should also consider how to make systems so that people can safely share more information and get the associated benefits from doing so. In other words, these three views aren’t mutually exclusive, and provide a useful lens by which we can think about and design systems that share personal information with others.

There are many dimensions here in this design space. We can change what is shared, how it is shared, when something is shared, and who it is shared with. One key challenge here is in balancing privacy, utility, and the overhead for end-users in setting up these policies. Another key challenge is understanding how to help people change these policies over time to adapt to people’s needs. These are issues I’ll discuss in future blog postings.
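To make these dimensions concrete, here is a minimal sketch of how a sharing policy might be modeled as an explicit object. The class and field names are hypothetical illustrations of mine, not taken from any real system:

```python
from dataclasses import dataclass

@dataclass
class SharingPolicy:
    """Sketch of the design dimensions above: what is shared,
    at what granularity (how), when, and with whom."""
    what: str                      # e.g. "location"
    granularity: str = "city"      # how: "exact", "neighborhood", "city"
    active_hours: tuple = (9, 17)  # when: share only during these hours
    audience: frozenset = frozenset({"close friends"})  # who

    def allows(self, hour: int, viewer_group: str) -> bool:
        """Is sharing permitted for this viewer at this hour?"""
        start, end = self.active_hours
        return start <= hour < end and viewer_group in self.audience

# Example: share city-level location with close friends during work hours.
policy = SharingPolicy(what="location")
print(policy.allows(10, "close friends"))  # True
print(policy.allows(22, "close friends"))  # False (outside active hours)
```

Even this toy version hints at the trade-off described above: every additional dimension gives users finer control, but also adds to the overhead of setting up and maintaining these policies.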

For me, a particularly intriguing way of thinking here is safe staging, an idea that Alma Whitten brought to the attention of security specialists in her seminal paper Why Johnny Can’t Encrypt. The basic idea is that people progressively get more powerful tools as they become more comfortable with a system, but are kept in a safe state as much as possible as they learn how to use the system. A real-world example here would be training wheels on a bicycle. For systems that provide any level of awareness, the defaults might be set, for example, so that at first, only close friends and family see anything, while over time people can easily share more information as they understand how the system works and how to control things.
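The staged-defaults idea in the paragraph above can be sketched in a few lines. This is my own illustrative example, not code from any actual system; the stage names and audiences are assumptions:

```python
# Sketch of "safe staging" for a sharing system: users start in a
# conservative default stage and unlock broader sharing as they
# become comfortable with the system. Stage names are illustrative.

STAGES = [
    {"name": "starting out", "audience": {"family", "close friends"}},
    {"name": "comfortable",  "audience": {"family", "close friends", "friends"}},
    {"name": "experienced",  "audience": {"family", "close friends", "friends",
                                          "coworkers"}},
]

def audience_for(stage_index: int) -> set:
    """Return who can see shared information at a given stage.
    Out-of-range indices are clamped, so users always land in a
    valid (safe) state."""
    stage_index = max(0, min(stage_index, len(STAGES) - 1))
    return STAGES[stage_index]["audience"]

# At first, only family and close friends see anything.
print(sorted(audience_for(0)))
```

The key design choice, mirroring the training-wheels analogy, is that the safe state is the default and broadening the audience is always an explicit, user-initiated step.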
