Industrial-size data spills, leaks about large-scale secret surveillance programs, and personal tragedies due to inappropriate flows of information are guaranteed to have at least one consequence: engineers will be increasingly expected to integrate "privacy solutions" into the systems they are building or maintaining. Yet the task of engineering systems to address privacy concerns can be bewilderingly complex.
The seeming unwieldiness of the engineering task is evident in the concept of privacy itself and in how that concept is negotiated. As a legal concept, privacy is defined rather vaguely. That vagueness, some argue, is part of its protective function: the open-ended definition allows people to invoke privacy to protect their personal lives and autonomy from intrusions by others, including the state that endows them with citizenship rights and runs surveillance programs. The European Data Protection Directive (DPD) and the Fair Information Practice Principles (FIPPs), on the other hand, prescribe procedural measures, such as notice and choice, data-retention limitation, and subject access rights. These principles are seen as instrumental to making organizations' collection and processing activities transparent. Although less ambiguous, data-protection principles still need to be translated into technical requirements and are vulnerable to narrow interpretations. Moreover, FIPPs fall short of mitigating all the privacy concerns users may have toward a given organization, and they do not address the privacy concerns users may have with respect to other users, to people in their social environments, and toward the greater public.
The following letter was published in the Letters to the Editor in the October 2014 CACM (http://cacm.acm.org/magazines/2014/10/178772).
Just before reading Seda Gurses's Viewpoint "Can You Engineer Privacy?" (Aug. 2014), I had been reading the latest on hacking car control units by manipulating the software controlling the car, especially the engine, the steering wheel, and other car components,(1) pondering the need for a new approach to security and privacy.
Why are intruders so successful? For one thing, computer science and engineering often simplify attacks: appliances and application systems use standardized, generalized algorithms, protocols, and component systems. These concepts also underpin the software industry's ability to quickly develop new systems that remain open for further development, but they likewise enable intruders to create generalized tools for unwelcome manipulation.
What new paradigm of computer science would allow software developers to improve system security and personal privacy? How about one that is application-specific; employs nonstandard protocols and address schemes (such as on LANs in cars); and eliminates concepts like algorithms and data structures "reserved for future use," as well as algorithms more general than an application needs ("upward compatibility"), as with a companywide hardware and software platform in car computers? (This is not to say I advocate handcrafting all future secure systems.)
Rather than make it easy for would-be intruders to develop generalized tools, application engineers should look to develop standardization variation generators, or SVGs, to create strategic complexity specific to families of applications or even to individual appliances. In the case of cars, such a generator must be able to produce a specific protocol for communication among sensors, steering actuators, and control processors, even though that protocol is derived from a general class of protocols. Dynamic solutions, like protocol variations that depend on car-key identification, are especially promising, not as substitutes for encryption and information hiding but as another self-contained obstacle to foil intruders.
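The key-dependent protocol variation described above can be sketched in a few lines. This is a minimal, hypothetical illustration, not an implementation from the letter: all names (the opcode table, the key identifiers) are invented for the example. A shared secret, here a car-key ID, deterministically seeds a permutation of message opcodes, so each vehicle speaks its own dialect of a common protocol family while legitimate endpoints holding the same key derive the same dialect.

```python
# Hypothetical sketch of key-dependent protocol variation.
# A car-key ID seeds a deterministic shuffle of opcode values,
# yielding a per-vehicle protocol dialect.
import hashlib
import random

# Illustrative base protocol: message names mapped to wire opcodes.
BASE_OPCODES = {"sensor_read": 0x01, "steer_set": 0x02, "brake_set": 0x03}

def derive_opcode_map(key_id: bytes) -> dict:
    """Permute the opcode values deterministically from the key ID."""
    # Hash the key ID to a 64-bit seed, then shuffle with a local RNG
    # so the derivation is reproducible for the same key.
    seed = int.from_bytes(hashlib.sha256(key_id).digest()[:8], "big")
    rng = random.Random(seed)
    values = list(BASE_OPCODES.values())
    rng.shuffle(values)
    return dict(zip(BASE_OPCODES.keys(), values))

# The same key always yields the same dialect; different keys
# generally yield different ones. This is an added obstacle for an
# intruder's generalized tools, not a replacement for encryption.
map_a = derive_opcode_map(b"key-of-car-A")
assert map_a == derive_opcode_map(b"key-of-car-A")
```

With only three opcodes the variation space is tiny; a realistic generator would vary richer parameters (field ordering, address schemes, timing) across a much larger space, in line with the letter's point that the variation is one obstacle among several.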
Privacy and security can be engineered, even in highly sensitive systems, but such engineering works only if application-system architects and software developers view computer security as the predominant architecture on which applications are developed, not just as added functionality.
Georg E. Schaefer
(1) Pauli, D. Students hack Tesla Model S, make all its doors pop open in motion. The Register (July 21, 2014).