Opinion
Inside Risks

Software Transparency and Purity

Many software programs contain unadvertised functions that upset users when they discover them. These functions are not bugs, but operations their designers intended to hide from end users. The problem is not new; Trojan horses and Easter eggs were among the earliest instances. It is, however, increasingly common and a source of many risks. I define software transparency as the condition that all functions of the software are disclosed to users. Transparency is necessary for proper risk management. The term "transparency" is preferable to "fully disclosed," which invites confusion with the "full disclosure" of vulnerabilities.

There is a higher standard to be named, because disclosure does not by itself remove objectionable functions. Such functions pose risks while being irrelevant to the software's stated purpose and utility, and foreign to its advertised nature. Freedom from them is a property that needs a name: loyalty, nonperfidiousness, fidelity, and purity come to mind, but none of them seems exactly right. For the purposes of this column, I shall call it purity. Pure software could in theory exist without disclosure, but disclosure would be a strong incentive toward purity, as previously discussed by Garfinkel (see www.technologyreview.com/Infotech/13556/?a=f). Purity does not mean free of errors or unchanged since release; it is possible for pure software to contain errors or to be corrupted. The following examples illustrate some of the risks of opaque and impure software.

In 2004, the digital video recorder (DVR) maker TiVo was able to tell how many people had paused and rewound to watch Janet Jackson's wardrobe malfunction during the televised Super Bowl. Users could opt out of the data collection only by making a phone call. The privacy policy, for those who read it, did mention some data collection, but did not disclose its full extent and surprising detail. Very few users would likely have opted in to such a foreign function.

Software purity as a desirable property is highlighted by some of the differences between version 2 and version 3 of the GNU General Public License (GPL). The changes can be viewed as intended to protect users' ability to remove unwanted functionality from software, including firmware based on GPL code (TiVo's, for example).

In 2005, the anti-cheating Warden software installed with the World of Warcraft online game was found to snoop inside players' computers. Some people are glad to know it is there, whereas others find it distasteful but are unable to make a convincing argument that it is malicious spyware. Despite being authorized by the End-User License Agreement (EULA), its undisclosed, objectionable behaviors pose risks that were never made clear.

Also in 2005, copy-prevention software unexpectedly present on Sony BMG music CDs was installed surreptitiously when users attempted to play a CD on their computers. It was later recognized as a rootkit. Ironically, its cloaking was then reused by game cheaters to hide from the Warden.

In 2007, people who had paid for Major League Baseball videos from previous seasons found they could no longer watch them: the Digital Rights Management (DRM) server that provided authorization had been decommissioned without warning. Fragile DRM systems, such as those requiring an available server, are undesirable because of the risks they present while being foreign to the advertised features or content.
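As a minimal sketch of why such designs are fragile (the server address, endpoint, and identifiers below are hypothetical, not taken from any actual DRM system), consider a client-side playback check that depends entirely on a remote authorization server. Once that server is decommissioned, legitimately purchased content becomes unplayable:

    import urllib.request
    import urllib.error

    # Hypothetical authorization endpoint; a fragile DRM client cannot
    # play content without a positive answer from this server.
    LICENSE_SERVER = "https://drm.example.com/authorize"

    def can_play(content_id: str, license_key: str) -> bool:
        """Ask the remote server whether this license permits playback."""
        url = f"{LICENSE_SERVER}?content={content_id}&key={license_key}"
        try:
            with urllib.request.urlopen(url, timeout=5) as response:
                return response.status == 200
        except urllib.error.URLError:
            # Server decommissioned or unreachable: the check fails
            # closed, and paid-for content can no longer be viewed.
            return False

    if can_play("video-2003-playoffs", "buyer-license-key"):
        print("playing video...")
    else:
        print("playback denied: authorization server unavailable")

Nothing in the content itself is damaged; playback fails solely because an external dependency, foreign to the product the customer believed they bought, has disappeared.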

Also in 2007, Microsoft Live OneCare surreptitiously changed user settings during installation, enabling automatic updates and re-enabling Windows services that had been disabled on purpose; this behavior was documented only obscurely. Although it was not malicious, it caused many problems for users and system administrators and was vehemently protested. Surreptitious functions pose risks even when well intentioned.

Software transparency and purity are often valued but seldom explicitly identified. Beyond the obvious information-security risks to users, opaque or impure software also poses business risks in the form of lost reputation, trust, goodwill, sales, and contracts. Transparency alone may be enough for some purposes; others may also require purity. Explicitly requiring whichever is appropriate would decrease risks.
