
Software Transparency and Purity

By Pascal Meunier

Communications of the ACM, Vol. 51 No. 2, Page 104


Many software programs contain unadvertised functions that upset users when they discover them. These functions are not bugs, but operations their designers intended to hide from end users. The problem is not new (Trojan horses and Easter eggs were among the earliest instances), but it is increasingly common and a source of many risks. I define software transparency as the condition that all functions of a piece of software are disclosed to users. Transparency is necessary for proper risk management. The term "transparency" should be used instead of "fully disclosed" to avoid confusion with the "full disclosure" of vulnerabilities.

There is a higher standard to be named, because disclosure does not by itself remove objectionable functions. Such functions pose risks while being irrelevant to the software's stated purpose and utility, and foreign to its advertised nature. Freedom from such functions is a property that needs a name: loyalty, nonperfidiousness, fidelity, and purity come to mind, but none seems exactly right. For the purposes of this column, I shall call it purity. "Pure Software" could in theory exist without disclosure, but disclosure would be a strong incentive toward it, as previously discussed by Garfinkel (see www.technologyreview.com/Infotech/13556/?a=f). Purity does not mean free of errors or unchanged since release; pure software can still contain errors or become corrupted. The following examples illustrate some of the risks posed by opaque and impure software.

In 2004, the digital video recording (DVR) equipment maker TiVo was able to tell how many people had paused and rewound to watch Janet Jackson's wardrobe malfunction during the televised Super Bowl. People could opt out of the data collection by making a phone call. The privacy policy, for those who read it, did mention some data collection, but did not disclose its full extent and surprising detail. Very few users would likely have opted in to allow this foreign function.

Software purity as a desirable property is highlighted by some of the differences between the GNU General Public License (GPL) v2 and v3. The changes can be viewed as intended to protect the capability to remove unwanted functionality from software, including firmware based on GPL code (for example, TiVo).

In 2005, the anti-cheating Warden software installed with the World of Warcraft online game was found to snoop inside players' computers. Some players welcome knowing it is there, whereas others find it distasteful but are unable to make a convincing argument that it is malicious spyware. Despite being authorized by the End-User License Agreement (EULA), Warden posed risks that were not made clear, because its objectionable behaviors were undisclosed.

Also in 2005, copy prevention software unexpectedly present on Sony BMG CDs was installed surreptitiously when users attempted to play a CD on their computer. It was later recognized as a rootkit. Ironically, it was reused to attack the Warden.

In 2007, people who had paid for Major League Baseball videos from previous years found they could no longer watch them: the Digital Rights Management (DRM) server providing authorization had been decommissioned without warning. Fragile DRM systems, such as those requiring an available server, are undesirable because of the risks they present while being foreign to the advertised features or content.
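The fragility at issue can be reduced to a design choice. The following is a minimal, hypothetical sketch (not MLB's actual system; the function names are invented for illustration) contrasting a "phone-home" authorization check that fails closed when its server disappears with a variant that falls back on a locally cached entitlement:

```python
# Hypothetical sketch of fragile vs. resilient DRM authorization.
# Neither function reflects any vendor's actual implementation.

def authorize(server_online: bool, license_valid: bool) -> bool:
    """Fragile design: playback requires a live server round-trip."""
    if not server_online:
        # Server decommissioned -> every purchased video becomes unplayable.
        return False
    return license_valid

def authorize_with_local_fallback(server_online: bool,
                                  license_valid: bool,
                                  cached_token: bool) -> bool:
    """Less fragile variant: fall back to a locally cached entitlement."""
    if server_online:
        return license_valid
    return cached_token  # degrade gracefully when the server disappears

# A paying customer with a valid license:
assert authorize(server_online=True, license_valid=True) is True
# After the server is shut down, the fragile check locks the customer out:
assert authorize(server_online=False, license_valid=True) is False
# The fallback design keeps already-purchased content playable:
assert authorize_with_local_fallback(False, True, cached_token=True) is True
```

The point of the sketch is that the customer-facing risk comes entirely from the first design's dependence on a server the vendor may later retire, a dependency foreign to the advertised product.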

Also in 2007, Microsoft Live OneCare surreptitiously changed user settings when installed, enabling automatic updates and re-enabling Windows services that had been disabled on purpose; this behavior is only obscurely documented. Although it was not malicious, it caused many problems for users and system administrators and was vehemently protested. Surreptitious functions pose risks, even when well intentioned.
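A transparent alternative is straightforward to state in code. The following is a minimal sketch (hypothetical; it does not describe OneCare's actual installer) of an installer that discloses each proposed settings change and applies it only with the user's consent:

```python
# Hypothetical installer sketch: disclose every settings change and apply
# only those the user approves, instead of changing them surreptitiously.

def install(settings: dict, changes: dict, consent_fn) -> dict:
    """Apply only user-approved changes; return the changes actually made."""
    applied = {}
    for key, new_value in changes.items():
        # consent_fn receives the setting name, old value, and proposed value,
        # so the user sees exactly what would change.
        if consent_fn(key, settings.get(key), new_value):
            applied[key] = new_value
    settings.update(applied)
    return applied

user_settings = {"automatic_updates": "off"}
proposed = {"automatic_updates": "on"}

# A user who deliberately disabled automatic updates declines the change:
applied = install(user_settings, proposed,
                  consent_fn=lambda key, old, new: False)
assert applied == {}
assert user_settings["automatic_updates"] == "off"
```

Passing `lambda key, old, new: True` reproduces the surreptitious behavior, which makes the design point explicit: the only difference between the two is whether the user is asked.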

Software transparency and purity are often valued but rarely explicitly identified. Beyond the obvious information security risks to users, opaque or impure software also poses business risks in the form of lost reputation, trust, goodwill, sales, and contracts. Transparency alone may be enough for some purposes; others may also require software purity. An explicit requirement of whichever is appropriate would decrease risks.



Pascal Meunier ([email protected]) is a research scientist at Purdue University. His teaching and research include computer security and information assurance.



DOI: http://doi.acm.org/10.1145/1314215.1314232

©2008 ACM  0001-0782/08/0200  $5.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.


