In "The battle over the Institutional Ecosystem in the Digital Environment" (Feb. 2001, p. 84), Yochai Benkler essentially claims that peer-produced information products are at least as valuable as commercially available products. This is not true, as today consumers are overwhelmed by the amount of information freely available on the Internet; much of it is inaccurate, some possibly even harmful. Free Usenet forums are typically of little value, whereas commercial, moderated forums are tightly focused and more useful to participants.
Benkler also argues that property is a hindrance when free information products can be produced. This is also untrue: consumers, acting through market mechanisms, favor the most compatible and useful software packages at the lowest possible cost, including free packages. Producers should not have to give up the rights to their products in order to support free software.
Peer production of information and software is of value in uniting individuals working toward a common goal and enabling an efficient exchange among them. However, it should augment, not dilute, the property rights of commercial producers, so that they continue to produce quality information products.
Ilya Yakovlev
Superior, WI
Yochai Benkler Responds:
Yakovlev makes three points: The first is that amateur production creates a lot of junk, whereas commercial producers create more useful information products. This complaint misses the point. High- and low-quality information is ubiquitous, produced in commercial and noncommercial, property-based and commons-based modes alike. The important insight that free software and the many other high-quality information exchanges on the Net have shoved under our noses is that the best peer-produced information is every bit as good as, if not better than, the best property-based information. This is nothing new; university research has long taught us as much.
Yakovlev’s problem is that in the world of newly possible peer-produced information, we do not yet have perfectly calibrated mechanisms for identifying relevance and accreditation. Here, too, peer production can be useful. Just compare Google to some of the older search engines. While the latter may sell placement (producing "relevance" in response to the market), Google relies on a peer-production principle, incorporating the number of sites linking to a given site to determine its relevance and quality. Peer review on Slashdot similarly suggests a path toward peer production of relevance and accreditation. As these mechanisms evolve, they will solve the problem of telling high from low quality. Commercial origin itself does not do the trick, as tabloids at grocery checkout counters amply demonstrate.
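(The link-counting principle described here can be illustrated with a minimal sketch; the page names and link graph below are hypothetical, and Google’s actual PageRank algorithm goes further, weighting each inbound link by the rank of the page it comes from.)

```python
# Toy illustration of link-based relevance: pages that attract more
# inbound links from their peers rank higher. The graph is hypothetical.
from collections import Counter

# Outbound links: each page lists the pages it links to.
links = {
    "page_a": ["page_b", "page_c"],
    "page_b": ["page_c"],
    "page_c": ["page_a"],
    "page_d": ["page_a", "page_c"],
}

# Count inbound links; each link acts as one peer "vote" of relevance.
inbound = Counter(target for targets in links.values() for target in targets)

# Rank pages by how many peers link to them, most-linked first.
for page, votes in inbound.most_common():
    print(f"{page}: {votes} inbound link(s)")
```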
Yakovlev’s second complaint is that consumers acting through the market favor the most compatible and useful software packages. The "market will take care of it all" argument rarely holds, even in more "normal" markets than the one for high-tech and information goods. It is certainly a leap of faith to make this claim in the context of a market with the quirky economics of tippy markets, network effects, pure public goods, and an all-around lack of information and sophistication among many consumers. In any event, my primary complaint in the article was against a variety of government monopoly rights (called "intellectual property") that actually prevent the emergence of free competition between peer production and industrial production, the very competition that would allow us to see what consumers really prefer.
Yakovlev’s final complaint is that I am suggesting owners must "give up their rights" to protect peer production. However, as my examples in the article suggest, the problem isn’t making owners "give up their rights." The problem is that, faced with competition from nonproprietary production and exchange of information and culture, property-based industrial producers are cranking up the law to prohibit more and more ways of using information efficiently, so as to make nonproprietary production more expensive and difficult to undertake. The new enclosure movement is not about protecting competitive markets, but about preventing the emergence of a competitive threat from a new mode of production that fundamentally alters the old ways of producing information.
Hope for Educational Sea Change
I read Elliot Soloway’s column ("Log on Education," Dec. 2000, p. 15) with much interest. I believe the quotation from Carl Sagan was especially pertinent to elementary education: "If we only teach the findings of science without communicating its critical method, how can the average person possibly distinguish science from pseudoscience?"
In my lifetime, there has been a litany of curricula, from "New Math" to "Whole Language," introduced with much hope and fanfare, but subsequently proving ineffective or even counterproductive. How did the proponents of these programs overlook the scientific method? Where was the careful experiment design, elimination of confounding factors, and attention to data collection and analysis? Did anyone (really) determine that the curriculum was an improvement, and, if so, were the findings repeatable by others? Yes, such studies are expensive, but, as the saying goes, so is ignorance.
I agree that John Glenn is an impressive individual with impeccable leadership credentials, and I fervently hope he is able to make a sea change in education from the current "unsupported assertions" (again quoting Sagan) to curriculum decisions based on firm scientific evidence. There are scientifically proven programs out there, particularly for reading. The results are impressive, at least when compared to current common practice: I have seen the difference both in my own children and in other children who have had the good fortune to come in contact with such programs.
If Glenn succeeds in making this sea change, it will be a crowning achievement and a fitting capstone to a distinguished career in public service. Otherwise, I fear this is yet another commission that does nothing, but does it brilliantly.
Paul McKenney
Beaverton, OR
A Consistent Definition of Information Security
Information security need not be defined as incompletely as Bashir, Serafini, and Wall suggest in their introductory article, "Securing Network Software Applications" (Feb. 2001, p. 29). So far, the experts in the field have done a poor job of defining information security; their attempts are akin to describing modern chemistry in the alchemy framework of fire, water, earth, and air.
The authors define some dimensions and content of information security by quoting the incorrect, incomplete, inconsistent definitions used by the Internet Society (derived from the U.S. Department of Defense, NIST, ISO, OECD, and other sources). They say information security includes authentication, access control, audit trail, confidentiality, integrity, availability, and nonrepudiation. These descriptors are an inconsistent mix of safeguard objectives (authentication of system entities, access control, audit trail, and nonrepudiation) and the secure information attributes of confidentiality, integrity, and availability (CIA).
To be consistent, describe information security either as a set of safeguard objectives or as a set of secure information properties to be preserved; a few incorrectly defined and incomplete items from each set should not be thrown into a stew pot. Many other safeguard objectives are just as important as the four the authors mention, including authentication of system users, classification, segregation of duties, intrusion detection, and nondeception. The CIA information attributes are defined incorrectly, and important security attributes of information are missing, namely authenticity, possession, and utility.
The Internet Society framework of information security quoted by the authors incorrectly defines confidentiality as "not being made available or disclosed to unauthorized individuals, entities, or processes." Dictionaries make clear that confidentiality refers only to knowing information, not its availability. And unauthorized observation is just as important as disclosure, but is not mentioned in the definition. For example, it is possible to steal and use information without violating its confidentiality.
Integrity of information is included, but there’s no mention of security that preserves the validity of information. Information may have integrity but not be valid, and may be valid but lack integrity. Integrity means information is whole but not necessarily valid. Likewise, availability of information is included in the list, but its usefulness is omitted. Protecting the utility of information should be included.
Information security may be simply defined as the preservation of confidentiality and possession, integrity and validity, and availability and utility of information.
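(As a rough illustration only, this definition can be read as a checklist of six attributes to preserve; the class, field names, and stolen-backup example below are hypothetical, not drawn from the article or from any standard.)

```python
# Hypothetical checklist modeling the six attributes in the definition above.
from dataclasses import dataclass, fields

@dataclass
class SecureInformation:
    confidentiality: bool  # only authorized parties may know or observe it
    possession: bool       # only authorized parties hold or control it
    integrity: bool        # the information is whole and unaltered
    validity: bool         # the information is also correct and authentic
    availability: bool     # it can be reached when needed
    utility: bool          # it is in a usable, useful form

    def preserved(self):
        """Names of the attributes still intact for this information."""
        return [f.name for f in fields(self) if getattr(self, f.name)]

# Example: a copy of an encrypted backup is stolen. Confidentiality holds
# (the thief cannot read it), yet exclusive possession has been lost.
stolen_backup = SecureInformation(
    confidentiality=True, possession=False, integrity=True,
    validity=True, availability=True, utility=True,
)
print(stolen_backup.preserved())
```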
Donn B. Parker
Los Altos, CA