The Internet offers vast new opportunities to interact with total strangers. These interactions can be fun, informative, even profitable. But they also involve risk. Is the advice of a self-proclaimed expert at expertcentral.com reliable? Will an unknown dot-com site or eBay seller ship items promptly with appropriate packaging? Will the product be the same one described online?
Prior to the Internet, such questions were answered, in part, through personal and corporate reputations. Vendors provided references, Better Business Bureaus tallied complaints, and past personal experience and person-to-person gossip told you on whom you could rely and on whom you could not. Participants’ standing in their communities, including their roles in church and civic organizations, served as a valuable hostage.
Internet services operate on a vastly larger scale than Main Street and permit virtually anonymous interactions. Nevertheless, reputations still play a major role. Systems are emerging that respect anonymity and operate on the Internet’s scale. A reputation system collects, distributes, and aggregates feedback about participants’ past behavior. Though few producers or consumers of the ratings know one another, these systems help people decide whom to trust, encourage trustworthy behavior, and deter participation by those who are unskilled or dishonest.
For example, consider eBay, the largest person-to-person online auction site, with more than four million auctions active at a time. It provides only limited insurance, so buyers and sellers both accept significant risks. There are problematic transactions, to be sure. Nevertheless, the overall rate of successful transactions remains astonishingly high for a market as "ripe with the possibility of large-scale fraud and deceit" as eBay [5].
eBay attributes this high rate of successful transactions to its reputation system, the Feedback Forum. After a transaction is complete, the buyer and seller have the opportunity to rate each other (1, 0, or −1) and leave comments (such as "good transaction," "nice person to do business with," "would highly recommend"). Running totals of feedback points are attached (visibly) to participants’ screen names, which may be pseudonyms. Yahoo! Auctions, Amazon, and other auction sites feature reputation systems like eBay’s, with variations such as a 1−5 rating scale, several measures (friendliness, prompt response, quality of product), and averaging instead of totaling feedback scores.
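The difference between these two aggregation conventions is easy to state in code. The sketch below is a minimal, hypothetical illustration (not any site’s actual implementation): one function keeps an eBay-style running total of +1/0/−1 ratings, the other averages ratings on a 1−5 scale.

```python
def net_score(ratings):
    """eBay-style total: each rating is +1, 0, or -1."""
    return sum(ratings)

def average_score(ratings):
    """Averaging-style score: each rating is on a 1-5 scale."""
    return sum(ratings) / len(ratings) if ratings else None

# Example: one seller rated by five different buyers under each convention.
ebay_style = net_score([1, 1, 0, 1, -1])       # -> 2
amazon_style = average_score([5, 4, 3, 5, 2])  # -> 3.8
```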
Reputation systems have also spread beyond auction sites. For example, Bizrate.com rates registered retailers by asking consumers to complete a survey form after each purchase. So-called "expert sites" (www.expertcentral.com and www.askme.com) host Q&A forums in which self-proclaimed experts answer questions posted by other users in exchange for reputation points and comments. Product review sites (such as www.epinions.com) offer rating services for product reviewers (the better the review, the more points the reviewer receives). iExchange.com tallies and displays reputations for stock market analysts based on the performance of their picks.
Why are these explicit reputation systems so important for fostering trust among strangers? To answer, it helps to first examine how trust builds naturally in long-term relationships. First, when people interact with one another over time, the history of past interactions informs each about the other’s abilities and dispositions. Second, the expectation of reciprocity or retaliation in future interactions creates an incentive for good behavior. (Political scientist Robert Axelrod calls this the "shadow of the future" [2].) An expectation that people will consider one another’s pasts in future interactions constrains behavior in the present.
Among strangers, trust is understandably much more difficult to build. Strangers lack known past histories or the prospect of future interaction, and they are not subject to a network of informed individuals who would punish bad and reward good behavior. In some sense, a stranger’s good name is not at stake. Given these factors, the temptation to "hit and run" outweighs the incentive to cooperate, since the future casts no shadow.
Reputation systems seek to restore the shadow of the future to each transaction by creating an expectation that other people will look back on it. The connections among such people may be significantly weaker than in transactions on a town’s Main Street, but their numbers are vast in comparison. At eBay, for example, a stream of buyers interacts with the same seller. They may never buy an item from the seller again, but by sharing their opinions about the seller via the Feedback Forum, they construct a meaningful history of the seller. Future buyers, lacking personal histories with particular sellers, may still base their buying decisions on a sufficiently extensive public history. If buyers do behave this way, the sellers’ reputations will affect their future sales. Hence, they seek to accumulate as many positive points and comments as possible and avoid negative feedback. Through the mediation of a reputation system—assuming buyers provide and rely on feedback—isolated interactions take on attributes of long-term relationships. In terms of building trust, a boost in the quantity of information compensates for a significant reduction in its quality.
For people trying to sell off, say, their old LP-record collections, reputation systems might seem like a nuisance. But consider such an effort in a market with no such system, and hence no obvious distinction between sellers in terms of, say, quality of goods and shipping service. Buyers would be reluctant to pay full prices given their uncertainty about the sellers’ quality (such as whether they reveal scratches in the records at the time of sale). High-quality sellers, in turn, would be reluctant to accept discounted prices. Over time, high-quality sellers would desert the market. Eventually, only the lowest-quality sellers would remain, a dynamic economist George Akerlof memorialized as the "market for lemons" [1].
Reputation systems can reverse this flow and "unsqueeze" a bitter lemon. With clear reputation markers, low-quality sellers get lower prices, leaving a healthier market with a variety of prices and quality of service. For example, sellers with stellar reputations may enjoy a premium on their services; some users may be willing to pay for the security and comfort of high-quality services. Such premiums have been observed in auctions of coins and computer chips on eBay [3, 6, 7]. The benefits of informative reputation systems return to buyers and to sellers, enabling the old LPs to spin out the door.
Ratings are not the only way to convey reputations. When agreeing to be rated is optional (such as when registering as a retailer at bizrate.com), doing so is likely an indication of higher-quality services, even before ratings are available. Other ways to indicate quality are to use one’s real name, rather than a pseudonym, and to indicate on a Web site that one also has a physical store with its attendant overhead costs.
To operate effectively, reputation systems require at least three properties (a brief sketch in code follows this list):
- Long-lived entities that inspire an expectation of future interaction;
- Capture and distribution of feedback about current interactions (such information must be visible in the future); and
- Use of feedback to guide trust decisions.
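Under the simplifying assumptions of +1/0/−1 ratings and a fixed trust threshold (both hypothetical choices, not drawn from any particular site), a skeletal sketch of how the three properties fit together might look like this:

```python
from collections import defaultdict

class ReputationSystem:
    """Minimal sketch: long-lived identities, captured feedback, feedback-driven trust."""

    def __init__(self):
        # Property 1: feedback accrues to long-lived identities (even pseudonyms),
        # creating an expectation of future interaction under the same name.
        self.history = defaultdict(list)

    def record_feedback(self, rated_id, rating, comment=""):
        # Property 2: capture feedback about the current interaction and keep it
        # visible for future participants.
        self.history[rated_id].append((rating, comment))

    def should_trust(self, rated_id, threshold=5):
        # Property 3: consult accumulated feedback when deciding whom to trust.
        net = sum(rating for rating, _ in self.history[rated_id])
        return net >= threshold
```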
In the offline world, capturing and distributing feedback is costly. Businesses often collect feedback from consumers but tend not to publicize the complaints. A few independent services, such as Zagat’s for restaurants and Consumer Reports magazine for appliance repair histories, systematically capture and disseminate feedback. For the most part, however, reputations travel haphazardly through word of mouth, rumor, or the mass media.
The Internet can vastly accelerate and add structure to the process of capturing and distributing information. To post feedback, users need only fill out an online form; a mere mouse click is often enough. Where interactions are mediated electronically, objective information about performance may be captured automatically (such as delay from question to response at an expertise site). The same technology facilitating market-style interaction among strangers also facilitates the sharing of reputations that maintain trust.
Despite this promise, significant challenges remain in the operating phases of such systems: eliciting, distributing, and aggregating feedback.
Eliciting feedback encounters three related problems. The first is that people may not bother to provide feedback at all. For example, when a trade is completed at eBay, there is little incentive to spend another few minutes filling out a form. That many people do so is a testament to their community spirit, or perhaps their gratitude or desire to exact revenge. People could be paid for providing feedback, but more refined schemes, such as paying on the basis of concurrence with future evaluations by others, would be required to ensure that their evaluations are thorough.
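One crude way to realize the "concurrence with future evaluations" idea, offered here only as a hypothetical sketch rather than a scheme any site actually uses, is to pay a rater only when their rating later agrees with the majority view of the same seller:

```python
def concurrence_payment(rating, later_ratings, reward=1.0):
    """Pay a rater only if their rating matches the majority of later ratings
    of the same seller; otherwise pay nothing."""
    if not later_ratings:
        return 0.0
    majority = max(set(later_ratings), key=later_ratings.count)
    return reward if rating == majority else 0.0

# A buyer who reported -1 on a seller most later buyers also rated -1 gets paid;
# a rating that contradicts the eventual consensus earns nothing.
payment = concurrence_payment(-1, [-1, -1, 1, -1])  # -> 1.0
```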
Second, it is especially difficult to elicit negative feedback. For example, at eBay, it is common practice to negotiate first before resorting to negative feedback. Therefore, only really bad performance is reported. Even then, fear of retaliatory negative feedback or simply a desire to avoid further unpleasant interactions may keep a dissatisfied buyer quiet. In the end, because information about patterns of moderate discontent may remain invisible, buyers cannot shun the sellers who foster such discontent.
Third is the difficulty of ensuring honest reports. One party could blackmail another, threatening to post negative feedback unrelated to actual performance. At the other extreme, in order to accumulate positive feedback, a group of sellers might collaborate and rate one another positively, artificially inflating their individual reputations.
Distributing feedback, the second phase, poses its own challenges. One is name changes. At many sites, people choose pseudonyms when registering. If they register again, they might choose another pseudonym, effectively erasing prior feedback. Reputations can still have effects, since newcomers want to accrue positive feedback, and those with established reputations want to avoid negative feedback. Game-theory analysis demonstrates that there are inherent limitations to the effectiveness of reputation systems when participants are allowed to start over with new names [4]. In particular, newcomers (those with no feedback) should always be distrusted until they have somehow paid their dues, either through an entry fee or by accepting more risk or worse prices while developing their reputations. An alternative is to prevent name changes altogether, either by using real names or by preventing people from acquiring multiple pseudonyms, a technique called "once-in-a-lifetime pseudonyms" [4].
A second difficulty in distributing feedback stems from the lack of portability from system to system. Amazon.com initially allowed users to import their ratings from eBay. But when eBay protested vigorously, claiming its user ratings were proprietary, Amazon discontinued its rating-import service. Limited distribution of feedback limits its effectiveness; the future casts a shadow on only a single online arena, not on many. Efforts are under way to construct a more universal framework. For example, virtualfeedback.com provides a rating service for users across different systems, but it has yet to gain wide public acceptance.
Finally, there is the difficulty of aggregating and displaying feedback so that it usefully informs future decisions about whom to trust. eBay displays net feedback (positives minus negatives); other sites, including Amazon.com, display an average. These simple numerical ratings fail to convey important subtleties of online interactions; for example, Did the feedback come from low-value transactions? What were the reputations of the people providing the feedback?
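One direction an aggregation scheme could take, sketched here hypothetically rather than as a description of any existing site, is to weight each rating by the transaction’s value and by the reputation of the person who provided it:

```python
def weighted_score(feedback):
    """Weight each rating by transaction value and by the rater's own reputation,
    two subtleties that simple totals and averages miss."""
    total = weight_sum = 0.0
    for rating, transaction_value, rater_reputation in feedback:
        w = transaction_value * rater_reputation
        total += w * rating
        weight_sum += w
    return total / weight_sum if weight_sum else None

# A +1 from a reputable buyer on a $200 sale outweighs a +1 from an unknown
# newcomer on a $5 sale.
score = weighted_score([(+1, 200.0, 0.9), (+1, 5.0, 0.1), (-1, 50.0, 0.5)])
```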
As a solution to the ubiquitous problem of trust in new short-term relationships on the Internet, reputation systems have immediate appeal; the participants themselves create a safe community. Unfortunately, these systems face complex challenges, many of which yield no easy solutions. Efforts are under way to address these problems; for example, the Reputations Research Network (see databases.si.umich.edu/reputations) represents a first step toward recognizing reputation systems as a subject of study and as a vital asset for the safety of online interaction environments.
Internet-based reputation systems, like traditional markets, aggregate vast amounts of information, which then significantly influences choices made by businesses, as well as by individuals. The parallel may end there. The theoretical underpinnings of the effective operation of markets are well understood, and the aggregation to a brief set of statistics, namely a single price for each item, proceeds automatically.
Today’s reputation systems, by contrast, shouldn’t work in theory. Individuals shouldn’t be expected to make the effort to provide evaluations; negative evaluations should be avoided completely; and vendors should be expected to develop sophisticated ways to manipulate and trick the system. Even if all reporting were complete and honest, users would find it virtually impossible to utilize the torrents of information they receive on other participants, given the lack of satisfactory summary statistics.
Despite their theoretical and practical difficulties, reputation systems appear, reassuringly, to perform reasonably well. Systems that rely on the participation of large numbers of individuals accumulate trust simply by operating effectively over time. Already, Internet-based reputation systems perform commercial alchemy. On auction sites, for example, they enable trash to be shuttled across the country and in the process transmuted into treasures.
We conclude with an allusion to democracy, another theoretically flawed and practically challenged system that nonetheless appears to perform miracles. Were Winston Churchill, the World War II-era British prime minister, to comment on reputation systems and building trust as he did on democracy and government, he might say: "Reputation systems are the worst way of building trust on the Internet, except for all those other ways that have been tried from time to time."