
Regulating the Information Gatekeepers

Concerns about biased manipulation of search results may require intervention involving government regulation.

In 2003, 2bigfeet, an Internet business specializing in the sale of oversize shoes, ranked among the top results in Google searches for its products. Its prime location on the virtual equivalent of New York’s high-end shopping mecca Fifth Avenue brought a steady stream of clicks and revenue. But success was fleeting: That November, Google’s engineers modified their search engine’s algorithms, an update later dubbed “Florida” by the search-engine community. 2bigfeet’s rankings dropped abruptly just before the Christmas selling season, and this Internet success story was suddenly on the brink of bankruptcy.2

Search engines have established themselves as critical gatekeepers of information. However, despite an increasingly monopolistic Internet search market, they and the implicit filtering process in their rankings remain largely beyond public scrutiny and control. This has inspired us to explore an increasingly topical question: Should search-engine ranking be regulated?

Search engines generally work by indexing the Web through so-called crawler programs. When a user types in a request, search algorithms determine the most relevant results in the index. Although the precise workings of these algorithms are kept at least as secret as Coca-Cola’s formula, they are usually based on two main functions: keyword analysis (for evaluating pages along such dimensions as frequency of specific words) and link analysis (based on the number of times a page is linked to from other sites and the rank of these other sites) (see Figure 1).
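
As a rough illustration of these two functions, the following sketch (our own simplification, not any search engine’s actual algorithm; the weights, damping factor, and combination formula are assumptions chosen for readability) pairs a term-frequency keyword score with a bare-bones PageRank-style link score:

# Deliberately simplified sketch of keyword analysis plus link analysis.
# All constants and the scoring formula are illustrative assumptions.

def keyword_score(page_text: str, query: str) -> float:
    """Keyword analysis: frequency of the query terms on the page."""
    words = page_text.lower().split()
    if not words:
        return 0.0
    terms = query.lower().split()
    return sum(words.count(t) for t in terms) / len(words)

def link_scores(links: dict, damping: float = 0.85, iterations: int = 20) -> dict:
    """Link analysis: a bare-bones PageRank-style iteration.

    `links` maps each page URL to the list of URLs it links to.
    """
    pages = set(links) | {dst for dsts in links.values() for dst in dsts}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for src, dsts in links.items():
            for dst in dsts:
                # Each page passes a share of its rank to the pages it links to.
                new_rank[dst] += damping * rank[src] / len(dsts)
        rank = new_rank
    return rank

def rank_results(index: dict, links: dict, query: str, w: float = 0.5) -> list:
    """Combine both signals; return URLs ordered from most to least relevant."""
    link = link_scores(links)
    scored = {url: w * keyword_score(text, query) + (1 - w) * link.get(url, 0.0)
              for url, text in index.items()}
    return sorted(scored, key=scored.get, reverse=True)

Real engines combine hundreds of undisclosed signals, but the sketch captures the interplay Figure 1 depicts: on-page keyword relevance modulated by link-based authority.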

Appreciating the value of top rankings, Webmasters have learned to optimize their pages so big search engines rank them more highly. This has spawned a worldwide industry of search-engine-optimization consultants, whose techniques fall into two categories: white-hat techniques, accepted by search engines, such as ensuring a site can be easily analyzed; and black-hat techniques, such as hidden text (white text on a white background, say), considered illicit by most search engines and, upon discovery, generally punished with degraded ranking.

Search engines clearly have a legitimate interest in fighting inappropriate third-party optimization techniques to ensure the quality of their search results; for instance, sites with no purpose other than linking to specific sites to increase their rank (link farms) are black-hat and must be dealt with accordingly. Punishment, however, can be problematic for multiple reasons:

First, sudden ranking demotion and the resulting diminished inflow of visitors have major effects on businesses, as illustrated by the cases of Skyfacet (which reportedly lost $500,000 of revenue in 2006) and MySolitaire (which reportedly lost $250,000 the same year14). Only a few cases have received notable media attention, among them the companies SearchKing18 and KinderStart,11 which filed lawsuits over their page rankings, and German car manufacturer BMW. Many more cases of dropped ranking have been condemned to virtual silence, including search-engine optimizer Bigmouthmedia.

Second, though market-leader Google has published guidelines on creating “Google-friendly” pages, the line between permitted and illicit practices is blurry at best.20 For example, Google’s guidelines rightly warn against cloaking, “the practice of presenting different content […] to users and search engines.”13 However, to a certain extent cloaking can be justified and used with good intent by major sites without penalty.8 For instance, the Wall Street Journal uses it to show full versions of pay-per-view articles to Google’s indexing program.8
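
To make the mechanism concrete, here is a minimal, hypothetical sketch of cloaking (written for illustration only; the route, crawler list, and content split are assumptions, not any publisher’s actual implementation): the server inspects the requester’s User-Agent header and serves indexable full text to crawlers while ordinary visitors see only a paywalled teaser.

# Hypothetical illustration of cloaking: full text for known crawlers,
# a teaser for everyone else. Names and content are invented for this sketch.
from flask import Flask, request

app = Flask(__name__)

KNOWN_CRAWLERS = ("Googlebot", "Bingbot")   # illustrative, not exhaustive
FULL_ARTICLE = "Full text of the pay-per-view article ..."
TEASER = "First paragraph only. Subscribe to read the rest."

@app.route("/article/<article_id>")
def article(article_id):
    user_agent = request.headers.get("User-Agent", "")
    if any(bot in user_agent for bot in KNOWN_CRAWLERS):
        return FULL_ARTICLE   # the indexer sees everything
    return TEASER             # human visitors hit the paywall

The same technique can serve benign ends (indexing paid content) or deceptive ones (showing the crawler content unrelated to what users get), which is precisely why the line between permitted and illicit use is blurry.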

The difficulty of straddling the line between permitted and illicit practices is further illustrated by a case involving paid links: In February 2009, Google punished its subsidiary Google Japan with a page-rank demotion for paying for online reviews of a new widget.23 While Google’s attempt to play by its own rules is positive, the case highlights the difficulty of distinguishing permitted from illicit optimization techniques. A leading U.S. commentator on online search asked: “If Google itself […] found itself in this situation, how are ordinary Web sites to be expected to know the ‘rules’ about what they can or cannot do?”23

Third, our research supports the idea that there is no established process of announcement or appeal prior to rank demotion. Companies affected usually realize their fate only through a sudden loss of traffic or revenue. In a personal interview [2008], the CEO of an educational company told us: “The office called me and told me […] that revenue was down […], so I checked our logs and our history […] It was all on one day. We were up to 14 million pageviews per month, and on one day it dropped 70% and [stayed there], and that was it.”

Fourth, options are limited for companies affected by ranking demotion. One interviewee recalled that his company got no response, even though he personally went to the search-engine firm’s headquarters for assistance.

Fifth, several allegations in the blogosphere claim large players are treated better than their less-powerful counterparts. For example, in May 2008, Hewlett-Packard began offering free blog templates, including hidden links to its own pages, a quick way to gather “high-quality” links and clicks.25 However, there was no evidence of punishment by major search engines, sparking significant controversy in the community.25

Finally, search engines have punished Web sites using search-engine optimization, as well as the search-engine-optimization companies themselves. In the case of SearchKing, a search-engine consulting company in Oklahoma, U.S. courts have found that Google “knowingly and intentionally” dropped the company’s Web sites in its rankings to punish what it deemed illicit ranking manipulation that SearchKing had carried out for its clients.18


Rationale for Regulation

Several researchers have pointed to the dangers of targeted manipulation, arguing it undermines values like free speech, fairness, economic efficiency, and autonomy, as well as the institution of democracy. Concerning democracy and free speech, Introna and Nissenbaum17 argued a decade ago that search engines’ broad structural bias can lead to underrepresentation of niches and minority interests and a loss of variety. Drawing on Anderson’s theory of the ethical limitations of markets,1 they made a solid case that this lack of pluralism does not correspond to society’s “liberal commitments to freedom, autonomy, and welfare.”1 Introna and Nissenbaum viewed the Internet as a “political good” due to its role as “conveyor of information” and its function, like that of “traditional public spaces,” as a “forum for political deliberation.”17 Consequently, just as schools and national heritage are not left to the mercy of free markets, they argued the Internet is a public good requiring special protection.17 Similarly, Bracha and Pasquale3 made the case in 2007 that targeted manipulation of search-engine results “threatens the openness and diversity of the Internet as a system of public expression.”3

Another approach based on democratic values emphasizes the right of free speech. Chandler5 argued it protects not only the ability to listen and speak but also the listener’s right not to have intermediaries impose “discriminatory filters that the listener would not otherwise have used.”5 Since extraneous bias introduces different discrimination criteria, targeted manipulation of search results undermines free speech.5

Fairness might also be undermined. Since search-engine rankings have enormous influence on business performance, ranking manipulation can cause significant harm, whether arbitrary or deliberate. Search engines as private entities are generally free to conduct business as they wish within the limitations applicable to all companies. However, Webmasters cannot simply opt out of their dependence on search engines; this perhaps “inescapable influence”10 takes their relationship from the private to the public sphere, where more dependable accountability is expected.10


Companies affected usually notice their fate only through a sudden loss of traffic or revenue.


Other concerns involve economic efficiency, deception, and autonomy. Targeted manipulation can limit the availability of information, causing market inefficiencies and barriers to entry.3 Manipulating search results while leading users “to believe that search results are based on relevancy alone”16 could be deceptive, and search engines’ shaping of user options and information could limit user autonomy.3

Regulation and alternatives. Traditional justifications for regulation include control of monopoly power and excess profits, compensation for externalities, inadequate information, unequal bargaining power, and scarcity of essential products.4 While none perfectly fits the case of targeted results manipulation, intervention could be supported through two rationales:

Information asymmetries. Elkin-Koren and Salzberger7 pointed out that the common market failure on the Internet is an overload rather than an undersupply of information. However, the solution to this problem—the Internet search market—suffers from a lack of information. Just as laymen cannot easily assess the services of doctors or the effects of a particular medicine,4 users of search engines are unable to adequately evaluate search services.

Market power. While the Internet search market shows monopolistic tendencies (extremely strong market positions of a few key players), there is no strong case for regulatory intervention on the standard antitrust argument of abuse of monopoly power (such as lack of competition leading to excessive pricing). However, a case can be based on Breyer’s argument4 of an “unjustifiably discriminatory exercise of personal power” combined with the “concentration of substantial social and political power” in a private entity that controls an “essential product”4 (see also Bracha and Pasquale3 for a similar view on the application of essential-facility arguments to search engines). These arguments were developed in the early 20th century U.S. business environment when a group of railway companies in control of access to the city of St. Louis prevented competitors from offering services in the same area.21 Even if these arguments do not apply to users of search engines, they may apply to Webmasters unable to choose the search engines the public uses to find information.

Regulation must be compared to other solutions, particularly free markets. The basic market argument is that search engines have an incentive to produce the most relevant search results; otherwise, users would switch to their competitors.17 However, markets alone are unlikely to address concerns about targeted manipulation, for three reasons:

Proprietary algorithms. Users would not be able to detect targeted manipulation in most cases, as search engines keep their algorithms secret.3 Moreover, even if users were more aware of results manipulation, their expectations of what they are searching for are shaped during the search process.17 Consequently, they cannot objectively evaluate search-engine quality17;

Concentrated market. While the Internet search market is highly concentrated, a less monopolistic market is unlikely to emerge due to high economies of scale.3 Furthermore, incumbents benefit from their existing user base by, say, collecting user data through such products as the Google toolbar. Moreover, the emergence of new big players is also unlikely, since promising start-ups could be acquired by such dominant search giants as Google and Microsoft; and

User inertia. While switching might seem easy (users simply type another address), evidence suggests that personal habit is a key factor when selecting a search engine22; moreover, new technologies like personalized search are likely to raise switching costs significantly.

Technological development is also often mentioned in the context of search-engine bias.12 However, while new technologies may alleviate some concern (such as reinforcement of popular sites), no technology in sight is likely to cure the problem of targeted manipulation.

Two counterarguments often brought up against regulation of search-engine bias are that search results are free speech and therefore cannot be regulated, and that search engines are not essential facilities, as they do not fulfill the criteria of essential facilities accepted by U.S. courts.


Markets alone are unlikely to be sufficient to address the concerns about targeted manipulation.


While these arguments have merit, they are insufficient for rejecting intervention for several reasons: Though the courts have acknowledged First Amendment rights for search engines, a number of legal scholars have argued against this view.3,5 Both arguments emerge from a U.S.-centric context, which, as the Microsoft antitrust case in the 1990s showed, is not the only legal arena for regulation. Moreover, in legal circles it has been suggested that the existing regulatory frameworks may be inadequate for something as groundbreaking as Internet search. Given the state of the law, governments and multinational bodies may need to create a new regulatory framework.


Search and Its Stakeholders

Who are the stakeholders, and what are their interests? Mitchell, Agle, and Wood19 wrote that stakeholders are characterized by power, legitimacy, and urgency. Building on a broad investigation of stakeholder interests in search-engine law by Grimmelmann,15 we see four main actors—users, search engines, Webmasters, and search-engine optimizers—that are, to some extent, characterized by these attributes.

Table 1 is an overview of key stakeholder interests in Internet search bias and stakeholder recognition of possible manipulation. Search-engine optimizers are particularly conflicted. On the one hand, they stand to gain from greater transparency in Internet search, as their business would be easier and more efficient, and a clearer picture of accepted practices would enable them to guarantee their clients that search engines tolerate their techniques. On the other hand, they profit from lack of transparency, as it raises the value of their key assets—expertise and inside knowledge. Moreover, search-engine optimizers with good contacts and community-wide attention can profit from their influence with search-engine companies in “curing” cases of rank demotion. Therefore, top performers arguably have little interest in greater transparency.

Commentators have made several proposals for regulating targeted manipulation. We comment first on the two most promising, then introduce a new approach (outlined in Table 2).

Obligation to provide reasons for rank demotion. One proposal20 suggested establishing an obligation to provide a reason for rank demotion to increase transparency in the relationship between Webmasters and search engines. In this context, Pasquale20 drew an interesting analogy with credit-reporting agencies, which must provide reasons for adverse credit information to consumers. The obligation would favor the interests of users and especially Webmasters, because it would help them optimize their Web sites for search engines. Search-engine optimizers would probably be divided between the less influential, who would gain, and top performers, who would lose from greater transparency. Building on the credit-agency analogy, Pasquale argued that the cost to search engines and the risk of algorithmic reverse engineering would be low.20 However, as the number of potential queries on rank demotion is arguably much higher than the number on adverse credit ratings, the cost to search engines would likely be significant. Moreover, search engines would likely oppose any obligation to provide precise reasons for each rank demotion, as it would increase the risk of lawsuits.

Installation of ombudsmen and process of appeal. Taking the previous proposal a step further, some have called for an appeal process against rank demotion.24 For example, Forsyth10 emphasized that if search engines were public entities with the ensuing accountability, a process of appeal would probably already have been established. While Google offers a way to submit pages for “reconsideration,”8 such appeals are judged internally without transparency, and, while they succeed in some cases, a Webmaster’s only option might be to attract enough publicity to get a search-engine representative to take up the case internally.9

Installing a formal, transparent appeals process would clearly be in the interest of both users and Webmasters, while assisting some search engine optimizers but diminishing the value of top optimizer contacts. On the other hand, search engines would arguably incur somewhat higher costs for installing such a process. Moreover, a formal process with a clear chance of success would facilitate appeals, thereby increasing numbers of requests and probably encouraging “appeal gaming.”

Clearer guidelines for search engine optimization. The current line between black-hat and white-hat search-engine optimization is gray, often forcing Webmasters and optimizers to speculate as to which techniques will be punished through ranking degradation. In a personal interview on search-engine optimization, one search-engine consultant said the key to differentiating between black-hat and white-hat optimization techniques in unclear cases is “implication of intent.” However, in the 2008 case of Hewlett-Packard mentioned earlier, comments by search-engine optimizers indicate at least some bias in how search engines distinguish between permitted and illicit optimization methods.

One approach to increasing transparency in the relationship between Webmasters and search engines is to establish clearer guidelines distinguishing black-hat from white-hat optimization. The gray area in between would be diminished, giving Webmasters and optimizers a better idea of what to expect from search engines. The diminished gray area would lead to more consistent application of ranking degradation, as questionable sites would fall more clearly into one category or the other. Moreover, new guidelines could also cover search engines’ assessment of intent in questionable cases, further promoting consistent treatment of market players. This approach is promising because it has advantages for all stakeholders (see Figure 2).

Achieving clearer guidelines. A self-regulatory approach initiated by regulators would be the easiest and most efficient way to achieve clearer guidelines. There is ample precedent that self-regulation works in cyberspace; for example, in the early days of Internet search, many search engines did not distinguish between organic and paid results (advertisements). However, following a letter by the U.S. Federal Trade Commission in 2002 recommending search engines ensure that “paid […] results are distinguished from non-paid results with clear and conspicuous disclosures,”16 all major search engines implemented such practices. Self-regulation would also help the process of regulation keep up with the pace of technological change. Finally, as clearer guidelines would arguably favor all stakeholders, search engines would arguably at least join the public dialogue on self-regulation.

Policymakers could signal the importance of targeted manipulation and initiate a dialogue on self-regulation by creating a committee of key stakeholders to examine cases of rank demotion and recommend ways to improve today’s optimization guidelines. In addition, the topic of search-engine regulation could be put on the agenda of the next United Nations Internet Governance Forum (www.intgovforum.org).

Self-regulation alone may not alleviate concern about rank demotion. One idea from the Internet’s early days may chart another way forward. As disputes over domain names became more heated in the 1990s and U.S. trademark law proved insufficient, the Internet Corporation for Assigned Names and Numbers (www.icann.org) and the World Intellectual Property Organization (www.wipo.int) developed the Uniform Domain-Name Dispute-Resolution Policy (www.icann.org/en/udrp/udrp.htm) to promote quick and inexpensive resolution of domain-name conflicts.6 A similar body could help establish new optimization guidelines and manage the mediation process.


Conclusion

Here, we’ve argued for the need to open a debate on how to regulate targeted ranking manipulation that hinders search-engine optimization. These practices threaten democracy and free speech, fairness, market efficiency, autonomy, and freedom from deception. The case for regulation can be based on the search market’s information asymmetries and on the concentration of market power over an essential product in a private entity. Our analysis of specific regulatory proposals and their implications for stakeholders highlights the benefits of establishing clearer guidelines for optimizers.


Figures

F1 Figure 1. How ranking works.

F2 Figure 2. How clearer search-engine optimization guidelines would affect stakeholders.


Tables

T1 Table 1. Stakeholder interests in the regulation of search bias.

T2 Table 2. Regulatory proposals and stakeholder interests.

References

    1. Anderson, E. Value in Ethics and Economics. Harvard University Press, Cambridge, MA, 1993.

    2. Battelle, J. The Search: How Google and Its Rivals Rewrote the Rules of Business and Transformed Our Culture. Nicholas Brealey Publishing, London, 2005.

    3. Bracha, O. and Pasquale, F. Federal Search Commission? Access, Fairness and Accountability in the Law of Search. University of Texas Public Law Research paper no. 123, Austin, TX, 2007.

    4. Breyer, S. Typical justifications for regulation. In A Reader on Regulation, R. Baldwin, C. Scott, and C. Hood, Eds. Oxford University Press, New York, 1998, 59–93.

    5. Chandler, J. A right to reach an audience: An approach to intermediary bias on the Internet. Hofstra Law Review 35, 3 (Spring 2007), 1095–1139.

    6. Davis, G. The ICANN uniform domain name dispute resolution policy after nearly two years of history. e-On The Internet (Jan./Feb. 2002); http://www.isoc.org/oti/articles/1201/icann.html

    7. Elkin-Koren, N. and Salzberger, E.M. Law, Economics and Cyberspace. Edward Elgar Publishing, Cheltenham, U.K., 2004.

    8. Fishkin, R. White-hat cloaking: It exists, it's permitted, it's useful. SEOMoz Blog (June 30, 2008); http://www.seomoz.org/blog/white-hat-cloaking-it-exists-its-permitted-its-useful

    9. Fontenot, D. Matt Cutts, Why am I still being punished? SEO Scoop (Jan. 24, 2008); http://www.seo-scoop.com/2008/01/24/matt-cutts-why-am-i-still-being-punished/

    10. Forsyth, H. Google MP? How the Internet Is Challenging Our Notions of Political Power. Presentation at Pembroke College Ivory Tower Society, Cambridge, U.K. (Jan. 28, 2008).

    11. Goldman, E. KinderStart v. Google dismissed: With sanctions against KinderStart's counsel. Technology & Marketing Law Blog (Mar. 20, 2007); http://blog.ericgoldman.org/archives/2007/03/kinderstart_v_g_2.htm

    12. Goldman, E. Search engine bias and the demise of search engine utopianism. Yale Journal of Law & Technology 8 (Spring 2006), 188–200.

    13. Google, Inc. Webmaster Tools Help: Cloaking, Sneaky JavaScript Redirects, and Doorway Pages (Dec. 4, 2008); http://www.google.com/support/webmasters/bin/answer.py?answer=66355&topic=15263

    14. Greenberg, A. Condemned to Google hell. Forbes (Apr. 4, 2007); http://www.forbes.com/2007/04/29/sanar-google-skyfacet-tech-cx_ag_0430googhell.html

    15. Grimmelmann, J. The structure of search engine law. Iowa Law Review 93, 1 (Nov. 2007), 3–63.

    16. Hippsley, H. Letter from FTC to Search Engines Regarding Commercial Alert Complaint Requesting Investigation of Various Internet Search Engine Companies for Paid Placement and Paid Inclusion Programs. Federal Trade Commission, Washington, D.C., June 27, 2002; http://www.ftc.gov/os/closings/staff/commercialalertattatch.shtm

    17. Introna, L.D. and Nissenbaum, H. Shaping the Web: Why the politics of search engines matters. The Information Society 16, 3 (2000), 169–185.

    18. Miles-LaGrange, V. SearchKing, Inc. v. Google Technology, Inc. CIV-02-1457-M. U.S. District Court for the Western District of Oklahoma, 2003.

    19. Mitchell, R.K., Agle, B.R., and Wood, D.J. Toward a theory of stakeholder identification and salience: Defining the principle of who and what really counts. The Academy of Management Review 22, 4 (Oct. 1997), 853–886.

    20. Pasquale, F. Rankings, Reductionism, and Responsibility. Seton Hall Public Law Research paper. Seton Hall University, South Orange, NJ, Feb. 25, 2006.

    21. Pitofsky, R. The Essential Facility Doctrine Under United States Antitrust Law. Federal Trade Commission, Washington, D.C., 2001; http://www.ftc.gov/os/comments/intelpropertycomments/pitofskyrobert.pdf

    22. Sherman, C. Search engine users: Loyal or blasé? Search Engine Watch (Apr. 19, 2004); http://searchenginewatch.com/3342041

    23. Sullivan, D. Google penalizes Google Japan for buying links. Search Engine Land (Feb. 11, 2009); http://searchengineland.com/google-penalizes-google-japan-16541

    24. Sullivan, D. Google ombudsman? Search ombudsman? Great idea: Bring them on! Search Engine Watch (July 2006); http://blog.searchenginewatch.com/blog/060706-075235

    25. Wall, A. Strategic content as marketing for link building. SEO Book (May 8, 2008); http://www.seobook.com/content-marketing-win

    DOI: http://doi.acm.org/10.1145/1839676.1839695
