
Tightening Access Policy: The Hidden Trade-Off

Access policy changes should be handled delicately to avoid damaging visitor loyalty.

When a Web site begins operating, a relatively loose access policy is often adopted to quickly acquire traffic. However, the pursuit of high traffic levels may not be the most sensible focus for a new Web site. Traffic that only consumes resources may actually be undesirable. Hence, it is logical for Web organizers to seek opportunities to tighten access so that resources can be apportioned to better serve Web site members. Access restrictions are usually imposed on popular Web sections to limit the consumption of system resources. It is conventional wisdom that such restrictions are desirable because they effectively distinguish the value of visitors and allocate Web resources accordingly.

Changing the access policy of a Web site to restrict a certain section to members after an initial loose access policy is quite common [4, 5]. Tightening the access policy can be implemented effortlessly by announcing the new restriction. The potential benefits include a wiser allocation of resources and a boost to the member population due to the desirability of the restricted section. However, the existence of these benefits is speculative and empirical evidence is lacking. An in-depth study of the traffic dynamics is required to confirm whether the benefits of tightening access actually materialize.

This study examines the effect of access tightening at the macroscopic level, which is measured in terms of overall visitor behavior. This is in contrast to a microscopic study, which measures Web server efficiency in terms of response time and delay [3]. In this experiment, a Web site was created and operated. A members-only policy change was introduced at a specific point in the operation of the Web site. To determine whether there was a change in performance, the traffic data before and after the intervention was compared to reveal the potential cause-effect relationships between the intervention and the subsequent performance of the site. As traffic performance is dynamic and traffic measures may be interrelated, other traffic measures in addition to the change in membership population were tracked.

Due to IT advances, Web visitors now leave “traces” that help Web organizers assess site performance. A particularly useful aspect of this is the ability to extract Web site activity data from the Web server log files. According to Adler [1], there are three types of Web traffic measures:

  • Basic measures assess how Web resources are used. These include click-related measures, such as page views and total visitors.
  • Stickiness measures assess member loyalty. These include the long-term measure of the number of sessions per month, and the relative short-term measure of session length.
  • Measures of participation look at levels of involvement of members in the virtual community, such as the length and frequency of participation in chat rooms and discussion groups, the number of postings contributed, and the number of most active members. The number of registered members and the amount of money spent also fall in this category.

The experimental Web site in this study did not involve monetary transactions, and it was not designed to nurture a virtual community. Since long-term measures can easily blur cause-effect relationships, this study focuses on four traffic measures that involve neither purchasing nor long-term observation, and that are not specific to the attractiveness of a virtual community: page views, visitor count, daily registrations, and average duration. Page views measure the total number of pages viewed by visitors per day. Visitor count measures the number of unique visitors per day. Daily registrations measure the number of new registered members per day. Average duration is the average time that each visitor spends at the Web site per visit.
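As an illustration of how such measures can be derived from server log files, the following minimal sketch (not the code used in this study) computes the four daily measures from parsed log records. The record format, the registration path, and the 30-minute session gap are all assumptions made for the example.

    # A minimal sketch (not the code used in this study) of deriving the four
    # daily traffic measures from parsed Web server log records. Each record is
    # assumed to be a (timestamp, visitor_id, path) tuple; the registration path
    # and the 30-minute session gap are illustrative assumptions.
    from collections import defaultdict
    from datetime import timedelta

    SESSION_GAP = timedelta(minutes=30)      # inactivity gap that ends a session
    REGISTRATION_PATH = "/register/submit"   # hypothetical registration endpoint

    def daily_measures(records):
        """records: iterable of (timestamp: datetime, visitor_id: str, path: str)."""
        page_views = defaultdict(int)    # day -> total pages viewed
        visitors = defaultdict(set)      # day -> unique visitor ids
        registrations = defaultdict(int) # day -> new member registrations
        durations = defaultdict(list)    # day -> completed session lengths (seconds)
        open_sessions = {}               # (day, visitor) -> (session_start, last_hit)

        for ts, visitor, path in sorted(records):
            day = ts.date()
            page_views[day] += 1
            visitors[day].add(visitor)
            if path == REGISTRATION_PATH:
                registrations[day] += 1
            key = (day, visitor)
            if key in open_sessions and ts - open_sessions[key][1] <= SESSION_GAP:
                open_sessions[key] = (open_sessions[key][0], ts)    # extend session
            else:
                if key in open_sessions:                            # close stale session
                    start, last = open_sessions[key]
                    durations[day].append((last - start).total_seconds())
                open_sessions[key] = (ts, ts)                       # start a new session

        for (day, _), (start, last) in open_sessions.items():       # close remaining sessions
            durations[day].append((last - start).total_seconds())

        return {day: {"page_views": page_views[day],
                      "visitor_count": len(visitors[day]),
                      "daily_registrations": registrations[day],
                      "average_duration": (sum(durations[day]) / len(durations[day])
                                           if durations[day] else 0.0)}
                for day in page_views}

Commercial log-analysis tools compute equivalent statistics; the sketch only makes explicit what each of the four measures counts.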

A quasi-experimental design was employed, which involved conducting an experiment in the real world. A Web site was built following established design guidelines, such as the use of a balanced layout, navigation tools, decision aids, a sitemap, contact information, and a FAQ section. Several external evaluations certified the adequacy and popularity of the experimental Web site, including a third-party evaluation of top-performing sites in the same Web site category of fashion and fitness. Visitors fell into three types: non-members, opt-in members, and opt-out members. The domain name was fit.nccu.edu.tw.

Four content sections were devised: women’s fitness, men’s fitness, gourmet food, and a section of candid pictures taken on the street. Extensive editorial effort provided fresh content in each of the four sections every week, and many interactive functions were also offered, such as Web email, an e-newsletter, bulletin boards, chat rooms, and a topic-ranking mechanism. Regular backups of the Web site averaged around 600MB in size, comprising about 170MB of site content, 300MB of log files, 80MB of statistical data, and 60MB of internal documents. Text and images made up most of the content, and the display of information was supplemented by background music and animation. Around 700 small images were used in the layout, and around 30 images that illustrated the content were updated each week. All of the information was updated weekly, with the exception of user-supplied information, long-term announcements, and the layout template.

Visitors were gathered naturally on the Internet without employing a sampling procedure and without visitors being informed of the Web site’s potentially short lifespan. Observant visitors might have noticed from the domain name that the Web site was organized by an educational institute, and may have conjectured that it was probably not a long-term endeavor. However, Web sites with an “.edu” in their domain name also tend to be perceived as relatively trustworthy, and with the extensive editorial effort and regular updates, the site would not be perceived as likely to disappear overnight. The Web site was operated and continuously updated for nine months and its closure was announced one month before the last update. To end the service as gracefully as possible, an online survey of service satisfaction with an attached statement of appreciation was conducted. In addition, the Web site remained accessible after the last update, and the service was not completely turned off until five months later.

To confirm that the observed effect was indeed caused by the administered intervention, so-called “history factors” had to be ruled out. History factors are events that occur during an experiment that are not intentionally administered but have the potential to overshadow the intervention and affect the results. During this experiment, a journal was kept online to record all of the events that occurred each day. All 17 Web masters were responsible for maintaining the journal entries, and a designated Web master monitored the entries to ensure their accuracy and thoroughness. The journal served as a record that could be consulted when an unexplained change in performance was observed.


Effects of Tightening Access Policy

The accompanying figure plots the data for page views, visitor count, average duration, and daily registrations from Day 1 to Day 80. The access policy change took effect on Day 43. There were a few service interruptions in the first few days due to power outages. For purposes of analysis, the data from the initial few days and from the two days immediately after the intervention was omitted to avoid transient instability. As time series are rarely characterized by more than two time lags [2], excluding two days of data was enough to eliminate time correlation with the pre-intervention data. The data from Day 15 to Day 70 was then divided into two groups: the pre-intervention group and the post-intervention group.

In the figure, the period of analysis is shaded and the date of intervention is indicated by the arrow. Table 1 compares the overall page views, visitor count, average duration, and daily registrations before and after the policy change without differentiating among user groups. Taking 5% as the maximum tolerable probability of mistakenly concluding that traffic performance had changed when it had not, the changes in average duration and daily registrations are found to be statistically significant, whereas the changes in page views and visitor count are not.
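The article does not name the statistical test used; the following sketch illustrates one plausible way to perform such a pre-/post comparison at the 5% significance level, using Welch’s two-sample t-test on the daily values of a single measure. The day boundaries mirror those described above, but the exact cutoffs and the test choice are assumptions.

    # A sketch of the pre-/post-intervention comparison at the 5% significance
    # level. The choice of Welch's two-sample t-test, the exact day boundaries,
    # and the dict-based daily series are illustrative assumptions; the article
    # does not specify the test it used.
    from scipy import stats

    def compare_pre_post(daily_series, intervention_day=43, start=15, end=70,
                         skip=2, alpha=0.05):
        """daily_series: dict mapping day number -> value of one traffic measure."""
        pre = [v for d, v in daily_series.items() if start <= d < intervention_day]
        post = [v for d, v in daily_series.items()
                if intervention_day + skip <= d <= end]
        t_stat, p_value = stats.ttest_ind(pre, post, equal_var=False)  # Welch's t-test
        return t_stat, p_value, p_value < alpha  # significant at the 5% level?

Run once per traffic measure, a test of this kind yields the kind of significance judgments summarized in Table 1.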

Of the four traffic measures, daily registrations are not meaningful for every user group, because only non-members can register to become members. The “Overall Effect” column clearly shows that the change of access policy provided only one positive result: it increased the number of daily registrations. In a way, the daily registrations had to increase, because many non-members were essentially forced to become members when the membership requirement was imposed on the most popular section of the Web site. The changes in page views and visitor count, if any, were not statistically significant, and average duration showed an undesirable decrease after the access policy change. It is possible that the decrease occurred because new members did not stay on the site as long as older members who had registered willingly before the access policy was tightened. However, the probable difference in the length of stay between new and old members alone cannot fully account for the decrease in the average visit duration.


Discerning User Groups

To investigate the reason for the decreased average visit duration, the analysis was taken one step further to examine the behavior of each user group on the measures that apply to every group: page views, visitor count, and average duration. For the measures that showed no change in the overall analysis in Table 1, it is necessary to determine whether they remained at the same level for each group, or whether the effects were averaged out across the user groups, with the change in each user group occurring behind the scenes. For the average duration measure, which showed a decrease, it is necessary to determine whether all of the user groups behaved in unison.


The data was thus subsequently separated into the non-member, opt-in member, and opt-out member groups. The column labeled “Individual Group Effect” in Table 2 summarizes the average effect on each user group, and the overall effects in Table 1 are replicated in the right-hand column for easy comparison. As the “Overall Effect” column shows, the average page views and the average visitor count showed no significant change. However, the analysis of the individual group effects reveals that the average page views of non-members actually fell, whereas the average page views of members rose. In addition, although the overall effect of the intervention on visitor count was insignificant, the number of opt-out member visits did increase. Although not statistically significant, the changes in the average number of non-member visitors and opt-in member visitors suggest a probable decrease and a probable increase, respectively. The decrease in the average visit duration was consistent across all user groups.
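Separating the traffic by user group requires labeling each visit with the visitor’s membership status at the time of the visit. The sketch below illustrates one way this could be done; the membership table and its fields are hypothetical and are not described in the article.

    # A sketch (hypothetical data model, not from the article) of labeling each
    # visit with the visitor's membership status at the time of the visit, as
    # required for the per-group breakdown in Table 2.

    # membership: visitor_id -> (registration_time, opted_in); visitors absent
    # from the table are non-members. Field names and structure are assumptions.
    def user_group(visitor_id, ts, membership):
        record = membership.get(visitor_id)
        if record is None or ts < record[0]:
            return "non-member"              # not yet registered at visit time
        return "opt-in member" if record[1] else "opt-out member"

    def label_records(records, membership):
        """records: iterable of (timestamp, visitor_id, path) tuples."""
        return [(ts, visitor, path, user_group(visitor, ts, membership))
                for ts, visitor, path in records]

The labeled records can then be aggregated into per-group daily series and compared before and after the intervention in the same way as the overall measures.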

The analysis of performance by user group reveals traffic dynamics that are not apparent in the overall analysis. Together, the two analyses offer several important insights. The fact that the number of overall page views remained the same, whereas page views for non-members dropped off and those for members rose, clearly points to a shuffling effect. Contrary to what would be concluded from the overall analysis alone, the analysis by user group indicates that page views did not change overall but were merely redistributed. As for visitor count, the data implies a drop in the number of non-member visitors and an increase in the number of both opt-in and opt-out member visits, although the change is only statistically significant for the opt-out members. The increase in opt-out member visits is submerged in the overall picture due to the relatively small number of these members. Throughout the operation of the experimental Web site, the number of opt-in members consistently exceeded the number of opt-out members by a large margin, perhaps as a result of the demographic composition of the members. The personal information of the members showed them to be mostly between the ages of 18 and 30, and young people are more likely to choose to opt in, as suggested by Milne and Rohm [6].

The shuffling effect of the access policy change on both page views and visitor count, combined with the decrease in average duration across all user groups, is rather alarming. If the access restriction had produced only a shuffling effect, then the overall average visit duration should also have remained at the same level, as the page views and visitor count did. It is possible that the less loyal new members, who registered only after the policy change, brought down the average visit duration of the member groups, and that the remaining non-members, who chose not to register, were the least loyal of the original non-members, which would explain the decrease in average visit duration for the non-member group. However, a mere shuffling of visitors between the user groups cannot explain the decrease in the overall average visit duration across all of the user groups. If visitors were merely shuffled between the groups, with the average duration for each visitor remaining unchanged, then the overall average duration, that is, the average duration over all visitors, should also have stayed the same.
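This argument can be stated as a weighted-average identity, where n_g and d_g denote the visitor count and average visit duration of group g (notation introduced here for illustration, not used in the article):

    \bar{d} \;=\; \frac{\sum_{g} n_g\, d_g}{\sum_{g} n_g}

If visitors merely move between groups while each visitor’s own durations stay the same, the total time spent (the numerator) and the total number of visits (the denominator) are both unchanged, so the overall average duration cannot fall; a drop in every group therefore implies that individual visit durations themselves decreased.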

This contradiction discloses the hidden trade-off of tightening access policy: although membership increased, visitors, whether members or non-members, were annoyed or even aggravated by the access restriction and began to show signs of defection. The tightening of the access policy effectively converted some non-members into members and caused a redistribution in the page views and visitor count statistics. The gain in new members seemed to indicate a positive contribution from the policy change. However, the average visit duration fell across the board, which leads one to question whether a Web site truly gains from tightening its access policy. Average duration is a measure of loyalty. This study suggests that, beyond the gain in new members, visitors were merely shuffled between groups while their behavior changed for the worse and their loyalty decreased. Gaining new members but losing the loyalty of visitors, members and non-members alike, can hardly be a desirable trade-off. Therefore, the lesson of this study is twofold: one should think twice before implementing an access policy change, and if such a change is deemed necessary to conserve resources, then it should be handled delicately to reduce the chances of damaging loyalty.

As few traffic experiments exist in the literature, this study and its implications are unique. Although the results of this study are only valid for the Web site category to which the experimental Web site belongs, it is unlikely that any single study could cover all of the different categories of Web sites. It is still too early to conclude whether all Web sites would follow the pattern found in this study, but the results offer valuable insights that are relevant to the operation of all Web sites. The hidden trade-off between membership and visitor loyalty revealed in this study challenges the conventional wisdom of the benefit of tightening access policy.


Conclusion

The findings of this study provide important insights into Web site traffic dynamics. Nevertheless, the question of whether the same results would occur if the access policy change were manipulated differently remains. For example, if changes were introduced piecemeal instead of all at once on a specific set date, then visitors may have responded differently. It is very likely that in a different setting, different quantitative outcomes would result, although the qualitative results should be similar.

Various principles and practices for tackling the problem of developing Web sites with high traffic levels exist, and many are clearly working quite well. With billions of Web pages online and numerous Web sites in operation, the purpose of this study is not to discover yet another way to improve Web design, but to address an important question that arises after a Web site is in operation: How can the most popular Web section help a Web organizer to further enhance the value of the site and move it closer to the realization of commercial benefit? During the initial booming stages of the World Wide Web, people rushed to adopt all of the possible ways to make “good” Web sites, despite the fact that quantified accounts of what makes a Web site “good” were and still are scarce. Measures that claim to be effective in nurturing a Web site’s growth are rarely supported by quantified data. Starting from scratch is usually less challenging, as the worst that can happen is to have to start again. However, when the hard-earned first fruits of a Web site are already in sight, great care should be taken when developing the site further.


Figures

Figure. Data of the four traffic measures.


Tables

Table 1. Overall effects of the change in access policy without distinguishing among user groups.

Table 2. Comparison of the group effects and the overall effects.

References

    1. Adler, R.P. Doing research in online communities. May 25, 1999; retrieved May 30, 2006 from www.digiplaces.com/pages/present/research/index.htm.

    2. Box, G.E.P., Jenkins, G.M., and Reinsel, G.C. Time Series Analysis—Forecasting and Control (3E). Prentice Hall, 1994.

    3. Datta, A., Dutta, K., Thomas, H., and VanderMeer, D. World wide wait: A study of Internet scalability and cache-based approaches to alleviate it. Management Science 49, 10 (2003), 1425–1444.

    4. Frick, C. NCOIL defends Web access policy. National Underwriter/Property & Casualty Risk & Benefits Management 108, 5 (Feb. 9, 2004), 25–26.

    5. Hayes, D. NCOIL Web access restrictions draw fire. National Underwriter/Property & Casualty Risk & Benefits Management 108, 4 (Feb. 2, 2004), 5.

    6. Milne, G.R. and Rohm, A.J. Consumer privacy and name removal across direct marketing channels: Exploring opt-in and opt-out alternatives. Journal of Public Policy and Marketing 19, 2 (2000), 238–249.

    This study is partially supported by the National Science Council of the Republic of China under the project number NSC-89-2416-H-004-090.
