
Examining Differences Across Journal Rankings

Many have studied and ranked the quality of computing journals over the last 15 years. This composite of the top 50 was created by examining how those rankings fared over time and across studies.

Publication outlets are important to MIS scholars for many reasons: they unify an academic discipline by providing a communication system for acquiring and disseminating information; they are used in hiring, promotion, tenure, and merit-pay decisions; and they are used for ranking academic departments. Such outlets also provide researchers with target vehicles for their work; they help researchers identify streams of research in an academic discipline; and they are used by librarians to optimize the disbursement of available funds.

The importance of journals in a discipline naturally leads to the question of relative journal quality. As a result, a number of studies have ranked a variety of journals (many not solely devoted to MIS). These studies differ in a number of ways, including the size and composition of respondent samples, the number of journals included, the methods used for including journals, and the methods used for ranking the journals. Further, each of these studies provides a journal ranking at one point in time.

To address the variability across journal ranking studies, we present a method for averaging journal rankings across studies. We apply this method to nine such studies published between 1991 and 2003 to produce a composite ranking of the top 50 journals. Table 1 shows the nine studies, the number of journals ranked in each study, the research methodology used, and the sample size. These studies present several points of interest.

Two studies used citation analysis to rank journals [4, 5]; the remaining seven employed the perceptions of respondents. The use of citation analysis is noteworthy, as this method is purported to be more objective than respondent perceptions. Holsapple et al. took a further step with their citation analysis by controlling for the number of years each journal had been in publication [5].

Several studies using respondent perceptions are also notable. Walstrom et al. [8] collected the most widespread sample of respondents via a mailed survey; their study also provided the most thorough list of journal rankings across the nine studies. Mylonopoulos and Theoharakis [6] and Peffers and Tang [7] used online surveys to obtain the largest respondent samples, as well as the greatest international representation, across the nine studies.

Table 1 also reveals interesting trends in journal ranking studies. The first is that sampling methods have progressed from mailed to emailed to online surveys. One key advantage of online surveys, their convenience, has undoubtedly contributed to two further trends: increasing sample sizes and increasing numbers of international respondents. A fourth trend is the increasing number of journals for respondents to rank.

The nine studies have employed a variety of methods to obtain a list of journals (of varying number) to be ranked. Regardless of the methods used to include journals in a ranking study, or the methods used to rank them, each study produced a journal ranking, which we analyze here.


How Do We Compare Across Journal Rankings?

To average journal rankings across studies, we needed a common denominator to account for the differing numbers of journals ranked in each study. Accordingly, we calculated a score for each journal in each study by dividing the journal's rank by the total number of journals ranked in that study (see Table 2). For example, MIS Quarterly ranked first in both the 1999 [9] and 2001 [6] studies, with scores of .01 (1/80) and .02 (1/50) respectively, because the 1999 study ranked 80 journals and the 2001 study ranked 50.

Scores close to zero indicate highly ranked journals, whereas scores approaching one indicate lower-ranked journals. We then averaged each journal's scores across the studies in which that journal appeared to obtain its average score, which we used to rank the top 50 journals (see Table 2). Ties were resolved (where possible) based on the number of ranking studies in which the journals appear. Table 2 presents the rank of each journal in each study, the journal's score in that study (in parentheses), and the journal's average score across studies.
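As a concrete illustration, the sketch below implements the scoring and averaging just described in Python. The composite_ranking function, the sample data, and the tie-breaking direction (a journal appearing in more studies ranks higher) are our own illustrative assumptions, not the studies' actual figures.

```python
# A minimal sketch of the composite-scoring method described above.
# Journal names, ranks, and totals are illustrative, not the studies' data.

def composite_ranking(studies):
    """Rank journals by their average (rank / journals ranked) score."""
    scores = {}  # journal -> list of per-study scores
    for study in studies:
        for journal, rank in study["ranks"].items():
            # Score = rank divided by the number of journals ranked in
            # that study: first place among 80 journals scores 1/80 = .0125.
            scores.setdefault(journal, []).append(rank / study["total"])
    # Lower average score is better; ties go to the journal appearing
    # in more studies (our reading of the tie-breaking rule).
    return sorted(
        scores.items(),
        key=lambda kv: (sum(kv[1]) / len(kv[1]), -len(kv[1])),
    )

studies = [
    {"total": 80, "ranks": {"MIS Quarterly": 1, "Communications of the ACM": 2}},
    {"total": 50, "ranks": {"MIS Quarterly": 1, "Management Science": 3}},
]
for position, (journal, s) in enumerate(composite_ranking(studies), start=1):
    print(f"{position}. {journal}: average score {sum(s) / len(s):.3f}")
```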

Table 2 ranks the top 50 journals across the nine studies from 1991 to 2003. We make no attempt to classify journals as top-tier (or "A" list), second-tier ("B" list), and so on. However, the composite ranking of the 50 journals does provide a comprehensive view of their relative quality from the standpoint of MIS scholars.

Table 2 also shows how journals change in rank over time. Some journals vary considerably (for example, Decision Sciences, IEEE Computer, and IEEE Transactions on Systems, Man, and Cybernetics), while others are quite consistent (for example, MIS Quarterly, Communications of the ACM, Information Systems Research, Management Science, and Journal of Management Information Systems). Eight journals appear in all nine studies (MIS Quarterly, Communications of the ACM, Management Science, Journal of Management Information Systems, Harvard Business Review, Decision Sciences, Information & Management, and Sloan Management Review). Four journals appear in eight studies (Decision Support Systems, IEEE Transactions on Software Engineering, IEEE Computer, and ACM Computing Surveys) and another four appear in seven studies (Data Base, Interfaces, Information Systems Management, and Journal of Systems Management).

Table 3 reflects the rich diversity of the journals in which MIS scholars publish their research. We find 29 “pure” MIS journals. Demonstrating the MIS field’s main reference disciplines, we note 11 computer science journals, seven management journals, and three operations research journals. It is interesting that, out of the top 20 journals, only six are “pure” MIS journals, nine are computer science journals, two are management journals, and three are operations research journals. These findings point out the breadth and interdisciplinary nature of MIS research.

Despite movement in the ranks of many individual journals, the overall journal rankings have remained remarkably consistent over time, providing evidence that the MIS field is forming a consensus on its publication outlets and their relative quality. These findings suggest that MIS is maturing as a coherent academic discipline.

Given this consistency across journal rankings, the question arises as to whether future journal ranking studies will provide value. The answer is an unqualified "yes," because the MIS field is dynamic, with new technologies constantly emerging. As the field evolves, new journals appear, and future journal rankings will include these new outlets. In fact, many journals that have appeared more recently are highly regarded by the MIS community (for example, Communications of the AIS and the European Journal of Information Systems, to name just two).

In addition, future journal rankings should continue to examine regional differences in perceptions of journal quality (see [6]). We feel that future global ranking studies will be both useful and informative.

How should future journal rankings be conducted? One suggestion is to provide a comprehensive list of journals in an online survey and have MIS faculty rank them. As this list would be lengthy, respondents could rank some number of journals in order, or they could rate the perceived quality of each journal with which they are familiar on a Likert scale. Respondents would be free to add journals not on the list. Following the methodology of this study, future journal rankings can then be added to our list of rankings, new average scores obtained for each journal, and new composite rankings calculated.
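Under this scheme, folding a future ranking into the composite amounts to appending one more study and recomputing, as in this continuation of the earlier sketch (the new study's journal list, ranks, and total are again hypothetical):

```python
# Hypothetical future study appended to the earlier sketch's `studies` list;
# reuses the illustrative composite_ranking function defined above.
studies.append(
    {"total": 120, "ranks": {"MIS Quarterly": 2, "Communications of the AIS": 5}}
)
updated = composite_ranking(studies)  # new average scores, new composite ranking
```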


Conclusion

The composite journal rankings smooth out differences in the methods used to rank journals and in the methods used to select journals for inclusion. We have provided a comprehensive overview of journal ranking studies over a 12-year period and a composite ranking of the top 50 journals. Our ranking is not the last word, however, as future ranking studies will certainly change these rankings.


Tables

Table. Journal abbreviations.

Table 1. Previous journal ranking studies, 1991–2003.

Table 2. Composite journal rankings for studies, 1991–2003.

Table 3. Journals and their reference disciplines.


    1. Doke, E., Rebstock, S. and Luke, R. Journal publishing preferences of CIS/MIS scholars: An empirical investigation. J. CIS 36 (1995), 49–64.

    2. Gillenson, M. and Stutz, J. Academic issues in MIS: Journals and books. MISQ 15 (1991), 447–452.

    3. Hardgrave, B. and Walstrom, K. Forums for MIS scholars. Commun. ACM 40, 11 (Nov. 1997), 119–124.

    4. Holsapple, C., Johnson, L., Manakyan, H., and Tanner, J. A citation analysis of business computing research journals. Info. and Manage. 25 (1993), 231–244.

    5. Holsapple, C., Johnson, L., Manakyan, H., and Tanner, J. Business computing research journals: A normalized citation analysis. J. MIS 11 (1994), 131–140.

    6. Mylonopoulos, N. and Theoharakis, V. Global perceptions of IS journals. Commun. ACM 44, 9 (Sept. 2001), 29–33.

    7. Peffers, K. and Tang, Y. Identifying and evaluating the universe of outlets for information systems research: Ranking the journals. J. Info. Tech. Theory and App. 5 (2003), 63–84.

    8. Walstrom, K., Hardgrave, B., and Wilson, R. Forums for management information systems scholars. Commun. ACM 38, 3 (Mar. 1995), 93–107.

    9. Whitman, M., Hendrickson, A., and Townsend, A. Academic rewards for teaching, research, and service: Data and discourse. Info. Systems Research 10 (1999), 99–109.
