Communications of the ACM

ACM TechNews

Computer Scientists Cry Foul Over Data Problems in NRC Rankings

The National Research Council's (NRC's) use of a computerized methodology to rank computing research doctoral programs, adopted at the behest of the Computing Research Association, has provoked a backlash from computer scientists. The NRC agreed to revise its research productivity measurements by counting not just journal articles but also presentations at computer science conferences, but critics say the council's report is riddled with errors. The University of Utah's Martin Berzins says that, compared against his own records, the NRC grossly miscounted faculty members' journal articles and conference presentations between 2000 and 2006.

"This is a data-based report," says the University of Washington's Henry M. Levy. "For it to have any validity, the underlying data need to be accurate." Levy cites, for example, the NRC's practice of collecting data about major scholarly awards and honors held by faculty members from scholarly societies rather than from doctoral programs directly as a significant contributor to the report's inaccuracies.

The NRC has announced a general process for assessing possible data errors in the report, but says it will likely refrain from updating any of the program rankings except in cases where the errors can be clearly traced to the project's staff.

From The Chronicle of Higher Education

Abstracts Copyright © 2010 Information Inc., Bethesda, Maryland, USA
