
Communications of the ACM

Editor's letter

Is the Image Crisis Over?


Communications Editor-in-Chief Moshe Y. Vardi

Has anything changed with regard to computing's "image crisis" in the last 17 months?

Let us first recall what the "image crisis" was. The computing field went through a perfect storm in the early 2000s: the dot-com and telecom crashes, the offshoring scare, and a research-funding crisis. After its glamour phase in the late 1990s, the field seemed to have lost its luster, and North American enrollments in undergraduate computer science programs dropped precipitously. The Computing Research Association's Taulbee Survey traced how U.S. computer-science enrollments began falling in 2001, reached 50% of the 2000 level in 2005, and then stayed flat through 2007. It seemed that high-school students had collectively decided that computing was not a promising career, and simply voted with their feet.

The enrollment plunge led to the establishment of the Image of Computing Task Force in 2005, a U.S. initiative meant to "lead a national coordination effort to expose a realistic view of opportunities in computing." In 2008, the U.S. National Science Foundation funded a joint project with the WGBH Educational Foundation and ACM to "research and design a new set of messages that will accurately portray the field of computing." Then, in March 2009, the 2007–2008 Taulbee Survey reported that freshman CS enrollments grew by almost 10% between 2007 and 2008.

So, is the "image crisis" over?

Taking a longer-term view of CS enrollments, one notices that the enrollment drop of the early 2000s came after a huge rise in enrollments in the late 1990s. In fact, CS enrollments have always been cyclical. The latest boom-bust cycle was triggered by the "Internet revolution," but an earlier boom-bust cycle, through much of the 1980s, was triggered by the "PC revolution." It is not clear what a "normal" level of CS enrollments would be. Was the "crisis" a real crisis?

There is no doubt that CS enrollments are hugely affected by the economic environment. The most recent rise in enrollments is probably tied to the recent economic crisis. While finance looked like a promising career two years ago, that gleam is now clearly tarnished; a career in computing suddenly seems much more promising than a career on Wall Street. CS enrollments will probably continue to rise for the next few years.

We should not, however, let a good "crisis" go to waste. From it we learned that the image of our field matters, and that it is not necessarily a positive one. Trying to change that image is a noble goal, though I suspect nothing short of a massive marketing campaign, at a probable cost of tens of millions of dollars, could add more color to the prevailing picture.

What we did not seem to learn is that the image of our field may be related to its reality. We are woefully ignorant about the reality of computing careers. According to the Bureau of Labor Statistics, there are close to four million information technology workers in the U.S., spread across several job categories. (Analogous information about the rest of the world is very difficult to obtain.) What we do not know is how computing graduates fit into this picture. We do not know how to measure the career outcomes or the value of a computing degree. How are CS graduates distributed across industry sectors? What are their career trajectories? Peter Freeman and William Aspray's 1999 report, The Supply of Information Technology Workers in the United States, studied the supply of and demand for IT workers in the U.S. at the macro rather than the individual level in an effort to better understand these issues. Yet today we still do not know how the quality of computing careers, measured, say, in terms of lifetime income, job stability, autonomy and self-direction, promotion opportunity, and the like, compares to that of other careers, say, in electrical or chemical engineering. Moreover, we clearly do not know how to change the image of our field as viewed by women and most minorities.

These, I believe, are huge gaps in our knowledge. I do not see how we can develop messages that will accurately portray the field of computing, if we ourselves do not have an accurate portrait of the field. Before the "image crisis" completely fades away, let us not let it go to waste.

Moshe Y. Vardi,
EDITOR-IN-CHIEF


Footnotes

DOI: http://doi.acm.org/10.1145/1592761.1592762


©2009 ACM  0001-0782/09/1100  $10.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.



Comments


Joel Malard

Is computer science really going through an image crisis? Is that not precisely what a typical teenager says, especially one with a tinge of boundary issues on the parental side?

What is so special about computer science that it should survive computers as we know them? The high-caliber computer scientists, engineers, physicists, and mathematicians I have met in my professional life can solve each other's undergraduate problems, which make up 90% of daily work. The discriminator is how their distinct intuitions lead them to solve problems.

What we have not clearly articulated as a profession, to the general public and especially to young people in middle and high school, is how computer science can help them widen, deepen, and strengthen their young minds, in short, make them taller. Why should a young, ambitious person decide to spend critical years learning techniques that may become obsolete within the next 10 years, without a clear, lasting, and tangible benefit?

In the CACM series of articles and opinions, I very much liked the interview with Shaw. For one, it showed in plain language what computer-science intuition can lead to and the enthusiasm that goes along with it. I also liked the microscope metaphor. Microscopes opened our eyes to a whole new world right at the tip of our noses, one that had existed from the very beginning and whose discovery had a huge positive impact on daily lives throughout the world. Computers made computable functions a graspable reality, and in so doing they opened our eyes to a whole new world of possibilities. But then, to carry on with the parallel, where are the microscopology departments these days? Microscopes have no core human value, they do nothing for the mind, and their study and use are scattered across multiple departments.

There is a theatrical side to the profession: these days we have the computer-science enrollment crisis; before that we had the programming crisis; and as computers become pervasive we may even have a "where is my computer science department" crisis. I hope we can articulate who we are as a profession in human terms, not merely in quantitative terms, and especially not in terms of regulating where, what, or who may publish or get funding. I do not like to say this, but in this matter may the market rule. Conferences were good to people who wanted to boost their publication records, and there is a plethora of obscure topical conferences, people claiming to review 200 papers a year, and all this nonsense. But with time departments get choosier; maybe one day they will ask what volunteer activities their aspiring faculty members got involved in, or what sports they play. Who knows. Whatever happens, common sense will eventually prevail, and the quicker so with a clear understanding of the core human values of our field.

As for my two cents' worth of predictions, if I may: microbes were discovered in the 17th century, and one still finds educated people who do not recognize that sneezing and finger licking propagate the flu more effectively than, say, looking into people's eyes. In the same vein, there are plenty of smart people who will tell you earnestly that a short O(n^2) algorithm is always better than a long-winded O(n log n) algorithm because the former is so much simpler, just make it parallel. A hundred years from now few if any will program anything in the sense we do today, but I would bet that 300 years from now there will still be a need for some computational intuition.
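To make the arithmetic behind that last point concrete, here is a back-of-the-envelope sketch; the problem size and processor count are illustrative numbers of my own choosing, not measurements of any real system. Even if the "simple" quadratic algorithm were spread perfectly across a million processors, on a billion inputs each processor would still do far more work than a single processor running the O(n log n) algorithm.

    import math

    # Back-of-the-envelope comparison (illustrative numbers): work for an
    # O(n^2) algorithm split across p processors versus O(n log n) on one.
    n = 10**9   # problem size (assumed for illustration)
    p = 10**6   # processors given to the "simple" quadratic algorithm (assumed)

    quadratic_per_processor = n**2 / p     # perfectly parallelized quadratic work
    n_log_n_total = n * math.log2(n)       # sequential n log n work

    print(f"O(n^2) across {p:,} processors: {quadratic_per_processor:.1e} ops per processor")
    print(f"O(n log n) on one processor:    {n_log_n_total:.1e} ops total")
    # With these numbers: roughly 1.0e12 versus 3.0e10, so the quadratic
    # algorithm still loses by a factor of about 30 despite a million-way
    # parallel speedup.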

