If there was ever a creator of wealth on a fantastic scale, ever a changer of custom and social values, ever a determinant of where our culture is headed and why, it's Moore's Law. Gordon Moore didn't so much invent his law as observe it: the apparently inexorable increase in the number of transistors that could be etched on the same silicon wafer. As a rule of thumb, Moore's Law says computers will either double in power at the same price or halve in cost for the same power every 18 months. It's an insidious effect, one that has increased silicon density by a billion times in the last 50 years, and in the process placed more computing power in my wristwatch than was used to win World War II. But now some people predict Moore's Law is in danger of being repealed.
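That compounding runs faster than intuition suggests. A back-of-the-envelope sketch in Python, assuming a clean, uninterrupted 18-month doubling period (my simplification, not a claim of the column's):

```python
# Back-of-the-envelope check of Moore's Law compounding.
# Assumption (mine): a steady doubling every 18 months.

def growth_factor(years: float, doubling_months: float = 18.0) -> float:
    """Total multiplier after `years` of periodic doubling."""
    doublings = years * 12.0 / doubling_months
    return 2.0 ** doublings

print(f"{growth_factor(50):.2e}")  # 50 years: roughly 10^10
print(f"{growth_factor(45):.2e}")  # 45 years: roughly 10^9, i.e. a billionfold
```

Under this idealized schedule, a billionfold increase corresponds to about 45 years of doubling, so the column's "billion times in 50 years" is the right order of magnitude.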
The technical arguments have to do with the photolithography process used to make silicon chips and how the lines etched using this process are becoming thinner than the wavelengths of light used to create them. Plotting into the future, that curve representing silicon density just might hit a wall, we're told, around 2008 or 2010. Well, this certainly isn't Silicon Valley thinking at work: the whole essence of that place is finding clever cheats around seemingly insurmountable problems such as this.
I haven't any idea how Moore's Law will be granted a life extension, but I'm sure it will be, and I'm equally sure it will come at a cost.
But what if the naysayers are correct? What if Moore's Law is suddenly no longer in effect, or more correctly, is no longer in effect for those parts of society unwilling to pay $1,300 for a toilet seat or $900 for a hammer? I think it would be a wonderful thing.
Of course this makes me a heretic, a Luddite, a guy who would apparently turn his back on the very source of the stock options and human genomes and Game Boys that have so come to define our modern life. But that's not my point at all. I just think there is much to be gained from slowing down.
For just one example, look at the role of computers in schools. A textbook has a useful life of 10 years, a desk lasts 25, but a PC is scrap in three years. Which of these things costs the most? No wonder we are unable to put computers to good use in schools: the economics simply don't work. But what if Moore's Law did fade from the land and suddenly a PC could labor away for 25 years? Then every child would have a desk and a computer. Perhaps the desk would be the computer. And would technical innovation cease? No. Haiku is limited to 17 syllables, yet still there are poets.
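The economics can be made concrete with simple straight-line amortization. In the sketch below, the useful lifespans are the ones given above; the dollar prices are hypothetical round numbers of my own, chosen purely for illustration:

```python
# Annualized cost of classroom equipment over its useful life.
# Lifespans are from the text; prices are hypothetical placeholders.

def annual_cost(price: float, useful_life_years: float) -> float:
    """Straight-line amortization: purchase price spread over useful life."""
    return price / useful_life_years

items = {
    "textbook ($50, 10-year life)": annual_cost(50, 10),
    "desk ($250, 25-year life)": annual_cost(250, 25),
    "PC ($900, 3-year life)": annual_cost(900, 3),
    "PC ($900, 25-year life)": annual_cost(900, 25),
}
for item, cost in items.items():
    print(f"{item}: ${cost:.0f}/year")
```

Whatever prices you plug in, the short replacement cycle dominates: a PC scrapped every three years costs an order of magnitude more per year than one that lasts as long as the desk it sits on.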
We have become so used to the idea of throwing away our programming efforts every year or two that "long term" has been redefined into absurdity. If Moore's Law were gone, all that would change, and we might find ourselves building data structures of enduring quality. I'm not saying we'll return to the age of building great cathedrals, but at least we'll have the right building materials should we decide to.
Figure. Ghostly reflections in the Pleiades star cluster. Actually an interstellar cloud in the process of destruction by strong radiation from a nearby hot star, as snapped by the Hubble telescope. George Herbig and Theodore Simon, Institute for Astronomy, University of Hawaii. Courtesy NASA and The Hubble Heritage Team (The Space Telescope Science Institute operated by the Association of Universities for Research in Astronomy, Inc. for NASA, under contract with the Goddard Space Flight Center, Greenbelt, MD). The Hubble Space Telescope is a project of international cooperation between NASA and the European Space Agency.
The Digital Library is published by the Association for Computing Machinery. Copyright © 2001 ACM, Inc.