There have been increasingly urgent warnings from Bill Joy (cofounder of Sun and principal developer of the Java programming language) and others on the impending dangers from emerging self-replicating technologies. The day is close at hand when it will be feasible to create designer genetically altered pathogens in college laboratories. After that, we’ll have to contend with self-replicating entities created through nanotechnology, the field devoted to manipulating matter on the scale of individual atoms. Although nanoengineered self-replicators are at least one, and probably more than two, decades away, the specter can be described as that of an unstoppable nonbiological cancer. Finally, if we manage to get past these first two perils, we’ll encounter robots whose intelligence will rival and ultimately exceed our own. Such robots may make great assistants, but who’s to say that we can count on them to remain reliably friendly to mere humans?
In 3001 the disparity of wealth is astronomical. Beyond the Eden parks, the Shangri-las, and the New Jerusalems, there are the 40 billion have-nots, recessed to the barren places of the planet. But there is plenty to eat for all, and there is a cure for all things physical; we will have to wait until the year 4001 for a cure for the spirit.
—Jean-François Podevin, illustrator
These dangers are real, but Joy’s call for relinquishing the pursuit of knowledge and the advance of technology in broad fields such as nanotechnology is not the answer. Technology has always been a double-edged sword. We don’t need to look any further than today’s technology for evidence. Take biotechnology. We have already seen substantial benefit: more effective AIDS treatments, human insulin, and many others. In the years ahead, we will see enormous gains in overcoming cancer and many other diseases, as well as greatly extending human longevity—all presumably positive developments (although even these are controversial). On the other hand, the means will soon exist in a routine biotechnology laboratory to create a bioengineered pathogen that could be more destructive than an atomic bomb.
If we imagine describing the dangers that exist today (enough nuclear explosive power to destroy all mammalian life, just for starters) to people who lived a couple of hundred years ago, they would think it mad to take such risks. On the other hand, how many people in the year 2000 would really want to go back to the short, brutish, disease-filled, poverty-stricken, disaster-prone lives that 99% of the human race struggled through a couple of centuries ago? We may romanticize the past, but up until fairly recently, most of humanity lived extremely fragile lives where one all too common misfortune spelled disaster. Substantial portions of our species still live in this precarious way, which is at least one reason to continue technological progress and the economic enhancement that accompanies it.
People often go through three stages in examining the impact of future technology: awe and wonderment at its potential to overcome age-old problems, then a sense of dread at a new set of grave dangers that accompany these new technologies, followed by the realization that the only viable and responsible path is to set a careful course that can realize the promise while managing the peril.
Bill Joy eloquently describes the plagues of centuries past, and how new self-replicating technologies, such as mutant bioengineered pathogens and “nanobots” run amok, may bring back long-forgotten pestilence. It is also the case, as Joy acknowledges, that technological advances, such as antibiotics and improved sanitation, have freed us from the prevalence of such plagues. Suffering in the world continues and demands our steadfast attention. Should we tell the millions of people afflicted with cancer and other devastating conditions that we are canceling the development of all bioengineered treatments because there is a risk these same technologies may someday be used for malevolent purposes? Having asked the rhetorical question, I realize there is a movement to do exactly that, but I think most people would agree that such broad-based relinquishment is not the answer.
The continued opportunity to alleviate human distress is one important motivation for continuing technological advancement. Also compelling are the economic gains that will continue to accelerate in the decades ahead. The continued acceleration of many intertwined technologies provides roads paved with gold (I use the plural here because technology is clearly not a single path). In a competitive environment, it is an economic imperative to go down these roads. Relinquishing technological advancement would be economic suicide for individuals, companies, and nations.
This brings us to the issue of relinquishment, Bill Joy’s most controversial recommendation and personal commitment. I do feel that relinquishment at the right level is part of a responsible and constructive response to these genuine perils. The issue, however, is exactly this: at what level are we to relinquish technology?
Ted Kaczynski (aka the Unabomber) would have us renounce all of it. This, in my view, is neither desirable nor feasible, and the futility of such a position is only underscored by the senselessness of Kaczynski’s deplorable tactics.
Another level would be to forgo certain fields, nanotechnology for example, that might be regarded as too dangerous. But such sweeping strokes of relinquishment are also untenable. Nanotechnology is simply the inevitable end result of a persistent trend toward miniaturization that pervades all of technology. It is far from a single, centralized effort, but is being pursued by a myriad of projects with many diverse goals.
Most importantly, abandonment of broad areas of technology will only push them underground where development would continue unimpeded by ethics and regulation. In such a situation, it would be the less stable, less responsible practitioners (the terrorists, for example) who would have all the expertise.
I do think that relinquishment at the right level needs to be part of our ethical response to the dangers of 21st century technologies. One salient and constructive example is the set of ethical guidelines proposed by the Foresight Institute, founded by nanotechnology pioneer Eric Drexler. The guidelines ask that nanotechnologists agree to relinquish the development of physical entities that can self-replicate in a natural environment. Another is a ban on self-replicating physical entities that contain their own codes for self-replication. In what nanotechnologist Ralph Merkle calls the “Broadcast Architecture,” such entities would have to obtain such codes from a centralized secure server, which would guard against undesirable replication. The Broadcast Architecture is impossible in the biological world, which represents at least one way in which nanotechnology can be made safer than biotechnology.
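The logic of the Broadcast Architecture can be illustrated with a minimal sketch, written in Python purely for concreteness. The class names, the kill-switch flag, and the request protocol are all hypothetical illustrations of the idea, not any real design: a replicator carries no replication code of its own, so denying it the code from the central server halts replication entirely.

```python
# Hypothetical sketch of the "Broadcast Architecture" idea: replicators hold
# no replication code themselves and must request it from a central server,
# which can refuse, stopping any runaway replication at its source.

class CentralServer:
    """Secure server that decides whether to issue replication codes."""
    def __init__(self):
        self.replication_allowed = True  # a global kill switch

    def request_code(self, replicator_id):
        # Refuse every request once replication has been shut off.
        if not self.replication_allowed:
            return None
        return f"CODE-for-{replicator_id}"


class Replicator:
    """An entity that cannot self-replicate on its own authority."""
    def __init__(self, ident, server):
        self.ident = ident
        self.server = server

    def try_replicate(self):
        code = self.server.request_code(self.ident)
        if code is None:
            return None  # replication denied by the central authority
        return Replicator(self.ident + 1, self.server)


server = CentralServer()
r1 = Replicator(1, server)
r2 = r1.try_replicate()   # succeeds while the server permits it
server.replication_allowed = False
r3 = r2.try_replicate()   # now denied: the chain of replication stops
```

The safety property lives entirely in the server: because no `Replicator` contains its own code, flipping one flag at the center halts every descendant, which is exactly what a biological pathogen does not allow.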
Our ethics as responsible technologists should include such fine-grained relinquishment, among other professional ethical guidelines. Other protections will need to include oversight by regulatory bodies, the development of technology-specific immune responses, as well as computer-assisted surveillance by law enforcement organizations. Many people are not aware that our intelligence agencies already use advanced technologies such as automated word spotting to monitor a substantial flow of telephone conversations. As we go forward, balancing our cherished rights of privacy with our need to be protected from the malicious use of powerful 21st century technologies will be one of many profound challenges. This is one reason that the issue of an encryption trap door (in which law enforcement authorities would have access to otherwise secure information) has been so contentious.
As a test case, we can take a small measure of comfort from how we have dealt with one recent technological challenge. There exists today a new form of fully nonbiological self-replicating entity that didn’t exist just a few decades ago: the computer virus. When this form of destructive intruder first appeared, strong concerns were voiced that as they became more sophisticated, software pathogens had the potential to destroy the computer network medium they live in. Yet the immune system that evolved in response to this challenge has been largely effective. Although destructive self-replicating software entities do cause damage from time to time, the injury is but a small fraction of the benefit we receive from the computers and communication links that harbor them.
One might counter that computer viruses do not have the lethal potential of biological viruses or of destructive nanotechnology. Although true, this strengthens my observation. The fact that computer viruses are not usually deadly to humans only means that more people are willing to create and release them. It also means our response to the danger is much less intense. Conversely, when it comes to self-replicating entities that are potentially lethal on a large scale, our response on all levels will be vastly more serious.
Technology will remain a double-edged sword, and the story of the 21st century has not yet been written. It represents vast power to be used for all humankind’s purposes. We have no choice but to work hard to apply these quickening technologies to advance our human values, despite what often appears to be a lack of consensus on what those values should be.