Esther Dyson argued that since the world will never be perfect, online or offline, it is foolish to expect higher standards on the Internet than we accept in real life. Legislators are now turning this argument around: they claim they must restrict traditional offline freedoms in order to regulate cyberspace.
A shocking example is an export-control bill currently before Britain’s parliament. The government’s version would enable it to impose licensing restrictions on collaborations between scientists in the U.K. and elsewhere; to review and suppress scientific papers prior to publication; and even to license foreign students at British universities. By a large majority, the House of Lords amended the bill to exclude material in the public domain and information exchanged in the normal course of academic teaching and research. It has now gone back to the House of Commons, where ministers say they will amend it back again. This fight could go on for weeks.
During the late 1990s, arms-export regulations prevented U.S. nationals from making cryptographic software available on their Web pages, or emailing it abroad. Phil Zimmermann, the author of PGP, was investigated by a grand jury for letting it "escape" to the Internet. The law was ridiculed by students wearing T-shirts printed with encryption source code ("Warning: this T-shirt is a munition!"), and challenged in the courts as an affront to free speech.
For government, there was a risk that crypto software would escape. For liberty, there was a risk we ignored at the time: that the bad policy would escape. Although the Clinton administration later abandoned its approach as unworkable, that did not stop other governments from trying to ape it. After Tony Blair was elected in 1997, he tried to take Britain down the American path; after much protest and many battles, the current bill is the result. His attempt to have a law with no embarrassing loopholes has resulted in one that is absolutely draconian. For example, someone accidentally learning the wrong type of secret can be prevented from ever leaving the U.K. (The Lords amended the bill to remove this unpleasantness; the government says it will reinstate it.) While this particular fight is mainly a matter for Brits, it is an example of a wider and worrying trend—toxic overspill from attempts to regulate the Internet.
There are many more examples. In the U.S., Hollywood’s anxiety about digital copying led to the Digital Millennium Copyright Act. This gives special status to mechanisms enforcing copyright claims: circumventing them is now an offense. So manufacturers are bundling copyright protection with even more objectionable mechanisms, such as accessory control. For example, one games-console manufacturer builds into its memory cartridges a chip that performs some copyright-control functions but whose main purpose appears to be preventing other manufacturers from producing compatible devices. There is no obvious way to reconcile the tension between competition policy and public policy on copyright.
Meanwhile, worries about cybercrime are leading to a Europe-wide arrest warrant that overturns the time-honored principle of dual criminality, that you can be extradited from one country to another only if there is prima facie evidence you’ve committed a crime according to the laws of both countries. Currently, Germany has strict hate-speech laws (Mein Kampf is a banned book), while Britain does not. Right now, I could put an excerpt from that book on my Web site in the U.K. (or the U.S.) but not in Germany. But the new arrest warrant would allow the German police to extradite me from Britain, for an offense that doesn’t exist in British law. Thus, free-speech rights may be reduced to the lowest common denominator among the signatory nations.
Why do we get so many bad laws about information? Many of them have to do (in some broad sense) with risks: with the perceived vulnerability of the Internet to hackers, credit-card thieves, and other undesirables. There is massive hype from the computer security industry; when people get fed up with hearing about hackers, the risk changes to "cyberterrorism." There are few or no balancing voices, as the interests of almost everyone involved in the security industry (vendors, government agencies, regulators, researchers) lie in talking up the threats. Journalists prefer the scare stories to the rebuttals; politicians and bureaucrats use them to build empires. After the dot-com boom, we are seeing the dot-gov version—and there’s no sign of it bursting any time soon.
We need better ways of dealing with risks realistically at the political level. Does that mean simply educating the public about risks, or do we need something else too?