When he first came up with the idea for the World Wide Web in 1989, Sir Tim Berners-Lee had trouble getting people to grasp the concept. If he gave a lecture to a room of 100 people, demonstrating how his browser/editor could jump from one document to another when he clicked on a hypertext link, he recalls, the response would be a collective "So what?"
"Maybe two or three at the back would get it. Most people wouldn't," says Berners-Lee.
Hypertext was not new. CD-ROMs had links that allowed navigation from one page of, say, an encyclopedia to another, but "people didn't understand the power of the link if it could link to everything conceivable," he says. "That's a paradigm shift, that if you click on it, it can go to anything on the planet."
It is for creating that paradigm shift—by inventing the World Wide Web, the URL naming scheme, the HTTP protocol, and the HTML markup language—that Berners-Lee has been designated to receive the 2016 Turing Award in June, the 50th time the prize will have been bestowed.
The Web has become so fundamental, he says, that it has become as difficult to imagine the Web did not exist, as it once was to imagine that it should. "The paradigm shift is impenetrable both ways," Berners-Lee says. "It was impossible to explain to people what the Web would be like then, and now when you talk to millennials they can't understand what the problem was."
Berners-Lee was working at CERN, the European Organization for Nuclear Research, in Geneva at the time. He had an undergraduate degree in physics from The Queen's College, Oxford, but no formal training as a computer scientist, although he had built his own computer and written software, and it was that combination of skills CERN needed.
He created the Web, in large part, to make his own life easier. There were perhaps 10,000 people working for CERN at the time, he says, but only about 3,000 on the actual campus; others were coming and going between there and other institutions. Berners-Lee thought it would be useful to have an online collaborative space where people could share ideas, and where people who came along later could follow the decision-making process by clicking through the links.
However, just bringing co-workers together did not seem like enough. The Web, he believed, should allow anybody anywhere to create information and link to it.
By 1989, the Internet was beginning to become generally connected, and Berners-Lee felt that linking everything to everything would spur users' creativity. "I'd been harping on about joining all information together for ages," he says. "What was critical at that point was that my boss finally let me just do it as a side-project."
That boss, Mike Sendall, could not justify the project as having a direct relation to CERN's goals. Instead, he decided it could be a good way of testing the potential of the NeXT machine, a new computer from NeXT, the company Steve Jobs founded after leaving Apple.
There were many types of computer systems in existence at the time—documentation systems, help systems, note-taking systems, paper-publishing systems—but each tended to focus on just one type of information. Berners-Lee wanted to break down the separation between those systems: "I realized the Web had to be universal. It had to be completely without attitude about what you were doing with it." It also had to work no matter the type of computer, the programming language, or the language or culture of the user.
His one requirement was that everybody in the world label everything they had with what he originally called a UDI, or universal document identifier. That would later become known as a uniform resource locator (URL), and has since been generalized to the uniform resource identifier (URI). "That's a very big 'ask,' so you can't ask anything else," he says.
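The force of that single requirement is that any resource on the planet can be named with one string, which browsers then pick apart to locate it. As an illustration, Python's standard library can split a URL into the components Berners-Lee defined (the example uses the address of the first Web page, served at CERN):

```python
from urllib.parse import urlsplit

# The address of the first Web page, hosted at CERN.
parts = urlsplit("http://info.cern.ch/hypertext/WWW/TheProject.html")

print(parts.scheme)  # the protocol used to fetch the resource ("http")
print(parts.netloc)  # the domain name of the host serving it
print(parts.path)    # the location of the document on that host
```

Because every identifier decomposes this way, a link in any document can point at any document anywhere, which is exactly the "link to everything conceivable" Berners-Lee describes above.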
Having made that "ask," he then set out to make everything else about his system easy to swallow, which led to a set of fairly arbitrary design choices as he was creating Hypertext Markup Language, such as deciding whether to use round or square brackets and which type of slash to use.
"Whether those slashes were forward slashes or back slashes didn't affect how the Web worked," he says, "but it does affect how other developers react to it, so the trick was to use design languages that they already used."
Berners-Lee made Hypertext Transfer Protocol (HTTP) look like other Internet protocols, such as Simple Mail Transfer Protocol (SMTP) and Network News Transfer Protocol (NNTP). He designed Hypertext Markup Language (HTML) to look like Standard Generalized Markup Language (SGML). He felt it would be more logical if domain names were listed in descending order of hierarchy (org.acm.cacm, for instance), but the Domain Name System already existed, so he took it as it was.
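Both borrowings are easy to see in a few lines of code. An HTTP request is plain, line-oriented text in the style of SMTP and NNTP commands, and the descending-hierarchy ordering Berners-Lee would have preferred for domain names is simply the existing DNS name with its labels reversed. A sketch, using cacm.acm.org purely as an illustrative name:

```python
# An HTTP/1.0 request is human-readable, line-oriented text,
# much like the command dialogs of SMTP or NNTP.
request = "GET /index.html HTTP/1.0\r\nHost: cacm.acm.org\r\n\r\n"

def descending(domain: str) -> str:
    """Rewrite a DNS name in descending order of hierarchy,
    the ordering Berners-Lee says he would have preferred."""
    return ".".join(reversed(domain.split(".")))

print(descending("cacm.acm.org"))  # org.acm.cacm
```

Reusing familiar syntax in this way was deliberate: developers who already knew SMTP-style protocols and SGML-style markup could adopt the Web's formats with almost no learning curve.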
In 1990, Paolo Palazzi, a physicist at CERN, tried to tempt Berners-Lee into joining his Programming Techniques Group. Instead, Berners-Lee showed him the Web project he was working on. Palazzi says he saw the problem Berners-Lee was trying to solve, but did not fully understand his proposed solution. Nonetheless, he thought the project was worthy of support. "Truly innovative ideas cannot really be grasped, so you have to trust the person proposing it," he says.
Berners-Lee was someone he trusted. "He had a special way of going about solving problems," Palazzi says. "Tim is one of this class of people who have peculiar abilities at inventing or discovering."
It was the way he combined existing ideas—hypertext and Internet protocols—that was innovative, Palazzi says. HTML, HTTP, and URIs were inventive in themselves, but, Palazzi says, "The way they combine together is, I think, a stroke of genius."
By 1994, the Web had progressed from a small research project to a global phenomenon, with companies such as IBM adopting it and new companies such as Netscape, the first browser company, being created. Berners-Lee moved to the computer science department at the Massachusetts Institute of Technology (MIT) and founded the World Wide Web Consortium, an international group that developed standards for the Web. In 2008, he was named the 3Com Founders Professor of Engineering at MIT. In 2016, he became a professor in the computer science department at Oxford University, although he is also still working at MIT.
The Turing Award is the latest in a long list of honors recognizing the work of Berners-Lee. In 2004, he was knighted by Queen Elizabeth, who also awarded him the Order of Merit in 2007. He is a fellow or member of many professional organizations, including the Royal Society and the National Academy of Sciences, and has been given medals by groups ranging from the Institute of Physics (IOP) to the United Nations Educational, Scientific, and Cultural Organization (UNESCO).
The Turing Award includes a prize of $1 million, with financial support provided by Google. Berners-Lee has not yet made plans for what he will do with that sum.
In 2008, Berners-Lee founded the World Wide Web Foundation, a nonprofit organization promoting access to the Web for all. "That was about recognizing that there was a duty that the haves have to the have-nots, to try and get as many people to have as much access as possible," he says. The Alliance for Affordable Internet, a project of the Foundation, seeks to drive down the price of broadband access so people in developing countries can access the Web.
Open access continues to be an issue for the Web, Berners-Lee says. "Whenever people ask me the question, 'What's your biggest fear?', it's always been that some one entity, either commercial or political, should control the Web. That would be the death of it." Net neutrality is important, he says; a service provider should not be able to control what content its customers see.
He also worries about governments either blocking access to certain websites, or worse, using the Web to track which websites users visit, then punishing people based on that information.
"It's also something which countries like America and the U.K. have to be very careful with, make sure they don't slip into the fear of terrorism. The war on terrorism is being used as an excuse for many things, but one of them can be taking away people's fundamental rights to communicate." He cites recent calls by the U.K. government, in the wake of the terror attack in London in March, to be given backdoor access to applications such as WhatsApp for fear terrorists are using them to coordinate attacks.
At MIT, Berners-Lee is co-director of the Decentralized Information Group, which is working, in his words, on "redecentralizing" the Web to burst some of the "filter bubbles" people have created for themselves on social media.
"Because the Internet didn't have countries as a thing, [some people] hoped that people would learn to just break down cultural barriers and it would lead to love and understanding across borders, and world peace. And it didn't," he says. The group is pursuing the notion that perhaps the right software could help realize that utopian vision.
Another project Berners-Lee currently is working on seeks to give people greater control over their data, such as where it is stored and what other people and applications have access to it.
For young computer scientists looking to have an impact on the world, Berners-Lee recommends ignoring conventional wisdom and following their own instincts.
"You should feel free to develop something for yourself, because it seems to appeal to you, scratches an itch that you have," he says. "To a certain extent, one has to beware of asking the users what they want, because the things which they would find really exciting, they can't imagine."
©2017 ACM 0001-0782/17/06
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and full citation on the first page. Copyright for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or fee. Request permission to publish from firstname.lastname@example.org or fax (212) 869-0481.
The Digital Library is published by the Association for Computing Machinery. Copyright © 2017 ACM, Inc.