Architecture and Hardware News

Rewarded for RISC

ACM A.M. Turing Award recipients David Patterson and John Hennessy developed the "dangerous" idea that processor instructions should be simpler so programs could execute more quickly, which evolved into the Reduced Instruction Set Computer architecture.
John Hennessy and David Patterson

It was the early 1980s, and microprocessors were making the transition from laboratory curiosity to commercial product. To make them work, many computer scientists were trying to copy the same complex instructions used in mainframe computers. In fact, some wanted to expand those instructions, trying to get the buggy software of the era to work better.

However, two young professors had a different suggestion. “John and I come along and say absolutely the opposite. Not only should we not make it more complicated, we should make it even simpler,” says David Patterson, who at the time was an assistant professor of computer science at the University of California, Berkeley. “We weren’t just criticizing the trend. We were making an argument that people thought was dangerous, and was just going to make software fail more.”

That “dangerous” idea promoted by Patterson and John Hennessy, then an assistant professor of computer science at Stanford University, was the Reduced Instruction Set Computer (RISC) architecture, which relied on a simpler collection of general functions the processor would perform, shrinking the number of transistors needed to carry out a task. Today, 99% of the more than 16 billion microprocessors produced each year are based on RISC architecture, powering smartphones, tablets, and the Internet of Things, and earning Hennessy and Patterson the 2017 ACM A.M. Turing Award.

It wasn’t that the number of instructions was necessarily smaller, Hennessy says; it was that they were less complex. “We said it’s not how big the program is, it’s how fast the program runs,” he says. Programs on those early machines had perhaps 25% more instructions under the RISC design, but ran them five times as fast. The advantage of the approach was so great, Patterson says, that processors designed that way could outperform those from talented designers using older methods. “The Berkeley RISC and Stanford MIPS processors designed by grad students were probably better microprocessors than what Intel could build,” he says. In fact, Patterson’s first prototype, which had 44,000 transistors, outperformed a 100,000-transistor device made the conventional way.
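The trade-off Hennessy describes can be sketched with the standard quantitative model of processor performance that the two later popularized in their textbook: execution time is instruction count times cycles per instruction times cycle time. The numbers below are purely illustrative, not measurements from the actual Berkeley or Stanford machines; the "25% more instructions" figure is from the article, and the per-instruction speedup is an assumption chosen to match it.

```python
# Quantitative model of execution time: instructions * CPI * cycle time.
def run_time(instruction_count, cycles_per_instruction, cycle_time_ns):
    """Total execution time in nanoseconds."""
    return instruction_count * cycles_per_instruction * cycle_time_ns

# Hypothetical program: the RISC version needs ~25% more instructions
# (as the article says of those early machines), but each simple
# instruction completes in one-fifth the cycles of a complex one.
cisc_time = run_time(1_000_000, 5.0, 10.0)  # fewer, complex instructions
risc_time = run_time(1_250_000, 1.0, 10.0)  # 25% more, but simpler ones

print(cisc_time / risc_time)  # the RISC program finishes 4x sooner
```

The point of the model is exactly Hennessy's quote: a bigger program can still be the faster program, because total time depends on all three factors, not on program size alone.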


“We said it’s not how big the program is, it’s how fast the program runs,” Hennessy recalls.


MIPS—Microprocessor without Interlocked Pipeline Stages—was Hennessy’s RISC architecture, which sped up processing by operating only on data loaded from memory into registers, which can be accessed much faster. Hennessy founded a semiconductor design company, MIPS Computer Systems, in 1984 to commercialize the technology, spurred in part by doubts from the computer industry that the approach would work in the real world. Around the same time, Bill Joy of Sun Microsystems in Santa Clara, CA, became interested in RISC, and brought Patterson on as a consultant.
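The register-centered organization behind MIPS is usually called a load/store design. A minimal sketch of the idea follows; this is not real MIPS assembly, just a toy register machine (invented here for illustration) showing how a single complex memory-to-memory "add" decomposes into simple steps that touch memory only through loads and stores.

```python
# Toy load/store machine: arithmetic works only on registers;
# memory is touched solely by explicit loads and stores.
memory = {100: 7, 104: 5, 108: 0}  # word addresses -> values
regs = [0] * 4                     # a small register file

def load(rd, addr):  regs[rd] = memory[addr]           # memory -> register
def store(rs, addr): memory[addr] = regs[rs]           # register -> memory
def add(rd, ra, rb): regs[rd] = regs[ra] + regs[rb]    # registers only

# A CISC-style "add memory cell 100 to cell 104, store at 108"
# becomes four simple, fast, easily pipelined steps:
load(1, 100)
load(2, 104)
add(3, 1, 2)
store(3, 108)

print(memory[108])  # 7 + 5 = 12
```

Because every instruction does one small thing, each can complete quickly and the hardware to implement it stays simple, which is where the transistor savings the article describes come from.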

Both projects were successful, which won over the skeptics. “When people start making money, it’s hard to argue that it doesn’t work,” Patterson says.

The men, of course, were academics as well as processor designers, and they were unsatisfied with how computer architecture was taught. “We were really unhappy with the quality of textbooks,” says Hennessy. “They were descriptive and comparative, but not in a numeric and quantitative fashion.” So they developed ways to measure whether a new processor design was an improvement, using metrics such as cost and speed. In 1989, they published their textbook, Computer Architecture: A Quantitative Approach. A year later, they published a version for undergraduates. Both are now in their sixth edition.

Garth Gibson, a professor of computer science at Carnegie Mellon University in Pittsburgh, PA, was a student in Patterson’s lab in the mid-1980s. “The book became a very good way for a lot of people to understand how computer architecture works and do a very good job at it,” Gibson says.

RISC has evolved as well, though many of the basics remain the same. The current version is RISC-V, an open source instruction set architecture allowing designers to add extensions that optimize a particular application. Such domain-specific architectures are the only option for improving performance as the benefits of Moore’s Law start to fade. “The circuits aren’t getting faster any more, so you’re going to have to change the instruction set architecture,” Patterson says with an air of excitement.

The impending end of Moore’s Law, and the demise of Dennard scaling, which said that as transistors get smaller their power density stays constant, are imposing new limitations on architecture, Hennessy says, and that makes RISC even more important. “Before, when transistors were doubling every two years or so, if you waste a few transistors, who cares? But when they’re not going up that fast, then efficiency becomes very important,” he says.
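Dennard's observation can be stated as a back-of-the-envelope scaling argument (a standard textbook form, not taken from the article): dynamic power goes as capacitance times voltage squared times frequency, and when linear dimensions and voltage shrink by a factor $\kappa$ while frequency rises by $\kappa$, power per transistor and transistor area fall together, leaving power density unchanged:

```latex
P \propto C V^2 f, \qquad
C \to \frac{C}{\kappa}, \quad V \to \frac{V}{\kappa}, \quad f \to \kappa f
\;\Rightarrow\;
P \to \frac{P}{\kappa^2}, \qquad A \to \frac{A}{\kappa^2},
\qquad \frac{P}{A} \to \frac{P}{A}.
```

Once voltage can no longer be lowered with each generation, power density climbs instead of holding steady, which is why, as Hennessy says, wasted transistors now carry a real cost.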

Patterson officially retired from Berkeley in 2016, although he still works there part-time. He also works at Google, developing domain-specific architectures for machine learning. Hennessy was president of Stanford from 2000 to 2016, and now directs the Knight-Hennessy Scholars program, which provides full scholarships for graduate students from around the world to study at Stanford. The program has just accepted its first class of 49 students from 16 countries.

Hennessy says the quantitative and engineering approach he brought to computer architecture served him well as president when the financial crisis hit in 2008 and Stanford, like other schools, lost a quarter of its endowment. He decided the university had to cut its budget and do layoffs in one fell swoop, rather than parceling out the pain over time as many other institutions did. The next year its finances had stabilized, and Stanford was able to hire faculty and recruit new graduate students again. “We had one harrowing year, and then things were better,” he says.

The men will share the Turing Award’s $1 million prize, supported by Google. Hennessy, who says he’s been fortunate in life, plans to donate his half to charity. Patterson, with several children and grandchildren who will need to pay college tuition, says he’ll invest his in education as a consumer.

The two came to computing along different paths. Patterson stumbled into it; during his senior year as a math major at the University of California, Los Angeles, a math class he meant to take was cancelled, so instead he took a course in programming with Fortran. The experience grabbed him, and he went on to earn an M.S., then a Ph.D., from UCLA in computer science. He applied for a job at Berkeley and, when he had not heard a response, his wife urged him to call and ask, which got him an interview and, eventually, a job.

Hennessy was interested in computers in high school, and as a science project built a machine that used relays to play tic-tac-toe. “It both won me a prize and it also helped me win my wife, because it impressed her family sufficiently that they thought ‘well, maybe this guy’s going to be okay’,” he says. He earned a B.S. in electrical engineering from Villanova University, and an M.S. and Ph.D. in computer science from the State University of New York at Stony Brook.

Both encourage young computer scientists to take risks as they did. “It wasn’t clear that that was the safest path to tenure, to rock conventional wisdom, but we believed in what we were doing, and that worked out pretty well,” Patterson says.

Hennessy agrees, “You have to be a little fearless, willing to take some chances and work on things that are a little contrarian.”

Figure. Watch Patterson and Hennessy discuss their work in this exclusive Communications video. https://cacm.acm.org/videos/2017-acm-turing-award
