ACM Fellow Juris Hartmanis, recipient of the 1993 A.M. Turing Award with Richard E. Stearns, has made fundamental contributions to theoretical computer science, particularly in the area of computational complexity, for over 50 years. After earning a Ph.D. in Mathematics from Caltech, he developed the foundations of this new field first at General Electric Research Laboratory and then at Cornell University. He says "Computational complexity, the study of the quantitative laws that govern computation, is an essential part of the science base needed to guide, harness, and exploit the explosively growing computer technology."
Noted historian and Communications Viewpoints board member William Aspray conducted an extended oral interview of Hartmanis in his Cornell University office in July 2009. The complete transcript of this interview is available in the ACM Digital Library; presented here is a condensed and highly edited version designed to whet your appetite.
I was born on July 5, 1928 in Riga, the capital of Latvia, and was very fortunate to have been born into a prominent Latvian family. My father was the Chief of Staff of the Latvian army, and my early childhood was secure, pleasant, and interesting. I attended the excellent French Lycee there, expecting to follow my father into a military career. I am surprised how much motivation and insight they gave besides teaching the basic subject matters.
That good life unfortunately changed when I was about 12 years old. The Soviets occupied Latvia, and in the winter of 1940 my father was arrested. We did not know for years what happened to him. Only after the Soviet Union collapsed in the 1990s and their archives were opened did we find out that my father had been taken to Moscow, tried, convicted, and executed. The Soviet occupation really was very, very brutal.
When Riga fell to the Soviets in 1944, we left by ship and moved to the university town of Marburg an der Lahn in Germany, and I enrolled in a German high school. But this was 1944 and there wasn't much of a school year; every time Allied bombers approached there were air raid alerts, and everybody had to proceed to air raid shelters. By 1945 everybody knew the war was lost, and German soldiers surrendered en masse in a disciplined manner.
When the war was over I attended Latvian high schools, first in the English occupation zone, then in the American zone. Travel was very difficult. There were bombed-out bridges where trains had passed over valleys; trains now stopped on one side and you carried your luggage down the valley and up the other side where another train was supposed to be waiting.
Germany at that time was really a sad place and life in general was difficult. But the schoolteachers were superb. They were highly educated and motivated, and often were professors from the Latvian university. I was inspired to continue my education. I studied physics for two-and-a-half years at the University in Marburg, and enjoyed it immensely.
These were hard times, sad times. Just about every institute had been stripped of much of its equipment during the war. But there is a certain advantage to having to improvise and overcome difficulties. I did not possess any physics textbooks (they were very hard to find, and anyway I would not have had the money to buy them), so I compensated by taking very detailed notes in lectures. I got good grades and strong recommendations from the professors.
Though I felt quite comfortable with the language and enjoyed my studies, Germany wasn't our country. Not only that, the outlook for a successful career and life in Germany did not look very promising. Among Latvians in Germany, the top preference was to emigrate to America or Canada. My mother had friends from Latvia who had arrived in the States earlier and lived in Independence, Missouri, so that's where we went.
After our arrival in Independence I worked for Gleaner Harvester Company, which built agricultural combines. When production was cut back and the new hires were let go, I went to work for Sheffield Steel in Kansas City as a steel worker. I even became a union member. It was interesting.
During the summer I went to the University of Kansas City. They gave me a credit of 128 units for my five semesters in Marburg, and decided that I had the equivalent of a bachelor's degree. They also gave me a scholarship and admitted me as a graduate student, which surprised me since I had studied in Marburg for only two-and-a-half years.
I was delighted to be accepted as a graduate student, but there was a problem: there was no graduate program in physics. There was a graduate program in mathematics, so they said, "You will be completely happy studying mathematics." And I was.
My mother and I lived with and worked for a very famous physician. I was a butler and served dinner every day. I washed his two cars: the 12-cylinder Lincoln and the 8-cylinder Cadillac. I drove his wife's old mother to church every Sunday. Things like that.
I was a straight-A student. The University of Kansas City at that time was in no sense an elite university, but it had a respectable mathematics department and it was a good place to study and improve my English.
After I received my master's degree at the University of Kansas City I applied to several universities, among them Caltech.
I was told that Caltech really was an elite school, one of the best schools in physics and mathematics. Not only that, Caltech gave me $90 to get there, and offered a teaching assistant's job. They read my situation very well. So my mother and I left Kansas City and drove my first car across the country to Caltech.
Caltech admitted me not really knowing what I wanted to do, after having first done physics in Germany and then mathematics in Kansas City. They said "You look like an applied mathematician. We do not have an applied mathematics track, but we believe you will be perfectly happy doing mathematics." Since I'd never had a course in applied mathematics I said "Okay," and my fate was decided.
Caltech was first-class. It just bubbled with creativity, with scientific work. Nobel Prize winners were running all around: Linus Pauling, Carl Anderson of cosmic rays, Millikan of the famous measurement of the electron charge with the oil drop experiment. Richard Feynman, who received the Nobel Prize in 1965, was one of the most popular of the teachers and scientists; physics graduate students just adored him. Sometime during my stay there, James Watson of DNA fame showed up. There was such intense research activity that you almost felt "I haven't done anything today. I should. Everybody else is producing great results."
I enjoyed my stay at Caltech very much. California was a great place, and life in Pasadena was exciting. There was Hollywood, the beaches, the mountains, the Rose Parade; life was just beautiful. I enjoyed it even more when I met Elly Rehwald, who is now my wife; just recently we celebrated our 50th wedding anniversary.
I do not recall exactly how I selected or was selected to work with Professor Robert Dilworth, an algebraist who worked in lattice theory. After I passed my qualifying exams Dilworth suggested I work on the partition lattice embedding problem: showing that partition lattices contain any other lattice as a sublattice.
Well, I worked on it. It was interesting, but it was hard. I realize now that the partition lattice embedding theorem was a first-class problem that after decades had not been solved.
I struggled with it, and finally said, "Let's think around it. Let's look at the terrain around the problem." I invented generalized partitions, and realized they behaved like lines in geometry. There is at least some intuition about lines and subspaces. I started developing techniques to solve the embedding problem for generalized partition lattices.
I did not keep a log, so I do not know exactly when I started working on this problem and when I solved it. My recollection is that it was a very intensive period of research. I gained real insights about the lattice of geometries and solved the embedding problem. I gave a seminar on my results that was very well received. Dilworth urged me to write up my results, and I was told that I had my thesis.
This research experience taught me a tremendous amount. First, how to do research; that thinking about the whole intellectual terrain around problems is important. It is like climbing mountains: there may be different ways to the top than the steepest slope.
"Caltech admitted me not really knowing what I wanted to do."
I also discovered, during my education and early research experience, that you do not have to know that much to do research. With a good background education, one can very quickly absorb the necessary specialized knowledge for a specific research area. I think what Caltech really taught me was to be independent, and to try to go to the essence of a problem.
During my last year at Caltech, Professor Bob Walker from Cornell visited. After discussions with Dilworth and on his recommendation, my friend Johnny Johnson and I were, on the spot, given verbal offers to come to Cornell as instructors of mathematics. In those days, you did not start as an assistant professor; you started as an instructor. Without having seen Cornell, both of us accepted the offers.
The Cornell campus is charming. We were immensely taken by the beauty of the surrounding area and the campus itself. In the math department we were very junior members and started by teaching calculus. It was a very friendly environment. We easily made friends and really felt like apprentice professors.
Dick Shuey, the General Electric Research Lab manager in nearby Schenectady, was a very far-sighted person who had convinced GE that they should become a great computer company, and that a science base was needed for that. He traveled around finding people to do research in "information," which we quickly translated as computer science. He was immensely impressed by Shannon's work, and so was I. Shuey visited me in the math department and convinced me to take a summer job.
The research traditions there are old, and the laboratory has a string of successes. The information section was still quite small, and so any newcomer was very welcome. By the end of the summer I published a paper on linear coding networks in the IRE (later IEEE) Transactions on Circuits. The experience was very encouraging (in such a short time I had managed to produce a refereed paper that was accepted), and it opened up some new areas for me to think about.
That period was almost ideal. The computer science research people were quite free; I never had any explicit orders to work on particular problems.
Under Shuey's guidance, one of the very early papers I studied was Shannon and Weaver's book on information theory, Mathematical Theory of Communication. It impressed me immensely that from what seemed to me vague concepts such as channel capacity and coding for error-correcting codes, one could make a mathematical theory that guides you in how to communicate and gives you quantitative measures of what is the best you can do with a certain channel and with a certain information source. Unfortunately, in my attempts to apply it to computing there was no success. I played with information theoretic concepts, and I wrote a paper about entropy, but that was exploratory work that did not lead to very much.
Our job was just research, full-time, with no teaching obligations. We read whatever we could lay our hands on that said something about computing.
At Caltech I had not been exposed to concepts of computability, undecidability, or Turing machines. At the lab we read all those classic papers, and I realized how absolutely beautiful those ideas are. I had large photographs of Shannon, Turing, and von Neumann in my office; those were the people I admired and thought to be the pioneers in computer science.
Dick Stearns and I started working together during the summer and hit it off really well. When he finished his Ph.D. and joined the research lab as a research mathematician, we worked day in and day out together, sitting and staring at each other in his office or mine, shouting and hollering about the other's ignorance for not understanding the subtlest points of computability. We did a lot of work on finite automata, particularly in decomposing finite automata into smaller ones from which they could be realized. In 1966 we published a book summarizing our work, Algebraic Structure Theory of Sequential Machines.
When one looks at the early years of theoretical work in computer science, there was a lot of switching theory, which really was how to design circuits. We worked on code assignment, optimal naming of finite automata states, and related problems. But soon the research moved away from finite devices and started exploring formal languages, push-down automata, Turing machines, design and analysis of algorithms, computational complexity, and so forth. This was a turn to more realistic models and problems about real computation.
In the very early 1960s, when we had come to understand Turing's work really well, we read a paper by Yamada on real-time computation. Yamada was interested in computing when the Turing machine had to print the digits at a steady pace and could not slow down. He showed quite easily that there were recursive sequences that could not be so computed.
Yamada studied just this one class with a time bound. But we thought there should be a theory about all complexity classes, and that every computation should fall in some class. We started doing generalized computational complexity. Among the key ideas was the realization that every solution to a class of problems should have a computation bound as the problem grows in size. We simply defined computational complexity classes by a function, like say n³, that encompassed all problems that can be solved in n³ steps (which we referred to as time) for problem size n. We proved a general hierarchy theorem that showed that for nice bounds there are computations that can be done exactly in the given bound and not within a smaller bound. We also showed that for any given computable bound there were problems not computable in the given bound. Today this approach is known as asymptotic complexity.
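In modern notation, the hierarchy result described here is usually stated roughly as follows (a sketch only; the exact gap between the bounds depends on the machine model, and this is the later multitape form rather than the 1965 original):

```latex
% Time hierarchy theorem (sketch, modern multitape form):
% for a time-constructible bound f(n), there are problems solvable
% in time f(n) but not in any smaller bound g(n) satisfying
% g(n)\log g(n) = o(f(n)):
\mathrm{DTIME}\bigl(g(n)\bigr) \subsetneq \mathrm{DTIME}\bigl(f(n)\bigr)
```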
In April 1963 we finished the paper that later earned us the Turing Award, on the computational complexity of algorithms. We originally submitted it to the Journal of the ACM, but there was lots of mathematics and we worried that not too many people would care to study it. So we published it in the Transactions of the American Mathematical Society instead. The first public presentation was in 1964 at the IEEE Annual Symposium on Switching Theory and Logical Design, and it was very well received. We were confident that we had opened a new and important research area, and we were totally committed to exploring computational complexity.
We started exploring other complexity measures besides time. We studied, along with Phil Lewis, tape and memory bounded computational complexity classes, now also known as space-bounded complexity classes. We proposed a new Turing machine model that had a read-only two-way input tape and a separate working tape for the computation.
Many computer scientists joined this research area. For example, our (log n)² algorithm for context-free language recognition led Savitch to generalize it and prove a beautiful relation between deterministic and nondeterministic tape-bounded computations. The work in complexity theory started spreading.
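The deterministic/nondeterministic space relation mentioned here is, in modern terminology, Savitch's theorem: nondeterministic space can be simulated deterministically with only a quadratic blow-up. A sketch of the statement:

```latex
% Savitch's theorem (modern statement), for space-constructible
% bounds s(n) \ge \log n:
\mathrm{NSPACE}\bigl(s(n)\bigr) \subseteq \mathrm{DSPACE}\bigl(s(n)^{2}\bigr)
```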
"When I joined, GE had all the things in place to become a dominant computer designer and manufacturer."
As the group of people working in this area grew, we felt a need for a conference and publication dedicated to computational complexity. In 1986 we organized the first conference, "Structure in Complexity Theory" ("structure" because we were interested in how all these complexity classes relate to each other).
It is quite amazing that in spite of 45 years of impressive progress in understanding the complexity of computations, we still cannot prove that P is different from NP and NP is different from PSPACE, the class of problems that can be solved with a polynomial-length tape. Intuitively, all these classes look to be different, but no proof has been found. Until we settle the separation problems for these classes, particularly Cook's notorious P versus NP problem, we have not fully understood the complexity of computations. But I do believe that these separations are provable, and will be proved.
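The separation questions can be summarized in modern notation; the containments below are known, but, as noted, none of them has been proved strict:

```latex
% Known containments between the classes discussed; whether any
% single containment is strict remains open:
\mathrm{P} \subseteq \mathrm{NP} \subseteq \mathrm{PSPACE}
```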
When I joined, GE had all the things in place to become a dominant computer designer and manufacturer. We hoped that our work would eventually be the foundation for, or help with, the computer business. But GE failed to exploit these early successes and their computer effort was fumbled away. GE had the philosophy that a good manager can manage anything, and that was proved absolutely wrong in the computer field. It was a great failure and a great disillusionment for my colleagues and me.
I had really enjoyed Cornell before, so in 1965 I accepted their offer to become a full professor with tenure, and to be chair of their new graduate department in computer science. There were a number of other computer science departments that were being formed or planned, and the excitement in computer science almost perceptibly had shifted toward education.
Creating the Cornell Department of Computer Science
Our greatest problem was finding faculty. The most important part is to get the best possible people you can. Quality, quality, quality.
We decided to start with a very light teaching load: one course per semester plus participation in a seminar. Not only that, our Sloan Foundation support allowed us to pay better salaries than were paid for the same ranks in the math department and some other computer science departments. We created a very strong department, although we were really overloaded in theory.
We worked very hard on the intellectual environment, the cohesiveness of the department. I decided not to have regular faculty meetings. We all met at the Statler Club, and after a quick lunch, we gathered in a corner niche of the Cornell Faculty Club for coffee and discussions. All departmental issues were discussed in these meetings. I, as chair, listened and argued and listened some more. If I felt a consensus emerging, I said so, and then I implemented the decisions. From the very beginning we had the tradition that everybody is to be heard.
Other Jobs: Pondering the Future of Computer Science
What is computer science and engineering? What should the field be doing? What does the field need in order to prosper? That was the mandate for "Computing the Future," a 1992 study by the Computer Science and Telecommunications Board of the National Academies.
We did argue. It is amazing when you put a bunch of bright, successful, good scientists in one room and ask them, "Tell me what computer science is." A very lovely, lively discussion.
We made recommendations: sustain the core effort in computer science, improve undergraduate education, and broaden the scope of computer science. This meant broadening interdisciplinary work as well as expanding in new applications; we wanted to see every computer science Ph.D. program require a minor in some other discipline.
I think it is a good report, but there was some controversy. John McCarthy from Stanford University got very upset; he felt that AI did not have the prominent role it should have. The controversy died down, but I think it did bring some more publicity to the report. A number of people have said they were delighted that it fought for a broader agenda. At the end the report was well received by the CS community.
In 1996 I went to NSF as Assistant Director of the Directorate for Computer and Information Science and Engineering, CISE. I absolutely loved Washington, and loved the job. It was a kind of twofold job. One, run CISE: worry about the distribution of funds, and what areas to support. I felt strongly that the program managers just did not exercise enough of their power in making their own decisions about what would or would not get funded. They really have a fair amount of power to do that. Complaints can be leveled that the program managers do not support enough risky research and enough interdisciplinary work. But, for example, I closed down an engineering program that was supporting university chip design, because by that time there were other venues.
The other job is to represent computer science and represent CISE within NSF, where you compete with your fellow assistant directors for funding. You spend a lot of time telling, in short and long paragraphs, about computer science. You basically are a spokesman for computer science.
At CISE I did a thing that had not been done before ever: I reviewed every bloody program. I sat down individually with every program manager and they explained what the program was doing, what research it was supporting. I told all my program and division managers, "I need nuggets. I need crisply explainable research results."
CISE proposal ratings were, on average, lower than those of a number of other directorates. I think computer scientists probably still do not have as uniform an understanding of each other's work as the physicists do, for example. Physicists will argue about different projects, but I think they have a firmer way of judging each other's research proposals. Computer science is new. That's no real excuse, but it's growing fast, it's changing.
I stayed a little bit over two years. Within NSF it's a very delicate relationship, how hard you try to convince the director that you should get the biggest slice. CISE did quite well, but when CISE started it was very small and its budget had to be built up to meet the expanding CS needs. That was not an easy process, to argue for the recognition of computer science in a world dominated by physicists.
We were concerned, and we still should be concerned, that we are not attracting enough women. We are losing out on talent. We were puzzled, and while I was there we put extra money into fellowships for women. I am still surprised that there are more women in mathematics, percentage-wise, than in computer science.
I was once asked, "Which of your two research achievements (not my words, his words) do you think is more important: the structure of finite automata, or complexity theory?" Without a second's thought I said, "Complexity."
Finite automata were fun. With Dick Stearns, some parts were just like playing a big, interesting game. There were some very unintuitive results. A number of people joined in that kind of fun, but I think that is more of parlor-type mathematics.
But almost all computer science problems involve algorithmsand therefore computational complexity problems. For example, computational complexity theory has put cryptography on a different base. I think every computer scientist should have some intuitive grasp of computational complexity.
Recently there have been very interesting results about the complexity of approximate solutions to NP and other problems that we believe not to be feasibly computable. Many of these problems have easily computable approximations, but many others do not. In the design of algorithms, it has helped tremendously to know what can and cannot be done. In some cases we do not have that insight yet, but in other cases it helped, and better algorithms have emerged.
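As one concrete illustration of an "easily computable approximation" (not an example from the interview; the function name here is illustrative), consider the classic greedy-matching 2-approximation for minimum vertex cover, an NP-hard problem. A minimal sketch:

```python
def approx_vertex_cover(edges):
    """Return a vertex cover at most twice the optimum size.

    Repeatedly take an uncovered edge and add BOTH endpoints to the
    cover. Any optimal cover must contain at least one endpoint of
    each chosen edge, and the chosen edges are disjoint, so the
    result is within a factor of 2 of optimal.
    """
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover

# Example: the path 1-2-3-4. The optimum cover {2, 3} has size 2;
# the greedy cover is guaranteed to have size at most 4.
edges = [(1, 2), (2, 3), (3, 4)]
cover = approx_vertex_cover(edges)
assert all(u in cover or v in cover for u, v in edges)
```

The point of such results is exactly the one made above: even when the exact problem is believed infeasible, complexity theory tells us which approximation guarantees are and are not achievable in polynomial time.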
These are deep results and reveal very interesting connections between different aspects of computation, although there are important unsolved problems that have been around for quite a long time.