
Communications of the ACM


What Came First, Math or Computing?

CACM Senior Editor Moshe Y. Vardi

One of the most fundamental conundrums in the philosophy of mathematics is the question of whether mathematics was discovered by humans or invented by them. On one hand, it seems hard to argue that highly sophisticated mathematical objects, such as inaccessible cardinals, were discovered. On the other hand, as Albert Einstein asked, "How can it be that mathematics, being after all a product of human thought, which is independent of experience, is so admirably appropriate to the objects of reality?" The 19th-century mathematician Leopold Kronecker offered a compromise, saying "God created the integers, all else is the work of man."

So let us consider the natural numbers. The Lebombo Bone is a bone tool made of a baboon fibula with incised markings, discovered in a cave in the Lebombo Mountains in Africa. More than 40,000 years old, the bone is conjectured to be a tally stick, its 29 notches perhaps counting the days of a lunar cycle. It has been referred to as the oldest mathematical artifact. But should we not call it the oldest computing artifact? Counting is, after all, the most basic form of computing.

Deductive mathematics was discovered by the Greeks about 2,500 years ago, a discovery that has been called "The Greek Mystery." Pythagoras considered the proof of his theorem a "gift from the Gods." A deductive proof offers indisputable evidence—the high road to truth.

The Greeks also discovered the Liar's Paradox, where self-reference creates a statement whose truth or falsity is elusive. In the 19th century, Georg Cantor used self-reference to prove that there are infinitely many distinct infinities—a result that Kronecker dismissed with "There is no mathematics there." Bertrand Russell then used self-reference to show that set theory, considered the foundational theory of mathematics, is inconsistent, launching the so-called Foundational Crisis. A mathematical proof provides indisputable evidence of mathematical truth, but what constitutes a proof?

In response to the crisis, David Hilbert, the reigning mathematician during the first part of the 20th century, launched Hilbert's Program, which consisted of three legs. Hilbert aimed to show that mathematics is consistent (a mathematical statement and its negation cannot ever both be proved), mathematics is complete (all true mathematical statements can be proved), and mathematics is decidable (there is a mechanical way to decide whether a given mathematical statement is true or false).

In the 1930s, Kurt Gödel demolished the first two legs of Hilbert's Program, showing that arithmetic is not complete and its consistency cannot be proven in arithmetic. Shortly after that, Alonzo Church and Alan Turing demolished the third leg. They defined computability and showed that mathematical truth is not computable. This result could be understood to say that mathematics transcends computation.
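The diagonal argument behind Turing's result can be made concrete: any candidate "halting decider" is defeated by a program that does the opposite of whatever the decider predicts about that very program. A minimal, illustrative Python sketch (it simplifies a decider to a one-argument predicate on programs; the names `make_contrarian` and `optimist` are invented for this illustration):

```python
# Turing's diagonal argument, made concrete: given any claimed
# halting decider, build a program that refutes it.

def make_contrarian(halts):
    """Given a claimed decider halts(f) -> bool, build its refutation."""
    def contrarian():
        if halts(contrarian):
            while True:          # decider said "halts" -> loop forever
                pass
        # decider said "loops" -> halt immediately
    return contrarian

# One candidate decider (necessarily wrong on some program):
def optimist(f):
    return False                 # claims every program runs forever

c = make_contrarian(optimist)
c()                              # c halts, refuting optimist's prediction
```

The same construction defeats every candidate decider, which is why no general algorithm for mathematical truth can exist.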

Computer science, nevertheless, was born out of the ruins of Hilbert's Program: we got the notion of computability, the distinction between hardware and software, and the concept of a universal machine. In an amazing historical confluence, real computers were soon built: the Z3 by Zuse in 1941, the Atanasoff-Berry Computer (ABC) in 1942, and the ENIAC—the first digital, electronic, programmable computer—in 1946.

As the use of computing in science and business spread in the 1950s and 1960s, we soon discovered that being computable is not enough. Solving certain computational problems seems to require inordinate amounts of computational resources (time and memory). Certain problems seem to be amenable only to exhaustive search, which becomes impractical as problem instances grow. Computational complexity theory was developed to understand this phenomenon.

NP-completeness theory, which emerged in the early 1970s, aimed at explaining the difficulty of exhaustive search. Problems in NP are problems that have short solutions that can be checked efficiently. NP-complete problems are the hardest problems, in a formal sense, in NP. Boolean satisfiability, the very crux of deductive reasoning, was shown by Stephen Cook and Leonid Levin to be NP-complete. We still do not know, however, if NP-complete problems are truly intractable.
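The "short solutions that can be checked efficiently" property of NP is easy to see concretely for satisfiability: given a candidate assignment (the certificate), checking it against a formula in conjunctive normal form takes time linear in the formula's size. A minimal sketch in Python, using the common DIMACS-style encoding of literals as signed integers (the function name is invented for this illustration):

```python
# Verify a satisfying assignment for a CNF formula in linear time.
# A formula is a list of clauses; a clause is a list of integer
# literals (positive = variable, negative = its negation).

def check_certificate(cnf, assignment):
    """Return True iff `assignment` (var -> bool) satisfies every clause."""
    return all(
        any(assignment[abs(lit)] == (lit > 0) for lit in clause)
        for clause in cnf
    )

# (x1 or not x2) and (x2 or x3)
cnf = [[1, -2], [2, 3]]
print(check_certificate(cnf, {1: True, 2: True, 3: False}))   # True
print(check_certificate(cnf, {1: False, 2: False, 3: False})) # False
```

Checking a certificate is the easy direction; the open question is whether *finding* one can always avoid exhaustive search.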

In 1979, building on NP-completeness theory, Cook and Robert Reckhow were finally able to answer the fundamental question of what a mathematical proof is: evidence so rigorous that it can be checked computationally.

Mathematics does not transcend computation after all. Rather, computation is at the very heart of mathematics.

So, what came first, math or computing? Neither! They were both developed by humans as a way to reason about the physical world. Math and computing have been entwined for the past 40,000 years and will continue to be so.



Moshe Y. Vardi is University Professor and the Karen Ostrum George Distinguished Service Professor in Computational Engineering at Rice University, Houston, TX, USA. He is the former Editor-in-Chief of Communications.

Copyright held by author/owner.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2023 ACM, Inc.



Martin Wheatman

Dear Moshe,
You forgot to mention one important machine in your list of early computers: the Manchester Baby, the first electronic stored-program computer, which first ran on 21 June 1948. Why is this so important? It demonstrates Turing's notion of universality: the property that both program and data can be stored in memory, rather than the program being set by switches on the front of the device.
With this omission, however, comes an important thread missing from your piece. Mathematics would be difficult without the invention of the clay tablet around 3500 BCE, which supported the logistics that sustained the cities of Sumer in Mesopotamia. Writing then developed into paper, the printing press, and other machines that use the printed word.
But where were we before the clay tablet? Were there cultures before writing? Presumably, the example of a 40,000-year-old tally stick shows there were. What I'm asking you to consider is: can an oral culture compute? If so, then perhaps speech is the universal machine.
But this is the thing: it is self-evident that if I can conditionally state this conjecture, then I'm already reasoning with ordinary language. John Austin, in his William James Lectures at Harvard in 1955, laid out the idea of conditionality in speech in his notion of felicity, the happiness of the outcome of a statement. Conditionality is one of the main ideas behind Turing completeness. So if I can say "if so, …" or "if not, …", I am reasoning in speech.
Similarly with loops, another idea behind Turing completeness. If you can say "Do X with this" and "Do X with these," you can also say "Do X with the first of these" and "Do X with the rest of these," and then you are processing a list and creating a loop: essentially what Kurt Gödel was doing with recursive functions.
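The first/rest pattern described here is exactly primitive recursion over lists, and it can be written down in a few lines. A minimal sketch in Python (the function name `do_with_each` is invented for this illustration):

```python
# "Do X with the first of these, then do X with the rest of these":
# a loop expressed as recursion over first/rest.

def do_with_each(action, items):
    if not items:                    # "none left" -- the base case
        return
    action(items[0])                 # do X with the first of these
    do_with_each(action, items[1:])  # do X with the rest of these

results = []
do_with_each(results.append, ["a", "b", "c"])
print(results)  # ['a', 'b', 'c']
```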
If you've made it this far, I hope we can agree that speech precedes writing, and that math is (at best) limited to simple arithmetic without writing. However, by successfully conveying an idea, I am essentially programming your mind, and speech is computational. Therefore, computation precedes math.
Many thanks for your thought-provoking piece,
P.S. If you need a demonstration of how speech contains both data and action, and is therefore universal, see the software at

