
Communications of the ACM


Algorithms Have Been Around for 4,000 Years

A basic concept of computer science is the algorithm. An algorithm can be described as:

  • instructions for solving a task: a method for solving a problem, a calculation rule; or, more precisely,
  • a finite sequence of generally valid, unambiguous, executable instructions (steps).

The technical term is named after the Persian mathematician Muhammad ibn Musa al-Khwarizmi (who lived around 780 to 850 AD), author of a work on calculation rules. Everyday examples include recipes, craft instructions, game rules, instructions for use, musical scores, and patterns.

Babylonian and Greek algorithms

The first known written algorithms were created around 2000 BC in Mesopotamia (see Donald E. Knuth; Luis Trabb Pardo: The early development of programming languages, in: Donald E. Knuth (ed.): Selected Papers on Computer Languages, Center for the Study of Language and Information, Stanford 2003, page 4, and Donald E. Knuth: Ancient Babylonian algorithms, in: Communications of the ACM, vol. 15, 1972, issue 7, pages 671-677, and vol. 19, 1976, issue 2, page 108).

The classical algorithms include:

  • Sumerian-Babylonian root extraction (a method for solving quadratic equations, from the era of the Code of Hammurabi, a legal collection, c. 1700 BC),
  • the Euclidean algorithm (a method for determining the greatest common divisor, Euclid of Alexandria, 4th century BC),
  • the sieve of Eratosthenes (a method for determining prime numbers, Eratosthenes of Cyrene, 3rd century BC),
  • the approximation of the number π (Archimedes of Syracuse, 3rd century BC), and
  • Heron's method (a procedure for approximating square roots, Heron of Alexandria, 1st century AD).
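Three of these classical procedures can be sketched in a few lines of Python. This is a modern illustration of the underlying ideas, not the historical notation; the function names are mine:

```python
# Euclidean algorithm: greatest common divisor by repeated remainders.
def gcd(a: int, b: int) -> int:
    while b:
        a, b = b, a % b
    return a

# Sieve of Eratosthenes: all primes up to n by crossing out multiples.
def sieve(n: int) -> list[int]:
    is_prime = [True] * (n + 1)
    is_prime[0:2] = [False, False]
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            for multiple in range(p * p, n + 1, p):
                is_prime[multiple] = False
    return [p for p, flag in enumerate(is_prime) if flag]

# Heron's method: approximate sqrt(x) by averaging a guess with x/guess.
def heron_sqrt(x: float, iterations: int = 20) -> float:
    guess = x if x > 1 else 1.0
    for _ in range(iterations):
        guess = (guess + x / guess) / 2
    return guess
```

For example, `gcd(1071, 462)` yields 21, and a few iterations of `heron_sqrt(2.0)` already agree with √2 to many decimal places, which is why Heron's averaging step remained a standard hand method for centuries.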

A widespread calculation method is recorded in the Rhind Papyrus (around 1550 BC): Egyptian multiplication. It corresponds to Ethiopian and Russian peasant multiplication.
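The doubling-and-halving scheme shared by these methods can be sketched in Python (an illustrative reconstruction of the arithmetic, not the papyrus notation):

```python
def peasant_multiply(a: int, b: int) -> int:
    """Multiply two non-negative integers by doubling and halving,
    as in Egyptian and Russian peasant multiplication."""
    total = 0
    while b > 0:
        if b % 2 == 1:   # keep the rows where the halved factor is odd
            total += a
        a *= 2           # double one factor
        b //= 2          # halve the other, discarding the remainder
    return total
```

The method reduces multiplication to doubling, halving, and addition, which is what made it practical long before positional multiplication tables: `peasant_multiply(13, 238)` returns 3094.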

It was not until 2013 that Menso Folkerts found Jost Bürgi's "Kunstweg" (1592). For more than 400 years, Bürgi's method for calculating sine values had remained a secret.
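As reconstructed by modern historians, the "Kunstweg" computes a whole sine table at once by repeated summation of a column of numbers, with the entry for 90° halved in each pass; the column then converges to the sines of equally spaced angles. The following is a hedged modern sketch of that reconstruction, with my own variable names and normalization:

```python
def buergi_kunstweg(n: int, iterations: int = 12) -> list[float]:
    """Sketch of the reconstructed 'Kunstweg': iterated summations
    whose column converges (after normalization) to the sines of
    k * 90°/n for k = 1..n.  x[k-1] holds the cell for angle k*90°/n;
    x[-1] is the 90° cell."""
    x = [1.0] * n                      # any positive start column works
    for _ in range(iterations):
        # First pass: running sums from the 90° end downward,
        # taking only half of the 90° entry.
        y = [0.0] * n
        y[-1] = x[-1] / 2
        for k in range(n - 2, -1, -1):
            y[k] = y[k + 1] + x[k]
        # Second pass: running sums from the 0° end upward.
        z = [0.0] * n
        z[0] = y[0]
        for k in range(1, n):
            z[k] = z[k - 1] + y[k]
        x = z
    top = x[-1]
    return [v / top for v in x]        # normalize so that sin(90°) = 1
```

With n = 6 (steps of 15°), the second entry approaches sin 30° = 0.5 after about a dozen iterations. In modern terms this is a power iteration whose fixed direction is the sine vector, which is why the column converges regardless of the starting values.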

Digitization and artificial intelligence are nothing new

Artificial intelligence is more than 100 years old. Around 1912, the Spanish engineer Leonardo Torres y Quevedo built a working chess machine, which has been preserved to this day. For a long time, chess programs were regarded as an embodiment of machine intelligence. Alan Turing and Konrad Zuse should also be mentioned in this context.

Even the ancient abacus was digital, as were the tally stick and the widespread practice of finger counting (Latin digitus = finger). Several stages of digitization can be distinguished. In the 17th century, Wilhelm Schickard, Blaise Pascal, and Gottfried Wilhelm Leibniz invented (digital) mechanical calculating machines. From about 1850, the Thomas arithmometer, which could perform all four basic arithmetic operations, was manufactured in large quantities, and at the end of the 19th century punched-card machines appeared. In the 1940s and 1950s, relay and vacuum-tube machines came to market, followed later by transistor computers. For a long time there was competition between electronic analog and digital computers. A new wave of digitization began in the 1970s, when mechanical and electromechanical calculating machines and (analog) slide rules were replaced by (portable) electronic calculators. It expanded thanks to microelectronics (from about 1970), the Internet (from about 1970), and the World Wide Web (from 1990).

Incidentally, the boundary between analog and digital is blurred; there are mixed forms. On a digital watch, the time is represented by digits; on an analog watch, by hands. The hour and minute hands move continuously, but the second hand advances in discrete jumps.


Herbert Bruderer: Milestones in computing technology. Vol. 1: Mechanical calculating machines, slide rules, historical automata and scientific instruments, 2nd, completely revised and greatly expanded edition, Walter de Gruyter GmbH, Berlin/Boston 2018, 751 pages (in German)

Herbert Bruderer: Milestones in computing technology. Vol. 2: Invention of the computer, electronic computers, developments in Germany, England and Switzerland, 2nd, completely revised and greatly expanded edition, Walter de Gruyter GmbH, Berlin/Boston 2018, 849 pages (in German)

Jochen Ziegenbalg; Oliver Ziegenbalg; Bernd Ziegenbalg: Algorithms from Hammurapi to Gödel, Springer Fachmedien, Wiesbaden, 4th, revised and extended edition 2016, pages 55-85 (in German).



Herbert Bruderer is a retired lecturer in didactics of computer science at ETH Zürich. More recently he has been a historian of technology, and was co-organizer of the International Turing Conference at ETH Zürich in 2012.


