Opinion
Vardi's Insights

What Do Computing and Economics Have to Say to Each Other?


CACM Senior Editor Moshe Y. Vardi

In July 2020, I wrotea about a computational perspective on economics. I described a 1999 result by Koutsoupias and Papadimitriou regarding multi-agent systems. They studied systems in which non-cooperative agents share a common resource and proposed the ratio between the worst possible Nash equilibrium and the social optimum as a measure of the effectiveness of the system. This ratio has become known as the “Price of Anarchy,” as it measures how far from optimal such non-cooperative systems can be. They showed that the price of anarchy can be arbitrarily high, depending on the complexity of the system. The Price-of-Anarchy concept has since been extended to other types of equilibria—for example, Pareto-Optimal Equilibria.b
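The flavor of a Price-of-Anarchy calculation can be conveyed by a standard textbook example that is not from the column itself: Pigou's two-road network, in which one unit of traffic chooses between a road with constant latency 1 and a road whose latency equals the fraction of traffic using it. A short sketch:

```python
# Pigou's two-road example (illustrative, not from the column).
# A unit mass of traffic splits between two roads:
#   - road A has constant latency 1;
#   - road B has latency x, where x is the fraction of traffic on it.

def average_latency(x: float) -> float:
    """Average latency when fraction x of the traffic uses road B."""
    return x * x + (1 - x) * 1.0

# Nash equilibrium: as long as road B's latency x is below 1, every
# driver prefers it, so all traffic ends up on road B (x = 1).
nash_cost = average_latency(1.0)   # 1.0

# Social optimum: minimize x^2 + (1 - x); the derivative 2x - 1
# vanishes at x = 0.5, splitting traffic evenly.
opt_cost = average_latency(0.5)    # 0.75

price_of_anarchy = nash_cost / opt_cost   # 4/3
print(f"Price of Anarchy: {price_of_anarchy:.4f}")
```

Even in this tiny system, selfish routing costs one-third more than the coordinated optimum; in more complex systems, as Koutsoupias and Papadimitriou showed, the gap can grow without bound.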

Price-of-Anarchy results debunk the key idea underlying market fundamentalism,c which is that “markets know best.” This idea is rather tempting. Societies struggle to agree on the value of “things” due to the multiplicity of opinions. Why not let the market arbitrate? The value of a “thing” in equilibrium must be its true value, the argument goes. We know now, however, that markets have many equilibria, and the equilibrium actually reached is not necessarily the “best” one, so we should not take it as the arbiter of true value. There is no escaping the multiplicity-of-opinions problem.

What does economics say about computing? A decade ago, economist Robert Gordon talkedd about “the death of innovation, the end of growth.” As it happens, the deep-learning revolution also launched about a decade ago. Current predictions estimatee the impact of artificial intelligence at perhaps an additional US$13 trillion in global GDP by 2030. Well, predictions are difficult, they say, especially about the future.

At the heart of economic pessimism about computing is the Productivity Paradox, named for the late economist Robert Solow, in reference to his 1987 quip, “You can see the computer age everywhere but in the productivity statistics.” As describedf by Erik Brynjolfsson, the paradox refers to the slowdown in productivity growth in the U.S. in the 1970s and 1980s despite rapid development in computing technology over the same period. This discrepancy between computing investment and lagging productivity growth is exactly what is at the root of Gordon’s pessimism. Nevertheless, in 2021 Brynjolfsson himself took the optimistic side of a betg with Gordon that productivity would surge in the coming decade, thanks to AI.

Yet the analysis at the heart of the Productivity Paradox is weak, I believe. Economic productivity measures output per unit of input, such as labor, capital, or another resource. Productivity grows via myriad inventions and improvements spreading through the economy. Productivity growth in the U.S. from 1973 to 1995 was notably slower than in the preceding two decades (1951–1972). To truly assess the impact of computing on productivity, however, one would have to compare productivity growth between 1973 and 1995 to that of a hypothetical 1973–1995 U.S. economy without innovation and investment in computing. In other words, to assess the value of computing technology to the economy, one must contemplate how the economy would have fared without it.

Of course, it is not possible to develop such a hypothetical model reliably, so such a comparison is not feasible. The point is that we do not really know what caused the slowdown in productivity growth from 1973 to 1995. In fact, the slowdown might have been worse without the investment in computing technology.

It seems intuitive to me that today’s complex economic world would simply be infeasible without computing technology. JoAnne Yates described the intimate connection between computing technology and economic complexity in her 1989 book, Control through Communication: The Rise of System in American Management. Modern technology has enhanced this trend to the point that Bruce Lindsay, a well-known IBM database researcher, has quipped, “relational databases form the bedrock of Western civilization.” Indeed, if a massive electromagnetic pulse wiped out our computing infrastructure, our society would face a catastrophic collapse.

Humanity today has been fashioned by some very significant cultural revolutions: the development of language about 200,000 years ago, the development of writing about 5,400 years ago, the invention of print in the 1440s, and the invention of the telegraph in the 1830s. Language, writing, print, and telegraph are means for transmitting information. A culminating chapter in this 200,000-year information revolution is the development of computing and communication technologies in the 20th century, when we learned to process as well as transmit information.

Computing is huge! Pay attention to optimistic economists.
