A Computational Lens on Economics

Former CACM Editor-in-Chief Moshe Y. Vardi

The COVID-19 pandemic is a dual crisis. On one hand, it is a global health crisis with millions of cases and hundreds of thousands of deaths. At the same time, decisions by individuals and governments in response to the pandemic have led to a severe economic slowdown, the likes of which has not been seen since the Great Depression of the 1930s. But, as I wrote in a May 2020 column, economics can be argued to be one of the roots of this dual crisis. I quoted William Galston, who wrote: "What if the relentless pursuit of efficiency, which has dominated American business thinking for decades, has made the global economic system more vulnerable to shocks?" This relentless pursuit of efficiency prevented us from investing in pandemic preparedness, despite many warnings over the past several years, and pushed us to develop a global supply chain that is far from resilient. Does computer science have anything to say about the relentless pursuit of economic efficiency? Quite a lot, actually.

Economic efficiency means goods and factors of production are distributed or allocated to their most valuable uses and waste is eliminated or minimized. Free-market advocates argue that through individual self-interest and freedom of production as well as consumption, economic efficiency is achieved and the best interests of society as a whole are served. But efficiency and optimality should not be conflated. A fundamental theorem in economics states that under certain assumptions a market will tend toward a competitive, Pareto-optimal equilibrium; that is, economic efficiency is achieved. But how well does such an equilibrium serve the best interests of society?

In 1999, Elias Koutsoupias and Christos Papadimitriou undertook to study the optimality of equilibria from a computational perspective. In the analysis of algorithms, we often compare the performance of two algorithms (for example, optimal vs. approximate or offline vs. online) by studying the ratio of their outcomes. Koutsoupias and Papadimitriou applied this perspective to the study of equilibria. They studied systems in which non-cooperative agents share a common resource, and proposed the ratio between the cost of the worst possible Nash equilibrium and that of the social optimum as a measure of the effectiveness of the system. This ratio has become known as the "Price of Anarchy," as it measures how far from optimal such non-cooperative systems can be. They showed that the price of anarchy can be arbitrarily high, depending on the complexity of the system. In other words, economic efficiency does not guarantee that the best interests of society as a whole are served.
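This ratio can be made concrete with Pigou's classic routing example. The specific network below is my illustration of the concept, not an example taken from the Koutsoupias-Papadimitriou paper: a unit mass of traffic travels over two parallel roads, one with fixed cost and one whose cost grows with its load.

```python
# A sketch of the Price of Anarchy on Pigou's example network (an
# assumed illustration, not from the column or the original paper).
# A unit mass of traffic goes from A to B over two parallel roads:
#   road 1: constant cost 1, regardless of load
#   road 2: cost x, where x is the fraction of traffic using it

def total_cost(x):
    """Average travel cost when fraction x takes road 2 and 1-x takes road 1."""
    return (1 - x) * 1 + x * x  # road-1 users each pay 1; road-2 users each pay x

# Nash equilibrium: road 2 never costs more than road 1 (x <= 1), so every
# self-interested driver takes road 2, giving x = 1 and total cost 1.
nash_cost = total_cost(1.0)

# Social optimum: minimize (1 - x) + x^2 over [0, 1]; a grid search
# recovers the calculus answer x = 1/2, with total cost 3/4.
opt_cost = min(total_cost(i / 1000) for i in range(1001))

price_of_anarchy = nash_cost / opt_cost
print(nash_cost, opt_cost, price_of_anarchy)  # 1.0, 0.75, PoA = 4/3
```

The gap is real even in this two-road network: selfish routing costs every driver 1, while a coordinated split would cost 0.75 on average. For more complex cost functions the ratio can grow without bound, which is the paper's point.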

A few years later, Constantinos Daskalakis, Paul Goldberg, and Papadimitriou asked how long it takes until economic agents converge to an equilibrium. By studying the complexity of computing mixed Nash equilibria, they provided evidence that there are systems in which convergence to such equilibria can take an exceedingly long time. The implication of this result is that economic systems are very unlikely ever to be in an equilibrium, because the underlying variables, such as prices, supply, and demand, are very likely to change while the systems are making their slow way toward convergence. In other words, economic equilibria, a central concept in economic theory, are mythical rather than real phenomena. This is not an argument against free markets, but it does oblige us to view them through a pragmatic, rather than ideological, lens.
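A toy illustration of why natural dynamics need not find an equilibrium (my own sketch, not an example from the Daskalakis-Goldberg-Papadimitriou paper): in Matching Pennies, the only Nash equilibrium is mixed, so agents who repeatedly play pure best responses cycle forever and never settle.

```python
# Best-response dynamics in Matching Pennies (an assumed toy example).
# The row player wins (+1) if both choices match; the column player
# wins if they differ. The unique Nash equilibrium is mixed (50/50),
# so pure best-response play can never reach a fixed point.

def best_response_row(col_choice):
    return col_choice          # row wants to match the column player

def best_response_col(row_choice):
    return 1 - row_choice      # column wants to mismatch the row player

row, col = 0, 0
history = []
for _ in range(8):             # players alternate best responses
    row = best_response_row(col)
    col = best_response_col(row)
    history.append((row, col))

print(history)  # cycles forever: (0, 1), (1, 0), (0, 1), (1, 0), ...
```

This tiny game only shows that dynamics can cycle; the Daskalakis-Goldberg-Papadimitriou result is far stronger, showing that computing a mixed equilibrium is PPAD-complete, so even sophisticated procedures are believed to need an exceedingly long time in the worst case.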

But one does not need a very sophisticated analysis to conclude that individual self-interest—expressed in an extreme form by the "Greed is good" speech in the 1987 movie Wall Street—does not necessarily lead to an optimal outcome. After all, every computer-science graduate has learned about "greedy algorithms," which make the locally optimal choice at each stage with the intent of finding a global optimum. While such algorithms sometimes do yield a global optimum, more typically they do not. In fact, designing algorithms that do find global optima is a major topic of algorithmic research.
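The textbook coin-change problem makes the point in a few lines (the specific denominations below are my illustration, not from the column): with coins {1, 3, 4}, always grabbing the largest coin misses the global optimum.

```python
# Greedy vs. globally optimal coin change (an assumed textbook example).
# With denominations {1, 3, 4} and amount 6, the greedy algorithm's
# locally optimal choices yield 3 coins, while the true optimum uses 2.

def greedy_coins(amount, coins=(4, 3, 1)):
    """Repeatedly take the largest coin that still fits."""
    used = []
    for c in coins:
        while amount >= c:
            amount -= c
            used.append(c)
    return used

def optimal_coins(amount, coins=(4, 3, 1)):
    """Dynamic programming: fewest coins summing to amount."""
    best = {0: []}  # best[a] = shortest list of coins summing to a
    for a in range(1, amount + 1):
        candidates = [best[a - c] + [c] for c in coins if c <= a]
        best[a] = min(candidates, key=len)
    return best[amount]

print(greedy_coins(6))   # [4, 1, 1] -- three coins
print(optimal_coins(6))  # [3, 3]    -- two coins
```

The greedy choice of the 4-coin looks best in the moment but forecloses the better 3+3 solution, much as locally rational economic decisions need not add up to a socially optimal outcome.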

Our digital infrastructure, which has become a key component of the economic system in developed countries, is one of the few components that did not buckle under the stress of COVID-19. Indeed, last March many sectors of our economy switched in haste to WFH mode: working from home. This work from home, teach from home, and learn from home was enabled (to an imperfect degree, in many cases) by the Internet. From its very roots in the ARPANET of the 1960s, resilience, enabled by seemingly inefficient redundancy, was a prime design goal for the Internet. Resilience via redundancy is one of the great principles of computer science. Pay attention, economics!
