
Does Innovation Create or Destroy Jobs?

Vinton G. Cerf
ACM Past President and Google Inc. Vice President and Chief Internet Evangelist

Over the past year, David Nordfors and I have organized several invitation-only seminars we call "Innovation and Jobs." Our purpose has been to draw on the points of view of experts in many fields to understand more deeply how innovation relates to the workplace. One of the first surprises, for me at least, was the observation that, once there is food on the table and a roof over one's head, not everyone is necessarily looking for remunerative work. What seemed very important was meaningful work. As this thread was teased out, we recognized that a significant fraction of some economies depends on or benefits from a great deal of volunteer work. There are even websites devoted to connecting volunteers with work they find meaningful, such as the very successful www.volunteermatch.org. One wonders how much of the world's economy involves this kind of non-remunerative work, and to what degree we as a society depend on the gratifying sense of having contributed to the well-being of others, or of satisfying an itch that happens to produce benefits for others (think of volunteer docents in museums, volunteer nature walk guides, and people who volunteer in hospitals).

In this column, I would ask you to read "jobs" in the most general sense: work that may or may not involve conventional remuneration (that is, pay).

As the title asks, do we know whether innovation creates or destroys jobs? The answer is yes on both counts. Novel ways of doing things, especially forms of automated production, clearly take away the need for manual jobs. The Jacquard loom is a perfect example. But it also created work: someone had to design the cards that drove the loom, and someone had to build and maintain the loom itself. The productivity of fabric manufacture must have increased with the introduction of this invention. The same can be said for many other inventions. The development of production lines actually increased the availability of jobs while also increasing productivity per capita.

What should be fairly obvious, on reflection, is that new jobs created by innovation often require new skills and some displaced workers may not be able to learn them. Even when there is a net increase in jobs resulting from innovation (think of the invention of the integrated circuit, the World Wide Web, YouTube), not everyone displaced will find new work unless or until they are able to learn new skills or apply new knowledge.

This need for new knowledge and skills applies very well to our field of computing and its applications. Programs are the equivalent of the cards in the Jacquard loom, and the production of programs requires specific skills. If we have learned anything over the course of the computer's development, it is that designers of hardware and software must keep acquiring new knowledge and skills to remain relevant and to be able to continue to undertake new work. While there was an intense increase in the need for COBOL programmers leading up to Y2K in 2000, that skill is much less in demand now than, say, C++, JavaScript, Python, and Ruby on Rails (I know, I did not name your favorite language; feel free to send in your suggestions!).
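Since the column likens programs to Jacquard's punched cards, a small, purely illustrative sketch in Python (one of the languages mentioned above) may make the analogy concrete. Nothing in it comes from the original text; the pattern data and the weave function are hypothetical stand-ins for the cards and the loom that obeys them.

    # A purely illustrative sketch (not from the column): each "card" is a row
    # of holes, 1 meaning "raise this warp thread" and 0 meaning "leave it
    # down," and the "loom" simply follows the cards. CARDS and weave() are
    # hypothetical names chosen only to make the analogy concrete.

    # One card per row of fabric; each entry controls one warp thread.
    CARDS = [
        [1, 0, 1, 0, 1, 0, 1, 0],
        [0, 1, 0, 1, 0, 1, 0, 1],
        [1, 1, 0, 0, 1, 1, 0, 0],
        [0, 0, 1, 1, 0, 0, 1, 1],
    ]

    def weave(cards):
        """Render the pattern the cards encode, one row of 'fabric' per card."""
        for card in cards:
            # A raised thread prints as '#', a lowered one as '.'.
            print("".join("#" if hole else "." for hole in card))

    if __name__ == "__main__":
        weave(CARDS)

The point the analogy captures is that the pattern lives in the data rather than in the machine: changing the cards (the program) changes the product without rebuilding the loom, which is also why the skills in demand shift with each new generation of "cards."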

What this suggests is that innovation and longer lives are driving the need for continuous learning. The old model of going to school for a while, having a career (maybe at the same company), and then retiring is being replaced by a more continuous need for access to new knowledge and skills throughout a career that may take many unpredictable twists and turns. In our own discipline, the seeds of the solution may lie in the technology itself. The Internet, the World Wide Web, Massive Open Online Courses (MOOCs), and related infrastructure may meet some of the educational needs we see emerging. Not every job will admit this form of education, of course, but we can see a society emerging in which learning becomes a lifelong necessity for substantial portions of the workforce.

Peter Diamandis' book Abundance paints a very optimistic view of the future, and while one can be somewhat skeptical, it seems fair to say that innovation has brought us the potential for abundance and new work. The role of computing in our society has increased dramatically in the past half-century, and I believe it will have a major and perhaps increasing influence on innovation in fields well outside traditional computing and programming, for the simple reason that computing tools are becoming essential to, or at least involved in, almost everything we do.

