This was originally posted on the blog Distant Whispers from an Academic Engineer's World.
What is the period we are going through now if not a global cataclysm? There is no way around it: the pandemic has caused untold hardship around the world and has changed the way we work and play. It is the destructive power of biology at its worst, a virus that is a true reflection of the Latin origin of the term, "poison." The question, rather narrow and perhaps inappropriate in a time of such great tribulation, is how the development of technology, and its more basic cousin science, fares during such cataclysmic times.
There is no exact parallel to what we are seeing now, but some inexact parallels can be illuminating. One is the Second World War. Another, going much farther back, is the bubonic plague that swept Europe in the 17th century.
To me the invention of all inventions triggered by the events of World War II was the electronic stored-program computer, which Alan Turing would later formalize in his design for the "Automatic Computing Engine," a precursor to the ubiquitous machines of today. Turing, the colossus who has made the transition from geek fame to public fame, designed the Bombe out of dire necessity, and his colleagues at Bletchley Park, led by the engineer Tommy Flowers, went on to build the Colossus, all to break the German codes, which literally meant the difference between life and death. Little could the military funders of the research have imagined the computer making its way to every desk in our offices and homes. This was an invention born of a classic military need, but we are glad that big push happened. It propelled us with amazing alacrity into the age of electronic computers.
After the war, to make these electronic machines more reliable, Bell Labs, a cauldron of innovation, kickstarted a large research effort to build the transistor and replace the failure-prone vacuum tubes. Bardeen and Brattain, working in a group led by Shockley, announced the first transistor two years after the end of the war. This was followed by the microchip, or as we commonly know it today, the integrated circuit. Again, the first major market for the microchip was the military: these could not be produced fast enough to go into our missiles, which we needed to stay ahead of our Cold War nemesis. That demand drove prices down dramatically: the average price of each microchip going into the Minuteman missile was $50 in 1962; by 1968 each cost $2.
Going back farther, much farther, consider the events of 1665. Bubonic plague was ravaging London and beginning to spread to its suburbs. A young man in his 20s studying at Cambridge decided that the disease was getting too close for comfort and headed home, about sixty miles away. In this forced break from his studies, the young man had his annus mirabilis, the "year of wonders," as described lyrically in this cheery piece from the Washington Post. He came up with calculus during his forced social distancing, and when he got back to Cambridge after two years, theories in hand, he was made a fellow within six months; two years later, a professor. A more recent account has cast doubt on some flourishes in the original story, but to me the substance stands: a time of deep reflection away from the prosaic details of daily livelihood allowed the imagination to soar. That young man, as you may have figured, was Isaac Newton. While few among us can lay claim to his level of intellectual preparedness and brilliance of mind, there is inspiration for the rest of us in that event.
Credit: Ed Fisher, New Yorker Cartoons, Jan 26, 1963.
The historical trends I could find go back to 1949. Two things become clear. First, there was a big upsurge in funding for Research and Development, or R&D (yes, the two are rather inextricably linked for statistical purposes), in the 1960s, when more than 10% of the federal budget was spent on R&D (for context, in 2019 it was 2.8%). That upsurge was tied to strong geopolitical competition (read: the Cold War), which came about through a long series of twists and turns after the World War. Second, a big push toward R&D funding has come from private sources; in fact, federal agencies accounted for less than 50% of total spending on basic research in 2015, while that share topped 70% in the 1960s and 70s. Again, nothing like a big competition for global domination to spur federal funding of R&D.
An optimistic part of me (and today that feels like most of me) says that both federal and private investment will grow. It will grow in the short term because we want to win the race to the cure. The cure, whether a pill or a vaccine, will have a global customer base falling over themselves to get to the store, so the prize for the winner, both the country and the commercial enterprise, will be huge. The funding will also grow in the long term because this event is so searing that we will want to reduce the chances of it ever happening again … at least within the time horizon we can imagine, i.e., our lifetimes.
What can we do now, as we overcome our current challenges, that will benefit us later? Let us start with three ideas, pushing several others to the recesses of my brain for now.
To sum up, this forced change to our ways of work and play makes me grouchy. And the cost in lives lost and livelihoods upended is incalculable. But history teaches that, from a science and technology standpoint, we can come out on the right side, and these advancements can help society for the long haul. I will put this in a time capsule and open it in 50 years to see where we ended up.
Saurabh Bagchi is a professor of Electrical and Computer Engineering and Computer Science at Purdue University, where he leads a university-wide center on resilience called CRISP. His research interests are in distributed systems and dependable computing, and he and his group have the most fun making and breaking large-scale usable software systems for the greater good.