BLOG@CACM

Securing the Future of Computer Science; Reconsidering Analog Computing

The Communications Web site, http://cacm.acm.org, features more than a dozen bloggers in the BLOG@CACM community. In each issue of Communications, we'll publish selected posts or excerpts.

Follow us on Twitter at http://twitter.com/blogCACM

http://cacm.acm.org/blogs/blog-cacm

Mark Guzdial sees hope in computer science education efforts in the U.K. Daniel Reed suggests we should not be so quick to discard analog computing.

Mark Guzdial "The U.K. is Taking Steps to Improve Computing Education in Schools"

http://cacm.acm.org/blogs/blog-cacm/132934-the-uk-is-taking-steps-to-improve-computing-education-in-schools/fulltext
September 28, 2011

Google Executive Chairman Eric Schmidt criticized the lack of computing education in U.K. schools in a recent speech in Edinburgh: "I was flabbergasted to learn that today computer science isn’t even taught as standard in U.K. schools. Your IT curriculum focuses on teaching how to use software, but it doesn’t teach people how it’s made. It risks throwing away your great computing heritage."

Schmidt went on to lament the growing divergence between science and arts and called on educators to "reignite children’s passion for science, engineering, and math."

A recent issue of The Economist raised the question: "Where is Britain’s Bill Gates?" Two of ACM’s leaders in computing education, Eric Roberts of Stanford University and Andrew McGettrick of the University of Strathclyde, wrote a letter in reply to help answer that question:

British universities produce too few graduates with the special software-development skills that drive the high end of the industry. Universities in Britain find it harder than their American counterparts to develop innovative teaching and curriculums because of national benchmarks that are often highly prescriptive. Such benchmarks force universities to rely on written exams to measure achievement, which can undermine the all-important spirit of innovation and creativity. Written exams are rarely the best measure of software expertise.

Speaking last year to students at Stanford, Mark Zuckerberg said that he likes hiring Stanford graduates because "they know how to build things." If British universities could focus more of their attention on teaching students to write applications at the leading edge of the technological revolution, the budding Bill Gateses of Britain would have an easier time of it.

Fortunately, computing educators in the U.K. can point to a couple of areas of real progress. The first is a recently announced effort to teach software development in U.K. schools. The new initiative is welcomed by the British Computer Society and is supported by IT companies like Microsoft, IBM, Cisco, and HP.

The second step may have even greater impact. The Royal Society, the world’s oldest scientific organization, is expected to release a report on computing in schools in the next few months. The report is expected to call for more computer science education in the primary and secondary grades, and, coming from The Royal Society, it should get real attention.

Meanwhile, in the U.S., we are still struggling to get significant computer science into the nation’s schools. It is a hard problem because the U.S. education system is so decentralized: primary and secondary education is defined in 51 places (each of the 50 states, plus the District of Columbia). The Common Core standards (a set of education standards coming from the nation’s governors, not from the federal government) have now been finalized. Unfortunately, computer science did not end up being part of those standards, despite the efforts of the "Computing in the Core" coalition. While that was disappointing, a new bill, the Computer Science Education Act, was just introduced in the U.S. Congress to bolster K–12 computer science education. Part of the act is an effort to help each state develop its own computing education programs.

Many of us in the U.S. will be watching developments in the U.K. carefully. We will be eager to see how their efforts to improve computing education fare, and then to apply the lessons learned here in the U.S.


Daniel Reed "Analog Computing: Time for a Comeback?"

http://cacm.acm.org/blogs/blog-cacm/135154-analog-computing-time-for-a-comeback/fulltext
October 8, 2011

In the early days of the automobile, there was a lively competition among disparate technologies for hegemony as the motive power source. Steam engines were common, given their history in manufacturing and locomotives, and electric vehicles trundled through the streets of many cities. The supremacy of the internal combustion engine as the de facto power source was by no means an early certainty. Yet it triumphed due to a combination of range, reliability, cost and safety, relegating other technologies to historical curiosities.

Thus, it is ironic that we are now assiduously re-exploring several of these same alternative power sources to reduce carbon emissions and dependence on dwindling global petroleum reserves. Today’s hybrid and electric vehicles embody 21st century versions of some very old ideas.

There are certain parallels in computing to this phylogenic recapitulation of the automobile. Perhaps it is time to revisit some old ideas.


Analog History

Use of the word "computer" conjures certain images and brings certain assumptions. One of them, so deeply ingrained that we rarely question it, is that computing is digital and electronic. Yet there was a time not so long ago when those adjectives were neither readily assumed nor implied when discussing computing, just as the internal combustion engine was not de rigueur in automobile design.

The alternative to digital computing—analog computing—has a long and illustrious history. Its antecedents lie in every mechanical device built to solve some problem in a repeatable way, from the sundial to the astrolabe. Without doubt, analog computing found its apotheosis in the slide rule, which dominated science and engineering calculations for more than three centuries, coexisting and thriving alongside the latecomer, digital computing.

The attraction of analog computing has always been its ability to accommodate uncertainty and continuity. As Cantor showed, the real numbers are uncountably infinite, and their discretization in a floating-point representation is fraught with difficulty. Because of this, the IEEE floating-point standard is a delicate and ingenious balance between range and precision.
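
To make that range-versus-precision balance concrete, here is a minimal illustrative sketch in Python (an editorial example, not from the original post): the 32-bit and 64-bit IEEE 754 formats divide their bits differently between exponent (range) and fraction (precision), and neither can represent even a value as simple as 0.1 exactly.

    # Rough sketch: how IEEE 754 trades range against precision.
    import numpy as np

    for t in (np.float32, np.float64):
        info = np.finfo(t)
        print(t.__name__, "largest value:", info.max, "spacing at 1.0:", info.eps)

    # Discretizing the continuum: the nearest double to 0.1 is not 0.1.
    print(f"{0.1:.20f}")      # 0.10000000000000000555...
    print(0.1 + 0.2 == 0.3)   # False: tiny rounding errors accumulate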

All experimental measurements have uncertainty, and quantifying that uncertainty and its propagation in digital computing models is part of the rich history of numerical analysis. Forward error propagation models, condition numbers, and stiffness are all ways of characterizing this uncertainty and continuity.
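
To give a concrete sense of what a condition number measures, consider the small hypothetical Python sketch below (the Hilbert matrix is just a stock ill-conditioned example, not something from the post): an input perturbation of roughly one part in a billion can be amplified by many orders of magnitude in the computed solution.

    # Sketch: a large condition number means small input uncertainty
    # can be amplified enormously in the output.
    import numpy as np

    n = 8
    H = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])  # Hilbert matrix
    x_true = np.ones(n)
    b = H @ x_true

    print("condition number:", np.linalg.cond(H))   # roughly 1e10 for n = 8

    # Perturb the right-hand side by roughly one part in a billion...
    rng = np.random.default_rng(0)
    db = 1e-9 * np.linalg.norm(b) * rng.standard_normal(n)
    x_noisy = np.linalg.solve(H, b + db)

    # ...and the relative error in the solution grows by many orders of magnitude.
    print("relative input error: ", np.linalg.norm(db) / np.linalg.norm(b))
    print("relative output error:", np.linalg.norm(x_noisy - x_true) / np.linalg.norm(x_true))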


Hybrid Futures

I raise the issue of analog computing because we face some deep and substantive challenges in wringing more performance from sequential execution and the von Neumann architecture model of digital computing. Multicore architectures, limits on chip power, near-threshold voltage computation, functional heterogeneity, and the rise of dark silicon are forcing us to confront fundamental design questions. Might analog computing and sub-threshold computing bring some new design flexibility and optimization opportunities?

We face an equally daunting set of challenges in scientific and technical computing at very large scale. For exascale computing, reliability, resilience, numerical stability, and confidence become problematic when input uncertainties propagate and single or multiple bit upsets can disturb numerical representations. How can we best assess the stability and error ranges of exascale computations? Could analog computing play a role?
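
As one small illustration of the bit-upset concern (a hypothetical Python sketch, not anything from the post), flipping a single bit in the IEEE 754 encoding of a double can change its value by anything from one unit in the last place to more than 150 orders of magnitude:

    # Sketch: the effect of a single-bit upset on a 64-bit float.
    import struct

    def flip_bit(x: float, bit: int) -> float:
        """Return x with one bit of its IEEE 754 binary64 encoding flipped."""
        (bits,) = struct.unpack("<Q", struct.pack("<d", x))
        return struct.unpack("<d", struct.pack("<Q", bits ^ (1 << bit)))[0]

    x = 1.0
    print(flip_bit(x, 0))    # lowest fraction bit flipped:  1.0000000000000002
    print(flip_bit(x, 52))   # lowest exponent bit flipped:  0.5
    print(flip_bit(x, 61))   # a high exponent bit flipped:  about 7.5e-155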

Please note that I am not advocating a return to slide rules or pneumatic computing systems. Rather, I am suggesting we step back and remember that the evolution of technologies brings new opportunities to revisit old assumptions. Hybrid computing may be one possible way to address the challenges we face on the intersecting frontiers of device physics, computer architecture, and software.

A brave new world is aborning. Might there be a hybrid computer in your hybrid vehicle?


Readers’ comments

Digital computers have one fatal shortcoming—they use a clock (or trigger) to change from one state to another. During (or absent) a clock trigger, they are deaf, dumb, and blind. The more precision in time that is demanded, the more state cycles are forced. Analog devices operate by deriving their end functions from input conditions without having to walk through the state changes to get there. If we can ever get over the fact that precision is not accuracy, analog systems may make a comeback. (Digital air data computers are one of the logical absurdities—aerodynamic data has no business going from analog physics through digital math to analog output (control position) when none of the steps in between have any need of step-wise state change computation.)

—James Byrd

It may not be known to this community, but some of us have been doing what Daniel Reed’s article suggests. A single-chip analog computer that can solve differential equations up to 80th order, often faster than a digital computer and without any convergence problems, was described a few years ago: see G. Cowan, R. Melville, and Y. Tsividis, "A VLSI analog computer/digital computer accelerator", IEEE Journal of Solid-State Circuits, vol. 41, no. 1 (January 2006), pp. 42-53. There is a lot more that can be done in this area.

—Yannis Tsividis, Columbia University

