From the President

‘But Officer, I Was Only Programming at 100 Lines Per Hour!’

Vinton G. Cerf
ACM President and Google Inc. Vice President and Chief Internet Evangelist

It’s summer and I thought it might be worth stirring up some thunder and lightning with another controversial topic. As many readers know, there is a category of engineer that is sometimes called "Registered" or "Professional." This is not to say that engineers without such registration are not professionals; rather, the terms turn out to be specifically and legally meaningful. While the procedures and practices vary from state to state in the U.S. (and elsewhere), there is often an exam to pass or at least a questionnaire to fill out and a registration step of some kind that authorizes the engineer to label herself or himself as "Professional."

Among the privileges granted to professional engineers is the right to undertake certain kinds of projects, to certify designs, to oversee design and implementation, and so on. These projects are often large, and their failure could create life-threatening conditions. Examples include designs for bridges, buildings, aircraft, roads, railways, levees, dams, and hydroelectric and nuclear power-generation systems. Indeed, certain medical devices may require such oversight and certification for use. The reason for this attention to qualification and certification is obvious: public safety is involved one way or another.

I am sure some of you will have figured out where I am going with this. I think we would all agree that software (and its underlying hardware) is increasingly a part of our daily lives in large and small measure. The programs that control the car engine, the controllers for appliances (like water heaters, heating and cooling systems, washers and dryers, ovens, and other fairly large-scale and power-consuming devices), and financial management and trading systems are potentially hazardous if they have errors that lead to excessive load, mechanical malfunction, unexpected behaviors, and so on.

One wonders whether legislative bodies and perhaps insurance companies will reach conclusions that suggest software designers and implementers should have similar regulatory requirements, authorizations, and certifications imposed upon them. I once made a living writing or designing software for IBM, MCI, and the U.S. government, among others. Some of that software, or at least its design, was pretty widely used (for example, MCI Mail, the Internet, and time-sharing services). With the increasing incidence of hacking, denial-of-service attacks, penetration, malware infection, and botnet creation, one might be tempted to argue that software developers should bear more responsibility for the behavior of their software.

I think many of you would agree that a test or questionnaire is not likely to provide assurance that a "certified professional" programmer’s work is free of flaws. I am not even sure what sort of test could be proposed that would provide such assurance. But it is very tempting to imagine that a balance between certification and liability is worth some consideration. How can we motivate a well-intended programmer to reduce the risk to users of his or her software? What incentives would companies that create or market software have to reduce liability by appointing a "certified software engineer" to take responsibility for the safety and reliability of the software?

Plainly, one would expect any kind of licensing to produce some kind of safe harbor for both the employer and the employee. Perhaps a reduction in insurance premiums?

In the past, I have had no problem convincing myself that professional licensing along these lines is a dumb idea. But we do have certification for users or maintainers of certain kinds of software or equipment (for example, Microsoft software or Cisco equipment). I take pride in believing that I, and many colleagues, see ourselves as professional in spirit if not in name and that we strive to deliver reliable code. I also believe that no one really knows how to write absolutely bug-free code, especially if the input and operating state space is enormously large. So accepting liability that the code won’t break or be broken is a pretty scary thought.
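To make the scale of that state space concrete, consider a back-of-the-envelope sketch: assume a program with a single 64-bit input and a hypothetical test harness that checks one billion inputs per second.

    # Back-of-the-envelope: exhaustively testing a single 64-bit input.
    # Assumptions (illustrative only): one test per distinct input value,
    # and an assumed harness throughput of one billion tests per second.

    SECONDS_PER_YEAR = 365.25 * 24 * 3600

    inputs = 2 ** 64               # distinct values of one 64-bit input
    tests_per_second = 1e9         # assumed harness throughput

    years = inputs / tests_per_second / SECONDS_PER_YEAR
    print(f"{years:,.0f} years")   # prints 585 years -- for ONE input word

And even this wildly optimistic scenario ignores internal program state, concurrency, and the operating environment, each of which multiplies the space further.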

We are so dependent on an increasing number of programs, large and small, that it is difficult to believe the software profession will escape some kind of deep accountability in the future. We may avoid this in the near term, but I am not so sure about the long term.

OK, I have my cast-iron three-piece suit in place… What do you think?

Vinton G. Cerf, ACM President
