Opinion
Inside Risks

Trustworthy Systems Revisited


System trustworthiness is, in essence, a logical basis for confidence that a system will predictably satisfy its critical requirements, including information security, reliability, human safety, fault tolerance, and survivability in the face of a wide range of adversities (such as malfunctions, deliberate attacks, and natural causes).

Our lives increasingly depend on critical national infrastructures that rely, to varying degrees, on the dependable behavior of computer-communication resources, including the Internet and many of its attached computer systems. Unless certain information system resources are trustworthy, our critical systems are at serious risk from failures and subversions. Unfortunately, for many of the key application domains, the existing information infrastructures are lacking in trustworthiness. For example, power grids, air-traffic control, high-integrity electronic voting systems, the emerging DoD Global Information Grid, other national infrastructures, and many collaborative and competitive Internet-based applications all need systems that are more trustworthy than those we have today.

In this column, we have frequently considered risks associated with such systems and what is needed to make them more trustworthy. This month’s column takes a higher-level and more intuitive view by considering analogies with our natural environment—expectations for which are rather similar to expectations for trustworthy information systems. For example, pure air and uncontaminated water are vital, as are the social systems that ensure them.

Although poorly chosen analogies can be misleading, the analogy with our natural environment seems quite apt. Each of the following bulleted items is applicable to both trustworthy information systems and natural environments.

  • Their critical importance is generally underappreciated until something goes fundamentally wrong—after which undoing the damage can be very difficult if not impossible.
  • Problems can result from natural circumstances, equipment failures, human errors, malicious activity, or a combination of these and other factors.
  • Dangerous contaminants may emerge and propagate, often unobserved. Some of these may remain undetected for relatively long periods of time, whereas others can have immediately obvious consequences.
  • Your well-being may be dramatically impeded, but there is not much you as an individual can do about aspects that are pervasive—perhaps international or even global in scope.
  • Detection, remediation, and prevention require cooperative social efforts, such as public health and sanitation efforts, as well as technological means.
  • Up-front preventive measures can result in significant savings and increases in human well-being, ameliorating major problems later on.
  • Once something has gone recognizably wrong, countermeasures are typically fruitless—too little, too late.
  • As we noted in the June 2004 column, long-term thinking is relatively rare. There is frequently little governmental or institutional emphasis on prevention of bad consequences.
  • Many of the arguments against far-sighted planning and proactive remediation are skewed, being based on faulty, narrowly scoped, or short-sighted reasoning.
  • Commercial considerations tend to trump human well-being, with business models sometimes considering protection of public welfare to be detrimental to corporate and enterprise bottom lines.

In some contexts, pure water is becoming more expensive than oil. Fresh air is already a crucial commodity, especially for people with severe breathing and health problems. Short- and long-term effects of inadequately trustworthy information systems can be similarly severe. Proactive measures are as urgently needed for system trustworthiness as they are for breathable air, clean water, and environmental protection generally. It is very difficult to remediate computer-based systems that were not designed and implemented with trustworthiness in mind. It is also very difficult to remediate serious environmental damage.

Anticipating and responding to compelling long-term needs does not require extraordinary foresight, whether for air, water, reversing global warming, or trustworthy systems upon which to build our infrastructures. Our long-term well-being—perhaps even our survival—depends on our willingness to consider the future and to take appropriate actions.
