
Fending Off Digital Sabotage

Managing the controls of an industrial system.
To defend against attacks on industrial machinery and systems, industry has mainly been focused on bolstering the digital side of its defenses.

Sometime between December 2009 and January 2010, a computer worm dubbed Stuxnet took charge of, and shook to pieces, 1,000 uranium-enrichment centrifuges in an Iranian nuclear facility—with a series of overspeed and underspeed signals fired into them by a breached supervisory control and data acquisition (SCADA) system.

That piece of spectacular nation-state sabotage brought into sharp relief the risks critical infrastructure and heavy industrial plants could face from precision malware built using a combination of zero-day vulnerabilities and deep insider knowledge: attackers need only breach the traditional cybersecurity defenses of such facilities to wreak havoc.

Since Stuxnet's wakeup call, industrial security engineers have been developing a battery of techniques designed to defend industrial equipment, and the products some of it is used to manufacture, from attacks in which malware is used to destroy physical machinery: so-called "cyberphysical" attacks.

One engineering team has developed a technique in which critical infrastructure—such as the steam turbines generating electricity in power stations, or the municipal pumps feeding domestic water supplies—uses sacrificial mechanical parts that fail under attack conditions like overspeed, allowing the machine to fail safe without being wrecked.

Another research group has worked out how to sense whether carbon composite parts in airliners and wind turbines may have been fatally weakened by a cyberphysical attacker during manufacture, while yet another team has worked out the telltale signals that might reveal three-dimensionally-printed safety-critical components have been sabotaged, too.

Wormed by a mole

All these developments have been revealed in a series of recently published research papers, and all cite the way Stuxnet worked as the chief spark for their work. Why is that?

Stuxnet was a devious but ingenious computer worm that was said to be a joint project of U.S. and Israeli intelligence, commissioned by U.S. President George W. Bush and later continued by the Obama administration. The worm was allegedly snuck, via a simple USB stick, into the air-gapped nuclear fuel facility in Natanz, Iran, where it set about invading PCs and programmable logic controllers, repeatedly accelerating and then slowing the rotation of the centrifuges used to enrich uranium hexafluoride gas. How that USB stick got into the facility is a matter of some conjecture: some reports say such sticks were simply left in cafes near the plant as part of a social engineering attack, but more recently a mole working for Dutch intelligence at Natanz has been credited with the USB injection part of the attack.

Using no fewer than four zero-day Windows vulnerabilities, Stuxnet caused those centrifuges (whose drums spin at supersonic speeds) to endure excessive torques that led to 1,000 of the machines shattering, an event witnessed by astonished inspectors at the United Nations' International Atomic Energy Agency in Vienna, who saw the wreckage being carted away while monitoring the plant on CCTV.

To defend against such attacks on machinery, since Stuxnet, industry has mainly been focused on bolstering the digital side of its defenses. "Most of the employed methods now are on the cybersecurity side, hardening the manufacturing floor against cyberattacks, and working out how to detect malicious and anomalous behavior in cyberspace," says Mark Yampolskiy, a computer scientist specializing in advanced manufacturing systems at Auburn University in Alabama.

However, he says, given the financial and personnel resources that "state actors" can assign to developing sophisticated attacks (Symantec estimated that Stuxnet's complexity meant it could have taken 30 programmers six months to develop), organizations cannot simply sit idle in the belief that their digital defenses cannot be breached. "Such actors have proven capable of bypassing even the most sophisticated cybersecurity measures," says Yampolskiy. "Consequently, one or more extra lines of defense are needed."

Digital defense is not enough

Others agree that cyberdefense alone is no longer sufficient. At Ben-Gurion University of the Negev (BGU) in Beer Sheva, Israel, mechanical engineer Jacob Bortman says it has taken the decade since Stuxnet for organizations "to realize that the conventional protection methods are not enough." They need "out of the box" thinking, he says, to develop new methods of protection for critical equipment.

Research teams including Bortman, Yampolskiy, and their colleagues have revealed a clutch of novel physical protection measures that could be used alongside cyberdefenses to protect machinery from suffering the same fate as Iran's uranium centrifuges.

At BGU, a team led by Dmitri Gazizulin and Bortman has taken the notion of an electrical fuse and applied a mechanical analog of it to rotating machinery, in the form of what they call a "fuse bearing." In a paper in the International Journal of Critical Infrastructure Protection published in June, they revealed that, just as an electrical fuse burns out to protect a machine from overcurrent, their bearing is designed to fail before a malware attack can force a critical rotary system's drive, drum, camshaft, or axle into an overspeed or excessive-vibration condition.

Gazizulin explains that the fuse bearing must not be part of the critical rotating drive itself, but simply mechanically connected to it, running alongside it at the same speed and vibration levels. That way, no malware can affect it.

The fuse bearing will have a specially designed defect (a crack, basically) built into the inner surface of its outer ball bearing race such that a certain level of speed or vibration smashes the bearing. This bearing breakage is sensed and used to shut down the main machine safely before it can suffer any expensive damage. "The fuse bearing should fail, and detection algorithms we have developed should detect the failure and stop the critical system from deterioration," says Gazizulin.

In a uranium centrifuge, for instance, BGU's fuse bearing would have a defect designed to break before the drum torque got too high, allowing an orderly, safe shutdown to be instituted. Although today's critical machinery often has protective sensors to shut it down, says Gazizulin, these "integrated protective layers can be bypassed, or attacked" by malware, which is not possible with the purely mechanical fuse bearing. "Our mechanical protective layer is not connected to the main controlling system, so it cannot be bypassed or attacked in a conventional way," he says.
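The article does not spell out the team's detection algorithms; conceptually, though, the protection amounts to a trip test on the sacrificial bearing's vibration signature followed by a hard-wired safe stop. The minimal sketch below is purely illustrative: the 4.0 mm/s threshold, the sample values, and the function names are assumptions, not the BGU algorithm.

```python
# Illustrative sketch only (not the BGU team's published algorithm):
# a simple threshold trip on the fuse bearing's vibration level that
# commands a safe stop of the protected machine once the bearing breaks.
from typing import Callable, Iterable

def monitor_fuse_bearing(rms_samples: Iterable[float],
                         safe_stop: Callable[[], None],
                         trip_threshold: float = 4.0) -> bool:
    """Scan vibration readings from the fuse bearing; trip once it fails."""
    for rms in rms_samples:
        if rms > trip_threshold:   # the engineered defect has let the bearing fail
            safe_stop()            # command a hard-wired safe stop of the main drive
            return True
    return False

if __name__ == "__main__":
    # Simulated run: vibration rises sharply as the fuse bearing breaks up.
    samples = [0.8, 0.9, 1.1, 2.6, 5.3]   # mm/s RMS, hypothetical values
    monitor_fuse_bearing(samples, lambda: print("safe stop commanded"))
```

The key design point, as Gazizulin notes, is that such trip logic runs on dedicated hardware with no connection to the networked control system that malware could reach.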

At Auburn, Yampolskiy agrees such a mechanical fuse "would be immune" to cyberattacks. "However, modern manufacturing systems are not limited to mechanical components. Just think about machines that use lasers or electron beams for cutting or melting of material; these would require different kinds of 'fuses'," he cautions.

Sabotaging carbon fiber structures

Meanwhile, for the person in the street, Yampolskiy and his colleague Kuang-Ting Hsiao at the University of South Alabama in Mobile are addressing what is perhaps a far scarier prospect: the chance that their holiday jet might be sabotaged to fall apart in midair.

In another paper in the same journal, the researchers reveal how they have worked out an "optimal sabotage attack" on the automated equipment used to produce carbon-fiber-reinforced plastic (CFRP) composites, the latter-day material of choice for load-bearing aircraft parts such as wings, control surfaces (flaps, ailerons), and horizontal and vertical stabilizers, and for wind turbine blades. For example, the Boeing 787 aircraft is 50% CFRP by weight, while the Airbus A350 XWB is 53% CFRP.

CFRP is particularly useful because it can be made stronger in the directions in which it really needs to be toughest, by choosing the direction in which neighboring sheets of carbon fibers are made to crisscross each other when they are stacked and bonded together in an epoxy matrix (called a laminate) by automated tools.

In their paper, Yampolskiy and Hsiao posit an attacker could use malware to change the directions of various fiber "plies," and so invisibly degrade the strength of manufactured parts.

Gaming aviation's holy grail

It is holy writ in aviation that all load-bearing structures must withstand 1.5 times any force the aircraft is ever likely to meet in flight, whether that force derives from g-forces pulled in clear air turbulence, say, or in a sudden, heavy hail shower. Yampolskiy and Hsiao have shown how an attacker could make small, invisible manipulations that reduce that safety factor from 1.5 down to 1.0, 0.9, or 0.8, levels at which parts are far more likely to fail in service. They found the ply direction deviations needed to achieve this can be "astonishingly small," between 11° and 15°.
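In simplified terms (a back-of-the-envelope view, not the authors' model), the safety factor compares the load at which a part actually fails with the worst load it is expected to see in service:

```latex
% Simplified safety-factor relation (illustration only, not the paper's analysis)
%   n        safety factor
%   F_fail   load at which the part actually fails
%   F_limit  worst-case load expected in service
n = \frac{F_{\mathrm{fail}}}{F_{\mathrm{limit}}}
```

Certification demands n of at least 1.5; a sabotaged ply layup quietly lowers the failure load, and once n drops below 1 the part can break under loads it was certified to survive.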

A determined attacker with inside knowledge could take the safety factor lower still. "An expert in the field can push the safety factor as low as possible if the part's quality assurance procedure is known and they devise a precise plan that will not alert the QA team," says Hsiao.

On the (slightly) cheerier side, the Alabama and Auburn researchers have suggested a way in which QA teams might be able to spot digitally sabotaged manufactured parts: by their resonant frequency, when they are subjected to vibration. Fiber ply directions in the laminate affect its stiffness, and so its response to vibration, so a known good part will have a specific resonant frequency. The team's tests show the resonant frequency will vary by as much as 10% if sabotage is attempted.
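The article does not give an acceptance criterion for such a check; as a purely hypothetical illustration (the 2% acceptance band is an assumption, as is the function name), a resonance-based QA test could compare each part's measured resonant frequency against the known-good baseline:

```python
# Hypothetical illustration only, not the researchers' QA procedure:
# pass a laminate part only if its resonant frequency sits within a
# tight band around the known-good baseline.
def resonance_check(measured_hz: float, baseline_hz: float,
                    tolerance: float = 0.02) -> bool:
    """Return True if the part's resonance deviation is within tolerance."""
    deviation = abs(measured_hz - baseline_hz) / baseline_hz
    return deviation <= tolerance

# The paper reports sabotage can shift resonance by as much as ~10%,
# which a 2% acceptance band would catch:
print(resonance_check(measured_hz=108.0, baseline_hz=120.0))   # False, ~10% low
print(resonance_check(measured_hz=119.4, baseline_hz=120.0))   # True, ~0.5% off
```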

As machine intelligence advances, Hsiao grows less optimistic about detecting such sabotage: "It might still be possible for an attacker to devise a lamination design that satisfies the test, with a part that still has hidden weak points that go undetected; especially if AI is being used to assist in optimizing the sabotaged designs to pass the tests."

Current affairs

While that work on second-guessing composite impairment attacks and their detection is ongoing, yet another sabotage detection project Yampolskiy is involved in—with BGU and South Alabama—takes the work further into the realm of Industry 4.0. The subject of that project is additive manufacturing (AM), the professional version of three-dimensional (3D) printing in which even heavy metal parts can be made on printers costing $100,000 and more.

Because some of the parts made with AM technology are safety-critical—such as fuel injectors for jet engines, medical implants, and rocket nozzles for spacecraft—it is seen as vital that ways to sense ongoing sabotage are developed. The researchers believe they have found one: by measuring the electric current delivered to each of the printer's motors, they can derive a current signature for a known good part and reliably detect 3D print runs that deviate from it.
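The published detection method is doubtless more sophisticated, but the core idea is a comparison against a reference signature. In the illustrative sketch below, the normalized RMS metric, the 5% threshold, and the function name are all assumptions rather than the researchers' technique:

```python
# Illustrative comparison only (not the published detection method):
# flag a print run whose motor-current trace strays too far from the
# "golden" signature recorded during a known-good build.
import math

def current_trace_deviates(trace: list[float], golden: list[float],
                           rms_threshold: float = 0.05) -> bool:
    """Return True if the run's current trace deviates from the golden signature."""
    if not golden or len(trace) != len(golden):
        return True   # no usable reference, or an altered trace length
    rms_err = math.sqrt(sum((a - b) ** 2 for a, b in zip(trace, golden)) / len(golden))
    scale = math.sqrt(sum(g ** 2 for g in golden) / len(golden)) or 1.0
    return (rms_err / scale) > rms_threshold
```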

This is something of a rare breakthrough in the emerging field of cyberphysical systems security, one that needs to be repeated manyfold if Industry 4.0 is to thrive. Yet that will require a fresh focus, says Yampolskiy: "The majority of people who talk about security still focus on cybersecurity only. But we are dealing with systems that manufacture physical objects, so we have to be worried about cyberphysical attacks, many of which can simply bypass all cybersecurity-only measures."

"Security is considered to be something that can be attached later. Unfortunately, we have seen how this worked out in other areas."

Paul Marks is a technology journalist, writer, and editor based in London, U.K.
