https://bit.ly/2yTFYCX May 29, 2020
The COVID-19 pandemic highlights the analogy that led us to borrow the word "virus" from biology to label a malicious program that attacks computer systems. The situation moves us to look into that analogy, as another way to compare nature and artifact, and as an excuse to raise more abstract questions. We are moved also to stipulate that our mastery of both the biological and computational forms is shallow, and to invite other, better observations to follow. See Apvrille and Guillaume1 for greater depth and intriguing crossover speculation, Weis11 for yet more intriguing comparison, and Wenliang Du's website for detailed virus examples,3 which constitute dramatic reading for coders.
A virus is generally not regarded as a living organism, but sometimes described as (similar to) software. When the first self-replicating computer programs made the rounds, they were experiments or pranks;12 for most, the point was solely reproduction. An early computer worm was beneficent, but escaped control.2
We distinguish computer viruses from computer worms by the profligate scale of replication: a virus generates a broadcast of copies rather than a chain of copies. The obvious points of analogy across both types of virus are these: a virus is tiny, invading a host far greater in size and complexity without an overt signal, and it disrupts some process in the host. Neither the computer virus nor the biological virus necessarily does damage. In biology, self-replication is an end, not a means, making the damage a side-effect. In the modern computer virus, the end is likely to be the action of a payload of malicious code. Now the term "virus," in both environments, connotes an intrusive and damaging force carrying dangerous baggage.
To explore some points of analogy systematically, consider access: How is virus entry accomplished? Computer viruses look for an opening by probing known vulnerabilities; if one is found, maleficent code is injected. This is quite like the organic version.
Consider gain: What does the virus get out of this, and how? The virus gets more virus, and the means of reproduction is the same—self-replication. Note the correspondence to the Unix fork() system call, which spawns a new process by replicating the current process. The history tells us that this happened because it was easy: "… it seems reasonable to suppose that it exists in Unix mainly because of the ease with which fork could be implemented without changing much else."8 The heuristic, across both types: To start a new working structure, just copy the working structure on hand.
Consider pathology, the means of damage. A virus damages the host body by depleting cell resources, consumed by the virus; by bursting the cell membrane; or by generating toxic byproducts. Does each of these have an analogy? Sure—denial of service; breaching buffer boundaries or opening a reverse shell; interference with the operating system, degrading its protection of system resources such as CPU cycles, files, and ports.2,10
We could consider defense, the host's prevention or cure mechanism, that is, the action taken if the host somehow notices that something is wrong. That panoply of fascinating mechanisms is beyond our expertise, but it is clear vaccination is one of them, provoking countermeasures such as mutation. Both organic and computer viruses can mutate quickly. But mutation in organics is a quirk, a random unguided alteration; mutation in computer programs is human-directed. Conversely, the brute-force options for repair and defense in computing are off limits in biology: we can't reboot a body to reset its memory, let alone re-install a clean operating system.
Viruses have been described as troops in a war game, initiating and reacting, as they take over cells for the purpose of replication. But wait—Is there a purpose? All we can say for sure is that viruses insert genetic material into cells, which causes the cells to generate more viruses. Is there a struggle? Is control being deliberately wrested from the cell, or is there actually no agent involved that gives a hoot, no intention at all? The vocabulary of aggression in cell science (layperson's version) reflects our human phenomenology, projected onto what we see. It may be fair, or it may be distorted. It may be way off the mark; the cells might be "fulfilled"—an odd thought. But why is it less odd to say the cells are "defeated"? Why use the language of attack, when the language of hospitality (or indifference) might model the process just as well (the language of indifference, even better)? Previously, we said that in biology, damage is a "side-effect," which assumes some kind of intention. We now question that assumption. Other natural forces bring about change; the wind threatens, intrudes, and damages, but to speak of its intention is only poetic.
In computing, similarly, a computer virus executes in order to create more copies of its code and then disseminate them. Does that statement of the analogy, through the phrase "in order to," lead us to attribute volition to the computer virus, inaccurately? We claim it is misleading to speak as if the organic virus has volition. A computer virus, however, imbued with purpose by its programmer, does exhibit hostility. But wait. That means the computer virus is more like the organic virus than the organic virus itself!
Of course, the question of volition, seen here on a small scale, bears on larger questions in the philosophy of computing as well, those in artificial intelligence and cognitive science connected to intentionality and consciousness. That inquiry could be aided by a new locution for computer virus, which might even inform a new locution for organic virus. My earlier article "Articulation of Responsibility"7 called for such locutions.
Programs do not make decisions. Because it looks like they do, we need a way to talk about what they actually do that is not misleading. Viruses do not "intend" in any meaningful way; they just behave as if they were intending. Or perhaps they don't even "behave" in any particular way; they just exhibit actions that intentional beings would exhibit if they had as a goal the end-result reached by the virus. We are so dependent on the vocabulary of intention and volition that we have no other non-awkward options.
Analogies between natural and computational phenomena, tight or stretched, have formed the subjects of several pieces in this space.4,5,6 In the case before our eyes, we see the analogy between the biological virus and the computer virus exhibits strengths and weaknesses, and may offer further possibilities. Points of positive similarity may not be due to cause and effect, but rather to effects of some common cause, something like the general vulnerability of processes that use input and output. We might even propose that the proper analogue to the biological microbe is the programmer-code pair, a self-contained system that lies between the program and the programmer, enjoying some kind of collective semi-animate agency. We can turn to philosophy to ask—Do agents have to be individual and human? That's debatable,9 but beyond the scope of this inquiry.
But wait. The really interesting question is what a strong successful analogy, matching computer viruses to organic viruses, would mean. Does it mean that some common notions—say, the general vulnerability of input (as mentioned previously), or entry through a defined interface, or subversion of an external body's resources—are somehow universal? If so, have we gained anything beyond a pleasant self-validation? But wait! What does validation get us, anyway? Are we computer scientists to congratulate ourselves when our artifacts look like nature? What's so great about that? Or is there something great about that? If so, what's not so great about artifice?