Hal Berghel manages to write several pages about "The Discipline of Internet Forensics" (Digital Village, Aug. 2003) without ever addressing the purpose of forensic science as a whole. Fundamentally, it is science in the service of the law, and its main output is evidence sufficiently robust to withstand scrutiny in the courts.
Such output must be admissible, fully testable, and, in a jury trial, persuasive to jurors. Any tools deployed must be more than aids to network engineering (or to hacking); they must be rigorously tested and must produce unambiguous, uncontaminated output.
What makes any branch of forensic computing, under whatever label, different from most other aspects of forensic science is the extraordinary rate of change with which it must cope. Techniques for the analysis of paint, blood, and bullets progress relatively slowly and can be subject to the comparatively leisurely pace of peer review. But new hardware, operating systems, protocols, and applications all appear so rapidly within IT, and are then so widely put to commercial and private use, that conventional peer review and its associated testing may proceed more slowly than the evolution of new criminal modi operandi.
In the U.S. courts, novel scientific and technical evidence is a matter of admissibility, with the judge acting as gatekeeper to safeguard the jury from being "confused" by competing experts (Daubert v. Merrell Dow Pharmaceuticals, Inc., 1993, 509 U.S. 579). Other important aspects of admissibility may include the means by which evidence was acquired—whether the appropriate warrants were issued or a crime was committed during such acquisition. Most European countries also mix in data protection and human rights issues.
The law enforcement impetus for computer forensics is no accident, as the agencies inevitably are the biggest customers. Their needs and experience often dominate the activities of those of us who work primarily in the courts rather than in academic lecture halls and seminar rooms.
Viewed through this lens, Berghel’s pleas for a "profession" to meet Peter Denning’s four "hallmarks" look rather peculiar, and his distinctions between "computer forensics" and "Internet forensics" positively dangerous. If Internet forensics, in his words, "deals with the ephemeral," while computer forensics deals with physical things, addressing formal procedures, authentication, and chain of custody, it is difficult to see the point of his new profession other than as a vaguely amusing "art and mystery."
There is undoubtedly a need for academic contributions to all forms of forensic computing alongside the technicians of law enforcement and commercial vendors. ACM members might ask themselves how they would address some practical problems: reliably preserve the content of a remote Web site; reconstruct a Web page a given person must have seen, even if the page was dynamically created and part of it came from a cache on the person’s own machine or one owned by his/her ISP; and turn the output of an intrusion-detection tool into robust, admissible evidence. Academia might also take a more distant and conceptual view of some of the broader issues of system design for evidence collection and of the nature of "legal reliability."
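To make the first of these problems concrete, the following is a minimal sketch, in Java, of capturing a remote page’s raw bytes together with a note of when and from where they were fetched. The URL and file names are hypothetical, and a genuinely defensible capture would also need to record response headers, a cryptographic hash of the stored bytes, and a trusted time source.

import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.PrintWriter;
import java.net.URL;
import java.net.URLConnection;
import java.util.Date;

// Minimal sketch only: preserve a remote page's bytes exactly as received,
// plus a simple capture note. URL and file names are hypothetical.
public class PreservePage {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://www.example.com/"); // hypothetical target
        URLConnection conn = url.openConnection();

        InputStream in = conn.getInputStream();
        FileOutputStream out = new FileOutputStream("capture.raw");
        byte[] buf = new byte[4096];
        int n;
        int total = 0;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n); // store exactly the bytes received
            total += n;
        }
        out.close();
        in.close();

        PrintWriter log = new PrintWriter(new FileOutputStream("capture.log"));
        log.println("URL: " + url);
        log.println("Fetched: " + new Date());
        log.println("Content-Type: " + conn.getContentType());
        log.println("Bytes: " + total);
        log.close();
    }
}

Even this trivial exercise exposes the evidential question: the page delivered to an investigator’s machine may not be the page the suspect saw, which is precisely why such problems belong on an academic agenda.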
Finally, Berghel might usefully inquire about the many forms of crime in which the various aspects of forensic computing are actually used. It is always interesting to analyze the skills of clever hackers, but meanwhile there are victims everywhere of e-commerce and Internet auction fraud, stalking, theft of company IP, and child abuse on the Internet.
Peter Sommer
London, UK
Author Responds:
Peter Sommer seems to take the view that no study of "forensics" is legitimate if it does not conform to the traditional model of forensic science. While medical forensics and fingerprint identification may conform nicely, Internet forensics does not. That says as much about the narrow scope of "forensic science" as it does about the uniqueness of Internet forensics. The fact that not all of the fruits of Internet forensics have evidentiary value does not diminish its utility either as a law-enforcement tool or as an academic discipline. If we hold Internet forensics to the same evidentiary standard as medical forensics, we’ll never thwart another hacker.
As I write this (Aug. 11, 2003), the W32.Blaster worm has just been unleashed into the world (www.cert.org/current/current_activity.html#msrpcworm). It exploits a weakness in the Distributed Component Object Model interface to Microsoft’s Remote Procedure Call, infecting Windows 2000, XP, and 2003 Server with a variation of the dcom.c exploit. According to the SANS Internet Storm Center (isc.sans.org), port 135 traffic increased 500% on the first day of W32.Blaster.
The following is a small sample of the information Internet forensics professionals have learned about the worm in those first 24 hours:
- Initial access is via TCP port 135 (one of the NetBIOS/SMB ports);
- Propagation involves TCP port 4444 and UDP port 69 (TFTP);
- The binary, msblast.exe, is approximately 11K unpacked and 6K packed, and the MD5 hash of the packed file is 5ae700c1dffb00cef492844a4db6cd69 (a verification sketch follows this list);
- The infection sequence is (a) initial penetration via port 135; (b) deployment of the dcom.c exploit against RPC, creating a remote shell on port 4444; (c) execution of native Microsoft tftp to download the packed worm file; (d) unpacking; (e) addition of a registry key for autostart; and (f) launch, beginning Aug. 16, of a distributed denial-of-service SYN flood attack via port 80 against windowsupdate.com; and
- Code strings present in variants include:
msblast.exe
I just want to say LOVE YOU SAN!!
billy gates why do you make this possible ? Stop making money and fix your software!!
windowsupdate.com
start %s
tftp -i %s GET %s
%d.%d.%d.%d
%i.%i.%i.%i
BILLY
windows auto update
SOFTWARE\Microsoft\Windows\CurrentVersion\Run
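As one small example of how the published hash above would be used, here is a minimal Java sketch that computes the MD5 of a captured binary and compares it with the value reported for the packed msblast.exe; the file path is hypothetical.

import java.io.FileInputStream;
import java.security.MessageDigest;

// Minimal sketch: compare a captured binary against the MD5 published for
// the packed msblast.exe. The file path is hypothetical.
public class CheckSample {
    static final String PUBLISHED_MD5 = "5ae700c1dffb00cef492844a4db6cd69";

    public static void main(String[] args) throws Exception {
        MessageDigest md5 = MessageDigest.getInstance("MD5");
        FileInputStream in = new FileInputStream("suspect/msblast.exe"); // hypothetical path
        byte[] buf = new byte[4096];
        int n;
        while ((n = in.read(buf)) != -1) {
            md5.update(buf, 0, n);
        }
        in.close();

        byte[] digest = md5.digest();
        StringBuffer hex = new StringBuffer();
        for (int i = 0; i < digest.length; i++) {
            // two lowercase hex digits per byte
            hex.append(Integer.toHexString((digest[i] & 0xff) | 0x100).substring(1));
        }
        if (hex.toString().equals(PUBLISHED_MD5)) {
            System.out.println("Sample matches the published packed-worm hash");
        } else {
            System.out.println("Sample does NOT match: " + hex);
        }
    }
}

A matching hash lets responders confirm quickly that they are dealing with the known variant rather than a modified one.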
This is world-class forensics by any measure. Because of it, the damage done by W32.Blaster may be limited to the 150,000 computers initially infected. Within a few hours of detection, virus- and intrusion-detection rule bases and definitions had already been updated. While none of this may be admissible in a court of law, the Internet forensics personnel involved can be content with the realization that they protected the global information infrastructure from collapse.
If their effort doesn’t measure up to Sommer’s standard of legitimate forensics, we may well want to question just how relevant his standard is to the modern information age.
Hal Berghel
Las Vegas, NV
What Software Engineering Can Do for Web Development
Regarding Robert L. Glass’s "A Mugwump’s-Eye View of Web Work" (Practical Programmer, Aug. 2003), some key ideas seem to have been omitted. First, the Web per se is just a universal user interface, so it does not change the paradigm very much from the classical PC user interface.
However, as a result of the Web and the need for rapid application development, a paradigm shift has occurred in the form of application servers. They are prebuilt three-tier systems (Web user interface, application server, and database server) that can be configured to address a particular problem.
With such a system, one does not get the full range of computational capabilities one would get with a system built from scratch. However, it does mean that even a small team can build a system in a matter of days or weeks.
A problem arises when the requirements are more than the application server was designed to address. The configuration is sometimes more difficult than building the system from scratch. And the documentation is often awful.
Meanwhile, although the industry is maturing toward using pre-designed building blocks, some things cannot be built with standard parts. We thus need to understand the standard parts but also recognize when an application requires something else. That’s the challenge for ACM and the academic community. Ultimately, the discipline may look more like civil engineering.
W. Terry Hardgrave
Bethesda, MD
Robert L. Glass asked important questions (Aug. 2003) about Web development, including: What can Web developers learn from the literature of software engineering?
The work my organization does best is not like traditional software development; in terms of time scales and production methods, we are more like a news organization. Nevertheless, our greatest opportunities for improvement involve acquiring the skills of software project management, software craftsmanship, and software process management. The literature of traditional software development has much to teach us.
Glass wrote that a lot of Web development today is done by inexperienced developers, often self-taught. The focus in hiring is usually on languages known, not software craftsmanship. Even more significant is that the organizations themselves are inexperienced. Along with quick turnover and layoffs, inexperience inhibits development of corporate memory.
A notable result is that quality-assurance departments are often small or nonexistent—despite the difficulties of developing Web applications due to the variety of clients, firewalls, proxies, and other factors.
In Web development, updating software is easy because it lives on only a few servers rather than on every client machine. This is an important benefit but also involves special costs. When any given defect can be fixed at once, bug tracking can be neglected, denying the organization the fundamental source of data for quality management. There is more to version control than CVS. Easy release of changes could facilitate refactoring, but more often it facilitates bit rot in the code base. A combination of tight deadlines, inexperienced developers, and inexperienced organizations means that architectural decisions can be undermined quickly.
Until recently, it was also difficult to test Web applications, complicating, and even preventing, software process management. The newer toolkits, including HtmlUnit in Java and WWW::Mechanize in Perl, make automated testing much more feasible.
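To illustrate, here is a minimal sketch using the HtmlUnit toolkit just mentioned; the URL and expected page title are hypothetical, and the exact API details vary with the HtmlUnit version in use.

import com.gargoylesoftware.htmlunit.WebClient;
import com.gargoylesoftware.htmlunit.html.HtmlPage;

// Minimal sketch of an automated check with HtmlUnit: fetch a page through a
// simulated browser and verify its title. The URL and expected title are
// hypothetical; a real suite would also fill in forms, follow links, and
// assert on page content.
public class SmokeTest {
    public static void main(String[] args) throws Exception {
        WebClient client = new WebClient();
        HtmlPage page = (HtmlPage) client.getPage("http://localhost:8080/app/"); // hypothetical URL
        if (!"Welcome".equals(page.getTitleText())) {
            throw new RuntimeException("Unexpected title: " + page.getTitleText());
        }
        System.out.println("Title check passed");
    }
}

Because HtmlUnit behaves like a browser, checks like this can run unattended against every build.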
Moreover, Web server logs contain rich information about user behavior (very difficult to collect with traditional deployments). A few GB of logs may provide a complete record of how an application is being used.
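As an indication of how little code is needed to start mining those logs, the following sketch counts requests per URL in an access log written in the Apache common log format; the log file name is hypothetical.

import java.io.BufferedReader;
import java.io.FileReader;
import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;

// Minimal sketch: tally requests per URL from an access log in the Apache
// common log format, e.g.
//   192.0.2.1 - - [11/Aug/2003:10:00:00 +0000] "GET /index.html HTTP/1.0" 200 1043
// The log file name is hypothetical.
public class RequestCounts {
    public static void main(String[] args) throws Exception {
        Map counts = new HashMap();
        BufferedReader in = new BufferedReader(new FileReader("access.log")); // hypothetical file
        String line;
        while ((line = in.readLine()) != null) {
            int start = line.indexOf('"');
            int end = line.indexOf('"', start + 1);
            if (start < 0 || end < 0) continue; // skip malformed lines
            String[] request = line.substring(start + 1, end).split(" ");
            if (request.length < 2) continue;
            String url = request[1]; // e.g. /index.html
            Integer c = (Integer) counts.get(url);
            counts.put(url, new Integer(c == null ? 1 : c.intValue() + 1));
        }
        in.close();

        for (Iterator it = counts.keySet().iterator(); it.hasNext();) {
            String url = (String) it.next();
            System.out.println(counts.get(url) + "\t" + url);
        }
    }
}

Grouping further by client address, referrer, or time of day turns the same logs into the kind of usage record described above.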
These differences in Web development make possible extreme forms of staged delivery. I hope the software engineering community will yet help identify best practice for Web development projects.
Chris Morris
Manchester, UK
Robert L. Glass wrote (Aug. 2003) about two sides of a debate, concluding that both won. But I’ve learned that most debates, from nuclear disarmament to child rearing, can be summed up along the lines of the opening scene of Meredith Willson’s Broadway musical "The Music Man" where the debating traveling salesmen chant, "Ya can argue all ya wanna but it’s dif’rent than it was! No it ain’t! No it ain’t! But ya gotta know the territory!"
Jim Haynes
Fayetteville, AR