
‘On Expert Testimony’: A Response

By Eoghan Casey, VP of Cybersecurity Strategy & Product Development at Own

This post was created in response to the Cerf's Up article "On Expert Testimony" that appeared in the December 2023 issue of Communications.

In October, I had the honor of participating with Vinton Cerf in the colloquium on expert testimony sponsored by the U.S. National Academy of Sciences (NAS) and the U.K. Royal Society. I was invited to present challenges and approaches to evaluating digital evidence. This is a topic of concern because courts are increasingly encountering expert witnesses who present digital evidence as fact, when they are actually rendering an opinion without explicating their evaluation process. By evaluation I mean a structured approach to assigning a value to the observed evidence in light of at least two competing, mutually exclusive claims. Expert testimony that does not express evaluative opinions in a clear, complete, correct, and consistent manner can cause courts to make wrong decisions and to lose confidence in digital evidence.
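To make this notion of evaluation concrete, here is a minimal sketch of the likelihood-ratio form that structured evaluation typically takes. The propositions and probabilities are invented for illustration; in practice they would come from experiments, datasets, or documented expert judgment.

```python
# Minimal sketch of evaluative reasoning as a likelihood ratio.
# The propositions and probabilities below are invented for illustration;
# real values must come from experiments, datasets, or expert judgment.

def likelihood_ratio(p_evidence_given_h1: float, p_evidence_given_h2: float) -> float:
    """Return LR = P(E | H1) / P(E | H2) for two competing, mutually exclusive claims."""
    return p_evidence_given_h1 / p_evidence_given_h2

# H1: a secure-erase tool was run on the system in the relevant period.
# H2: the observed traces arose from ordinary system activity.
p_e_h1 = 0.90   # hypothetical: the traces are very likely if the tool was run
p_e_h2 = 0.002  # hypothetical: the traces rarely appear otherwise

lr = likelihood_ratio(p_e_h1, p_e_h2)
print(f"LR = {lr:.0f}")  # 450 -> the observations support H1 over H2
```

The value itself is not the opinion; the opinion is the statement that the observations are some number of times more probable under one proposition than under the other.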

My training as an engineer and computer scientist provided a solid foundation for understanding technical aspects of digital evidence. Technical knowledge is necessary to address digital forensic questions, but it is not sufficient. It took years of casework to learn about the wide variety of errors, uncertainties, misinterpretations, and methods of tampering that arise in real-world situations. Errors occur in the data itself, and can also be introduced by tools used to examine digital evidence. Uncertainties exist in the location of a device, the time of an event, the recovery of data, the origin of a connection, the user of an account, and even the account associated with an activity. Tampering with digital evidence takes many forms, including destruction, forgery, backdating, and concealment, which can cause even experienced practitioners to miss or misinterpret important information.
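As one small illustration of the kind of consistency checking this experience teaches, the sketch below compares a file's filesystem modification time with a timestamp embedded in the file's own metadata. The path, tolerance, and source of the embedded time are hypothetical, and a mismatch is only a lead to investigate, not proof of backdating.

```python
# Illustrative consistency check between two independent timestamp sources.
# The path, tolerance, and source of the embedded time are hypothetical;
# a mismatch is a lead to investigate, not proof of backdating.
import os
from datetime import datetime, timedelta, timezone

def filesystem_mtime(path: str) -> datetime:
    """Modification time recorded by the filesystem, in UTC."""
    return datetime.fromtimestamp(os.path.getmtime(path), tz=timezone.utc)

def timestamps_consistent(path: str, embedded_time: datetime,
                          tolerance: timedelta = timedelta(minutes=5)) -> bool:
    """True if the filesystem and embedded timestamps agree within tolerance."""
    return abs(filesystem_mtime(path) - embedded_time) <= tolerance

# embedded_time would be parsed separately (as an aware UTC datetime), e.g.,
# from a document's internal 'last modified' property or an application log.
```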

Given these realities, it is crucial not to treat the interpretation of digital evidence as merely an engineering activity. Understanding the theoretical and technical aspects of technology is important, but following a structured approach to evaluating digital evidence is also essential to account for alternative explanations and real-world influences.

The major difference between expert testimony in the U.S. and the U.K. that Vinton Cerf highlighted is codified in the U.K. Forensic Science Regulator "Codes of Practice and Conduct for Development of Evaluative Opinions" (FSR-C-118). These practices follow the generally accepted Case Assessment and Interpretation (CAI) model, which formalizes the evaluation of evidence to promote balance, logic, robustness, and transparency in expert testimony. This U.K. standard provides a verbal scale alongside order-of-magnitude values to use when evaluation is based on a combination of expert judgment and some limited datasets, as is sometimes the case with digital evidence. For example, in World Anti-Doping Agency v. Russian Anti-Doping Agency (CAS 2020/O/6689), verbal expressions of evaluation were used, including "The observed digital traces are extremely more probable given the proposition that a specialized secure erase tool was used to secure erase files on LIMS system in January 2019, rather than the proposition that 'it is not possible to determine more precisely the time of use of this program'."
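For readers unfamiliar with such scales, the sketch below maps order-of-magnitude likelihood ratios to verbal expressions. The bands and wording are patterned on published evaluative-reporting guidance (such as the ENFSI guideline) and are illustrative; FSR-C-118 itself is the authoritative source for the U.K. scale.

```python
# Illustrative mapping from likelihood-ratio magnitude to a verbal expression.
# Bands and wording are patterned on published evaluative-reporting guidance
# (e.g., the ENFSI guideline); consult FSR-C-118 for the authoritative scale.

VERBAL_SCALE = [
    (1, "no support for one proposition over the other"),
    (10, "weak support"),
    (100, "moderate support"),
    (1_000, "moderately strong support"),
    (10_000, "strong support"),
    (1_000_000, "very strong support"),
]

def verbal_expression(lr: float) -> str:
    """Return a verbal expression for a likelihood ratio favoring H1."""
    if lr < 1:
        return "the evidence supports the alternative proposition"
    for upper, phrase in VERBAL_SCALE:
        if lr <= upper:
            return phrase
    return "extremely strong support"

print(verbal_expression(450))  # "moderately strong support" (hypothetical LR)
```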

The CAI model aligns with the approach adopted by the international community for the standard on forensic science interpretation (ISO/CD 21043-4.3). Although the U.S. Department of Justice has published "Uniform Language for Testimony and Reports" guidance documents for multiple forensic disciplines, they do not follow the CAI approach and do not cover digital evidence.

Formulating and expressing expert opinions has been one of the most important, challenging, and fulfilling duties of my career. ACM members motivated to provide expert testimony should take care not to treat engineering theory as a completely accurate reflection of reality, should perform experiments to understand uncertainties, and should follow a structured approach to evaluating digital evidence.
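As a closing illustration of that advice, the toy experiment below estimates how often two correct clocks would disagree by more than a chosen tolerance under an invented skew model, rather than assuming the error rate is zero. The model and parameters are made up; the point is the habit of measuring uncertainty instead of assuming it away.

```python
# Toy experiment: estimate how often two correct clocks disagree by more
# than a tolerance, given random skew. The Gaussian skew model and its
# parameters are invented; the point is to measure, not assume, uncertainty.
import random

def false_mismatch_rate(tolerance_s: float = 300.0,
                        skew_sd_s: float = 120.0,
                        trials: int = 100_000) -> float:
    """Fraction of trials where simulated clock skew exceeds the tolerance."""
    mismatches = sum(
        1 for _ in range(trials)
        if abs(random.gauss(0.0, skew_sd_s)) > tolerance_s
    )
    return mismatches / trials

random.seed(0)  # reproducible illustration
print(f"False-mismatch rate: {false_mismatch_rate():.4%}")
```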


Eoghan Casey is VP of Cybersecurity Strategy & Product Development at Own (formerly OwnBackup), is on the Board of Directors of the DFRWS.org conference, and is an internationally recognized expert in digital forensic investigation and cyber risk mitigation.
