An amicus brief was filed today (here Gissantaner Amicus FILED) in the Sixth Circuit in U.S. v. Daniel Gissantaner, Case No. 19-2305, concerning complex DNA mixture evidence. The brief was submitted on behalf of a group of forensic evidence scholars and legal scholars (I was one of the signatories).

The district court in Gissantaner held a comprehensive Daubert hearing on the issue of STRmix’s admissibility, and ultimately excluded the STRmix evidence. The district court focused on the Michigan State Police Lab’s failure to appropriately internally validate the software for the level of sample complexity involved in Mr. Gissantaner’s case. This amicus addresses the Daubert factors of testing and peer review.

Here is the summary of argument from the brief (citations omitted):

Amici focus on two Daubert factors—(1) testing and (2) peer review and publication—to explain what is required under Daubert and why the standard was not met here. Establishing a technique’s validity through testing is at the very heart of the scientific method; consequently, “testing” is paramount among the enumerated admissibility factors.

“Testing” has two dimensions: tests that establish a method’s foundational validity, and tests that demonstrate a particular laboratory is validly applying the method in particular circumstances. STRmix, commercial software used to interpret DNA profiles, may be a foundationally valid method when applied to samples of adequate quantity and quality. But when stretched beyond its capacity, or when applied by a lab that failed to properly establish its limits, STRmix is unreliable.

That is what the district court correctly found happened here, and this Court should not find an abuse of discretion.

STRmix was inadequately validated for use by the Michigan State Police laboratory (“MSP”). Its use here is particularly troubling because STRmix was applied to the most complex type of crime scene sample, subject to the greatest risk of faulty interpretation: namely, a sample with three or four contributors, in which the focus is on a contributor who (1) donated as little as seven percent of the total DNA and (2) accounts for as few as eight or nine cells (49 picograms) of the mixture. MSP failed to show that its instrumentation and procedures yield reliable results for similarly marginal samples, and thus the court properly excluded the results.

MSP cannot fall back on “peer review and publication” to shore up its internal validation testing. Peer review is not a talismanic item to check off a list: it is a proxy for whether a technique has been subjected to sufficient scrutiny to reveal and correct flaws. The prong is not satisfied merely because peer-reviewed publications exist; rather, those publications must demonstrate that the method withstands unbiased, rigorous scientific scrutiny. Yet the body of literature does not meaningfully support a finding that MSP’s particular application of STRmix here was reliable.